loading aws events/logs


Jason Stelzer

Sep 7, 2016, 3:06:34 PM
to wa...@googlegroups.com
Hey there. Not long ago I opened a PR that lets you preserve your AWS log history for CloudTrail. So here I am loading a fairly large amount of historic logs, and it seems that the timestamps for AWS events are wrong.

By that I mean that the timestamp of when the event is loaded is preferred over the eventTime attribute in the full_log JSON. That's problematic because CloudTrail logs are not exactly the most timely things in the world to begin with: almost by definition you're going to have latency of around 10-15 minutes. But it gets truly insane when loading a couple of years' worth of logs all at once.

Now, this could be a config issue on my side. I'm telling ossec to load the aws log via:
  <localfile>
    <log_format>syslog</log_format>
    <location>/var/ossec/logs/amazon.log</location>
  </localfile>


Should I be using a different log_format? I imagine declaring a custom log format and overriding the timestamp with the value from the AWS payload is one of the few ways to solve this. Am I on the right track? Does such a format exist, or is this another PR waiting to happen?

Granted, I'm not super worried about this, because once I'm caught up, things should be roughly in line with reality going forward.

--
J.

Jesus Linares

Sep 9, 2016, 5:51:11 AM
to Wazuh mailing list
Hi Jason,

The timestamp used by OSSEC corresponds to the alert timestamp (when the alert was triggered), not the real timestamp (when the log was generated). I think this is good behaviour, because it prevents several problems that would arise if OSSEC used the real timestamp:
  • It would have to parse the date included in each log, and the timezone too.
  • Alert rotation (.../alerts/year/month/day.gz) would be very difficult to get right. For example, if OSSEC has already generated the .gz for the previous day and then receives a log dated "yesterday", should it reopen the previous .gz and append to it?
  • Alert visualisation in Kibana would be difficult. If you have logs with different timezones and you select "Last 15 minutes", you will only see the alerts that happen to match your timezone.
To simplify timezone problems, OSSEC uses the timezone of the host where it is installed and the timestamp of the alert. So we can say that OSSEC is designed to receive alerts from the present, not old logs.

Having said that, there is a possible workaround if you are using ELK:
  • Modify the Amazon decoder to extract the date into a field, so the real date appears in alerts.json.
  • Modify the logstash-forwarder configuration to do something like: "if it is an Amazon log, use the real_date field as the timestamp; otherwise, use the usual timestamp field".
  • This way, Amazon logs will be inserted into Elasticsearch with the real date instead of the "OSSEC date".
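As a rough sketch of those steps (the decoder parent name "amazon" and the field name real_date are assumptions for illustration, not tested config), the decoder side could look something like:

```xml
<!-- Hypothetical child decoder: pull eventTime out of the CloudTrail JSON
     into a field (here called real_date). Adjust the parent name to match
     whatever your existing Amazon decoder is actually called. -->
<decoder name="amazon-eventtime">
  <parent>amazon</parent>
  <regex>"eventTime":"(\S+)"</regex>
  <order>real_date</order>
</decoder>
```

And on the Logstash side, a conditional date filter along these lines:

```
# Hypothetical Logstash filter: if the event carries a real_date field,
# use it as @timestamp instead of the time OSSEC raised the alert.
filter {
  if [real_date] {
    date {
      match  => ["real_date", "ISO8601"]
      target => "@timestamp"
    }
  }
}
```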
I hope it helps.
Regards.

Jason Stelzer

Sep 12, 2016, 1:25:09 PM
to Jesus Linares, Wazuh mailing list
Thanks for getting back to me.

I've looked into it and pretty much arrived at the same conclusion. It's not essential for what I'm doing, and it would be a huge effort to do correctly, so I'm just going to accept that the historic data will be noisy and move on.



--
You received this message because you are subscribed to the Google Groups "Wazuh mailing list" group.
To unsubscribe from this group and stop receiving emails from it, send an email to wazuh+unsubscribe@googlegroups.com.
To post to this group, send email to wa...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/wazuh/9eb325f8-2695-4632-a08f-abfd3ca91256%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.



--
J.
