using nanoseconds since epoch as time field


Clive Saha

Mar 8, 2016, 3:24:12 PM
to Fluentd Google Group
Hi, I have a log file I'm trying to read into Fluentd and eventually push into Elasticsearch. The file is in CSV format and has a nanoseconds-since-epoch timestamp field, which I'd ideally like to use as the Fluentd time field. I'm reading the file with the in_tail plugin and setting the time field via the "time_key" parameter.

In Ruby I can print nanosecond resolution with the format string "%s%9N", but Fluentd errors out when I use that as the time_format. It seems that although strftime handles '%9N', strptime doesn't recognize it.
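For reference, this is roughly the source block I'm using (path, tag and field names here are made-up placeholders, not my real config):

<source>
  @type tail
  path /var/log/app/metrics.csv
  pos_file /var/log/td-agent/metrics.csv.pos
  tag app.metrics
  format csv
  keys timestamp,metric,value
  time_key timestamp
  # this is the part that fails to parse:
  time_format %s%9N
</source>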

Has anyone had any success doing this?

Also, if this isn't possible, I'd like to drop some of the digits and create another field with, say, millisecond timestamps, and use that as the time field. One way to do this seems to be the filter_record_transformer plugin: create a new field with the truncated value and use the "renew_time_key" parameter to overwrite the event time.
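Something like this is what I had in mind, if I understand renew_time_key correctly (field names are placeholders, and I'm assuming renew_time_key wants a plain unix-seconds value, so the milliseconds copy would be a separate field kept only for Elasticsearch):

<filter app.metrics>
  @type record_transformer
  enable_ruby
  <record>
    # truncated copy in milliseconds, for Elasticsearch
    time_ms ${record["timestamp"].to_i / 1000000}
    # whole seconds, for Fluentd's own event time
    time_sec ${record["timestamp"].to_i / 1000000000}
  </record>
  renew_time_key time_sec
</filter>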

But my question is: if I did that, would I still need to specify time_key in my in_tail source? That currently just crashes on the nanosecond timestamp, and as I understand it, filters run after the input plugins, right?
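In other words, would a source block with no time_key at all be the right shape here, with the filter above owning the time? Something like (again, placeholder names):

<source>
  @type tail
  path /var/log/app/metrics.csv
  pos_file /var/log/td-agent/metrics.csv.pos
  tag app.metrics
  format csv
  # no time_key / time_format, so the raw nanosecond value just stays a record field
  keys timestamp,metric,value
</source>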

Thanks
Clive

Mr. Fiber

Mar 9, 2016, 1:39:15 AM
to Fluentd Google Group
This issue is similar to your case: https://github.com/fluent/fluentd/issues/679
Please try %N.

To keep nanosecond resolution without using v0.14.0.pre1,
the keep_time_key option may help.
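For example, something like this (path, tag and field names are placeholders; whether %N parses your exact nanoseconds-since-epoch value you will need to test):

<source>
  @type tail
  path /var/log/app/metrics.csv
  pos_file /var/log/td-agent/metrics.csv.pos
  tag app.metrics
  format csv
  keys timestamp,metric,value
  time_key timestamp
  # try %N instead of %9N
  time_format %s%N
  # keep the original nanosecond string in the record so no precision is lost
  keep_time_key true
</source>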


