Hi, I have a log file I'm trying to read into fluentd, eventually to push into Elasticsearch. The file is in CSV format and has a nanosecond timestamp field. Ideally I'd like to use this as the time field in fluentd. I'm using the in_tail plugin to read the file, and I'm setting the time field via the "time_key" parameter.
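For reference, here's roughly what my source looks like right now (the path, tag, and field names are made-up placeholders, not my real ones):

    <source>
      @type tail
      path /var/log/app/metrics.csv
      pos_file /var/lib/fluentd/metrics.pos
      tag app.metrics
      format csv
      keys timestamp_ns,metric,value
      time_key timestamp_ns
      # this is the line fluentd errors out on:
      time_format %s%9N
    </source>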
In Ruby, I can print nanosecond resolution using the time format %s%9N, but fluentd errors out on this. It seems that although strftime will handle '%9N', strptime doesn't recognize it.
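A quick irb session shows the mismatch (the exact error text may differ, but this is the shape of what I'm seeing):

    require 'time'

    Time.now.strftime("%s%9N")
    # => e.g. "1490123456123456789"  (unix seconds + 9 digits of nanoseconds)

    Time.strptime("1490123456123456789", "%s%9N")
    # raises ArgumentError -- strptime can't take the digits apart again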
Has anyone had any success doing this?
Also, if I can't do this, then I'd like to drop a bunch of digits, create another field with, say, millisecond timestamps, and use that as the time field. It seems one way to do this is to use the filter_record_transformer plugin to create a new field with the truncated value and use the "renew_time_key" parameter to overwrite the timestamp.
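Something like this is what I have in mind (sketch only; timestamp_ns and the tag are the same placeholders as above, and I'm assuming renew_time_key wants a plain unix-seconds value, since as far as I can tell fluentd's event time is second resolution anyway):

    <filter app.metrics>
      @type record_transformer
      enable_ruby true
      renew_time_key time_sec
      <record>
        # keep a millisecond copy for Elasticsearch (drop 6 of the 9 ns digits)
        time_ms ${record["timestamp_ns"].to_i / 1000000}
        # truncate to whole seconds for fluentd's event time
        time_sec ${record["timestamp_ns"].to_i / 1000000000}
      </record>
    </filter>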
But my question is: if I were to do that, would I still need to specify time_key in my in_tail source? That just crashes on the nanosecond timestamp, and as I understand it, filters are processed after the input plugins, right?
Thanks
Clive