Parse and extract docker nested JSON logs with fluentd


Дмитрий Ансимов

Jun 7, 2018, 3:20:55 AM
to Fluentd Google Group
Hi folks, I need your kind help.

Can I somehow extract the nested JSON Java log from the Docker JSON-formatted log string (the log field) and send it to Elasticsearch as a JSON object rather than as a string?
Here's an example log string:

{"log":"{\"timeMillis\":1528281691581,\"thread\":\"main\",\"level\":\"INFO\",\"loggerName\":\"com.data.MultiThreadWorker\",\"message\":\"Waiting queue...\",\"endOfBatch\":false,\"loggerFqcn\":\"org.apache.logging.log4j.spi.AbstractLogger\"}\r\n","stream":"stdout","time":"2018-06-06T10:41:31.582085717Z"}


So that it can be sent to ELK as:

{"timeMillis":1528281691581,"thread":"main","level":"INFO","loggerName":"com.data.MultiThreadWorker","message":"Waiting queue...","endOfBatch":false,"loggerFqcn":"org.apache.logging.log4j.spi.AbstractLogger","stream":"stdout","time":"2018-06-06T10:41:31.582085717Z"}

Many thanks in advance for any help.
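
What I'm hoping for is something along these lines (a rough, untested sketch; it assumes the built-in parser filter plus fluent-plugin-record-modifier, and that the filters match the tag produced by the tail source):

    <filter kubernetes.**>
      @type parser
      # parse the docker "log" field as JSON
      key_name log
      # keep the original stream/time fields alongside the parsed ones
      reserve_data true
      <parse>
        @type json
      </parse>
    </filter>

    <filter kubernetes.**>
      @type record_modifier
      # drop the raw string once it has been parsed into fields
      remove_keys log
    </filter>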

Mr. Fiber

Jun 7, 2018, 3:54:59 AM
to Fluentd Google Group


Дмитрий Ансимов

Jun 7, 2018, 5:35:33 AM
to Fluentd Google Group
Is this config correct? 

    <source>
      @type tail
      @log_level error
      path /var/log/containers/*backend*.log
      tag kubernetes.*
      @type json
      time_key time
      keep_time_key true
      refresh_interval 5
    </source>

    <filter key.log>
      @type parser
      format json
      key_name log
      reserve_data true
    </filter>

    <filter key.remove>
      @type record_modifier
      remove_keys log,timeMillis
    </filter>

I'm using a Docker image built like this:

FROM fluent/fluentd-kubernetes-daemonset:v1.2-debian-elasticsearch


RUN \
  fluent-gem install fluent-plugin-detect-exceptions && \
  fluent-gem install fluent-plugin-concat && \
  fluent-gem install fluent-plugin-kubernetes_metadata_filter && \
  fluent-gem install fluent-plugin-multi-format-parser && \
  fluent-gem install fluent-plugin-record-modifier

and it gives me this error:
Unknown input plugin 'json'
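
(Most likely this comes from the two @type lines inside <source>: the second one, @type json, overrides @type tail, so Fluentd tries to load an input plugin named json. With in_tail in Fluentd v1 the parser type belongs in a nested <parse> section instead; a minimal sketch:

    <source>
      @type tail
      path /var/log/containers/*backend*.log
      tag kubernetes.*
      <parse>
        @type json
      </parse>
    </source>
)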



On Thursday, June 7, 2018, at 10:20:55 UTC+3, Дмитрий Ансимов wrote:

Дмитрий Ансимов

Jun 7, 2018, 5:52:07 AM
to Fluentd Google Group
Sorry, there was a mistake in the source section. The following config gives this error:
error="parse/@type is required."


    <source>
      @type tail
      @log_level error
      path /var/log/containers/*backend*.log
      tag kubernetes.*
      time_key time
      keep_time_key true
      refresh_interval 5
    </source>

    <filter key.log>
      @type parser
      key_name log
      reserve_data true
      <parse>
        @type json
      </parse>
    </filter>

    <filter key.remove>
      @type record_modifier
      remove_keys log,timeMillis
    </filter>


On Thursday, June 7, 2018, at 10:20:55 UTC+3, Дмитрий Ансимов wrote:
Hi folks, I need your kind help.

Mr. Fiber

Jun 8, 2018, 2:44:56 AM
to Fluentd Google Group
in_tail requires a `<parse>` section.
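
(Applied to the config above, that means moving the parse-related options, time_key and keep_time_key, into a nested <parse> block; a sketch reusing the same options:

    <source>
      @type tail
      @log_level error
      path /var/log/containers/*backend*.log
      tag kubernetes.*
      refresh_interval 5
      <parse>
        @type json
        time_key time
        keep_time_key true
      </parse>
    </source>
)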
