Hello
The following is the log message I have (I am not sure why the '\\\' sequences are showing in the log):
{"message":"{\"log\":\"2016-12-06 15:12:10,690|http-nio-8080-exec-25|INFO|namespace:com-example|com.audit|10.233.87.12|monitordemo-o9jwy|monitordemo| {\\\"nodeName\\\":\\\"nodeName\\\",\\\"applicationId\\\":\\\"User\\\",\\\"uniqueTransactionId\\\":\\\"468be312-bf15-428f-869a-2e754079cee6\\\",\\\"transactionName\\\":\\\"service.hello\\\",\\\"transactionStatus\\\":\\\"C\\\",\\\"responseCode\\\":\\\"200\\\",\\\"responseDescription\\\":\\\"OK\\\",\\\"endTimestamp\\\":\\\"2016-12-06 15:12:10.687\\\",\\\"initiatedTimestamp\\\":\\\"2016-12-06 15:12:10.655\\\",\\\"elapsedTime\\\":\\\"32\\\",\\\"clientIp\\\":\\\"10.233.115.0\\\",\\\"cluster\\\":\\\"cluster\\\",\\\"httpMethod\\\":\\\"GET\\\",\\\"requestURL\\\":\\\"http://10.112:8080/demo/service/hello\\\"}\\n\",\"stream\":\"stdout\",\"time\":\"2016-12-06T15:12:10.693921067Z\"}"
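As a side note on the '\\\' sequences: the record is JSON nested inside JSON (the application's JSON payload sits inside the Docker json-file log line, which is itself stored as a string in the "message" field), and each nesting level adds one more layer of backslash escaping. A minimal Ruby sketch with shortened, hypothetical sample values:

```ruby
require 'json'

# Shortened, made-up stand-ins for the real record above: the app's
# JSON payload, the pipe-delimited log line containing it, the Docker
# json-file wrapper, and finally the Fluentd record wrapping that.
app    = { "nodeName" => "nodeName", "responseCode" => "200" }
line   = "2016-12-06 15:12:10,690|INFO|" + JSON.generate(app)
docker = JSON.generate({ "log" => line + "\n", "stream" => "stdout" })
record = JSON.generate({ "message" => docker })

puts record   # the innermost quotes now appear as \\\" , as in the log

# Unwrapping reverses it: one JSON.parse per nesting level.
payload = JSON.parse(
  JSON.parse(JSON.parse(record)["message"])["log"].split('|').last
)
puts payload["nodeName"]   # => "nodeName"
```

So the backslashes are an artifact of string-in-string escaping, not of the logger itself.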
I used the below configuration to split the message for mapping in Kibana 4:
<filter k9.**>
  @type record_modifier
  enable_ruby yes
  auto_typecast yes
  <record>
    logEventTimestamp ${record["message"].split('|')[0]}
    threadId ${record["message"].split('|')[1]}
    logLevel ${record["message"].split('|')[2]}
    namespace ${record["message"].split('|')[3]}
    logType ${record["message"].split('|')[4]}
    serverIpAddress ${record["message"].split('|')[5]}
    serverName ${record["message"].split('|')[6]}
    podServiceName ${record["message"].split('|')[7]}
    #logrecord_json ${record["message"].split('|')[8]}
    logrecord_json ${record["message"].split('|')[8].delete! '\\\\'}
  </record>
</filter>
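Outside Fluentd, the same split-and-strip logic can be checked in plain Ruby; a minimal sketch with a shortened, hypothetical message value:

```ruby
require 'json'

# Hypothetical, shortened version of the pipe-delimited "message" field.
message = '2016-12-06 15:12:10,690|http-nio-8080-exec-25|INFO|' \
          'namespace:com-example|com.audit|10.233.87.12|' \
          'monitordemo-o9jwy|monitordemo| ' \
          '{\"nodeName\":\"nodeName\",\"responseCode\":\"200\"}'

fields = message.split('|')
log_event_timestamp = fields[0]   # "2016-12-06 15:12:10,690"
log_level           = fields[2]   # "INFO"

# Index 8 holds the escaped JSON payload; deleting the backslashes
# turns \" back into plain " so the string parses as JSON.
logrecord_json = fields[8].delete('\\')
payload = JSON.parse(logrecord_json)
puts payload["responseCode"]   # => "200"
```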
logrecord_json looks as below:
logrecord_json{"nodeName":"sampleNode","applicationId":"testuser","uniqueTransactionId":"468be312-bf15-428f-869a-2e754079cee6","transactionName":"service.hello","transactionStatus":"C"
I have tried the below configuration and it worked in ES5 and Kibana 5, but now I need to use Kibana 4 and the following parser is not working:
<filter k9.**>
  @type parser
  format csv
  key_name logrecord_json
  reserve_data true
  #time_parse no
  #hash_value_field logrecord_json
  #hash_value_field parsed
</filter>
Using 'logrecord_json', how can I map the JSON data in Kibana like this:
nodeName= sampleNode
applicationId= testuser
uniqueTransactionId=468be312-bf15-428f-869a-2e754079cee6
I am not an expert on regex; could anyone please suggest a solution? I have tried a lot :(.
2016-12-06T15:13:56.543180040Z 2016-12-06 15:13:56 +0000 [warn]: dump an error event: error_class=Fluent::Plugin::Parser::ParserError error="pattern not match with data ' {\"nodeName\":\"sampleName\",\"applicationId\":\"testUser\",\"uniqueTransactionId\":\"ee28bfc1-9afd-424f-9a20-c84f603512c1\",\"transactionName\":\"service.hello\",\"transactionStatus\":\"C\",\"responseCode\":\"200\",\"responseDescription\":\"OK\",\"endTimestamp\":\"2016-12-06 15:13:55.546\",\"initiatedTimestamp\":\"2016-12-06 15:13:55.505\",\"elapsedTime\":\"41\",\"clientIp\":\"10.233.115.0\",\"cluster\":\"cluster\",\"httpMethod\":\"GET\",\"requestURL\":\"http://10.22.281.27:8080/demo/service/hello\"}\n\",\"stream\":\"stdout\",\"time\":\"2016-12-06T15:13:55.550938285Z\"}'" tag="k9.var.log.containers.monitordemo.log" time=#<Fluent::EventTime:0x000000017e6408 @sec=1481037236, @nsec=540396904>
--
You received this message because you are subscribed to the Google Groups "Fluentd Google Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email to fluentd+unsubscribe@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
<filter k9.**>
  @type record_modifier
  <record>
    logEventTimestamp ${record["message"].split('|')[0]}
    threadId ${record["message"].split('|')[1]}
    logLevel ${record["message"].split('|')[2]}
    namespace ${record["message"].split('|')[3]}
    logType ${record["message"].split('|')[4]}
    serverIpAddress ${record["message"].split('|')[5]}
    serverName ${record["message"].split('|')[6]}
    podServiceName ${record["message"].split('|')[7]}
    logrecord_json ${record["message"].split('|')[8]}
  </record>
</filter>
<filter k9.**>
  @type parser
  format json
  key_name logrecord_json
  reserve_data true
</filter>
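For reference, a rough Ruby sketch of what this parser filter (format json with reserve_data true) effectively does to one record; the field values are shortened, made-up samples:

```ruby
require 'json'

# One event record after the record_modifier split, with a shortened
# sample JSON string in the logrecord_json field.
record = {
  "logLevel"       => "INFO",
  "logrecord_json" => ' {"nodeName":"sampleNode","applicationId":"testuser"}'
}

parsed = JSON.parse(record["logrecord_json"])
merged = record.merge(parsed)   # reserve_data true keeps the original fields

puts merged["nodeName"]        # => "sampleNode"
puts merged["applicationId"]   # => "testuser"
```

This only works when logrecord_json is valid JSON, i.e. after the escaping backslashes have been stripped.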
I was using the above filters and everything mapped correctly in Kibana 5 & ES5, but not in Kibana 4.
I am not sure what configuration changes I need to make; sorry, I am new to this.
After splitting the record, the 'logrecord_json' field has JSON data, but the 'parser' filter is not automatically parsing it.
How can I turn this JSON-like string into key-value pairs for Kibana 4?