Hi all,
I know you can use the parse filter or the flatten-hash plugin to parse a whole JSON message, but how about this format coming from the Docker fluentd log driver?
2016-01-28 20:04:23 +0000 docker.4643317973d5: {"container_id":"4643317973d56010ecb5990680a40e59be523f04a456686fa565430574e03245","container_name":"/clever_wilson","source":"stdout","log":"{\"key\":\"value\"}","@log_name":"docker.4643317973d5"}
The Elasticsearch plugin output looks like the following. The "log" field is being treated as a plain string; what I want is for the keys inside the log body to be parsed and indexed.
{
  "_index": "logstash-2016.01.28",
  "_type": "fluentd",
  "_id": "AVKJ2JABH3dkBxm_sj8h",
  "_score": null,
  "_source": {
    "container_name": "/suspicious_mccarthy",
    "source": "stdout",
    "log": "\"test\":{\"key\":\"value\"}",
    "container_id": "f4a85be832aef70ae4f06258ab7394683a93fdfab704556c68cc976df09d531e",
    "@log_name": "docker.f4a85be832ae",
    "@timestamp": "2016-01-28T20:06:46+00:00"
  },
  "fields": {
    "@timestamp": [
      1454011606000
    ]
  },
  "sort": [
    1454011606000
  ]
}
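For what it's worth, one approach I've been looking at (not yet verified against this exact setup) is a parser filter that re-parses the "log" field as JSON before the record reaches Elasticsearch, assuming the fluent-plugin-parser filter is installed and the tag matches "docker.**":

```
# Sketch only: re-parse the stringified JSON in the "log" field.
# Assumes fluent-plugin-parser is installed; the tag pattern docker.**
# matches the @log_name tags shown above.
<filter docker.**>
  @type parser
  key_name log        # the field holding the embedded JSON string
  format json         # parse that string as JSON
  reserve_data true   # keep container_id, container_name, source, etc.
</filter>
```

With something like this, a log value of {"key":"value"} would be expanded into a top-level "key" field in the record, so Elasticsearch would index it instead of storing the raw string. Note that this only works when the "log" value is itself valid JSON, which the second example above ("\"test\":{\"key\":\"value\"}") is not.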