I am using fluentd on Mac and am trying to use a field's value from the input source in the logstash_prefix for Elasticsearch. However, it is retrieved as plain text and is not replaced with the actual value.
Please help with suggestions.
Here is td-agent.conf:

<source>
  @type tail
  path /data/fluentd_all_logs/myfile.log
  pos_file /var/log/td-agent/myfile.pos
  tag app.json
  format json
</source>

<filter app.json>
  @type record_transformer
  enable_ruby true
  <record>
    cId app1.${metadata["clientId"]}
  </record>
</filter>

<match java-app.json>
  @type elasticsearch
  logstash_format true
  logstash_prefix app1.${cId}
  logstash_dateformat %Y.%m
  host elasticsearch
  port 9200
  type_name logs
</match>

The index generated in Elasticsearch is:

$ curl -XGET "elasticsearch:9200/_cat/indices?v"
health status index                pri rep docs.count docs.deleted store.size pri.store.size
yellow open   app1.${cid}-2016.08  5   1   1          0            9.6kb
Expectation: the index should look like app1.mohit-2016.08, where mohit is the clientId.
--
You received this message because you are subscribed to the Google Groups "Fluentd Google Group" group.
<match my.logs>
  @type elasticsearch
  index_name elastic.${key1}.${key2} # => e.g. elastic.value1.value2
  <buffer tag, key1, key2>
    @type memory
  </buffer>
  # <snip>
v0.14 buffer placeholders can handle ${tag} for the tag, strftime-style formats such as %Y%m%d for the time, and custom record keys such as ${key1}.

Note that the custom chunk key uses a different notation than record_reformer and record_modifier. Those plugins use record["some_key"] to specify placeholders, while this feature uses the ${key1}, ${key2} notation. Also, tag, time, and any arbitrary keys you reference must be included in the <buffer> directive's chunk keys.