I am using Fluentd to collect logs that are in JSON format. The Fluentd version is td-agent-3.7.1-0.el7.x86_64.rpm. My configuration:
<source>
  @type tail
  path /app/logs/spn.json
  pos_file /app/td-agent/pos/pos.log
  tag spn-log
  <parse>
    @type json
    time_key @timestamp
    time_type string
    time_format %Y-%m-%dT%H:%M:%S.%N%z
  </parse>
</source>
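Since `input_status_num_records_total` is lower than the line count of the file, lines may never be read at all. By default `in_tail` starts reading from the end of the file, so anything written before td-agent starts (or after the pos_file is removed) is skipped, and fast log rotation can also lose trailing lines. A sketch of the tail source with the relevant tuning options (the values are illustrative, not prescriptions):

```
<source>
  @type tail
  path /app/logs/spn.json
  pos_file /app/td-agent/pos/pos.log
  tag spn-log
  # Read existing content on first start instead of only newly appended lines
  read_from_head true
  # Keep watching a rotated file a little longer so trailing lines are flushed
  rotate_wait 10s
  # Re-scan the watch path more often if files rotate quickly
  refresh_interval 10s
  <parse>
    @type json
    time_key @timestamp
    time_type string
    time_format %Y-%m-%dT%H:%M:%S.%N%z
  </parse>
</source>
```

Lines that fail JSON or timestamp parsing are also dropped at this stage with a warning in the td-agent log, so it is worth grepping /var/log/td-agent/td-agent.log for parse warnings.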
<filter spn-*>
  @type grep
  <regexp>
    key class
    pattern /(^com\.caih\.spn\.full\.log$|^com\.caih\.spn\.bill$|^com\.caih\.spn\.mtwm\.call\.bill$|^com\.caih\.spn\.mtwm\.sms\.bill$|^com\.caih\.spn\.mtwm\.log$|^com\.caih\.spn\.vtt\.log$|^com\.caih\.spn\.lianjia\.log$|^com\.caih\.spn\.lianjia\.call\.bill$|^com\.caih\.spn\.ext\.log$|^com\.caih\.spn\.sms$|^com\.caih\.spn\.retrypush\.log$)/
  </regexp>
</filter>
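Note that this grep filter intentionally discards every record whose `class` is not in the list, which is one legitimate reason Elasticsearch can show fewer records than the file contains. Unrelated to the loss, the pattern can also be written more compactly by grouping on the shared prefix (same classes, same behavior):

```
<filter spn-*>
  @type grep
  <regexp>
    key class
    # Equivalent to the eleven anchored alternatives above
    pattern /^com\.caih\.spn\.(full\.log|bill|mtwm\.call\.bill|mtwm\.sms\.bill|mtwm\.log|vtt\.log|lianjia\.log|lianjia\.call\.bill|ext\.log|sms|retrypush\.log)$/
  </regexp>
</filter>
```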
<match spn-*>
  @type rewrite_tag_filter
  <rule>
    key class
    pattern /^com\.caih\.spn\.full\.log$/
    tag caih-spn-full-log
  </rule>
  <rule>
    key class
    pattern /^com\.caih\.spn\.bill$/
    tag caih-spn-bill
  </rule>
  <rule>
    key class
    pattern /^com\.caih\.spn\.mtwm\.call\.bill$/
    tag caih-spn-mtwm-call-bill
  </rule>
  <rule>
    key class
    pattern /^com\.caih\.spn\.mtwm\.sms\.bill$/
    tag caih-spn-mtwm-sms-bill
  </rule>
  <rule>
    key class
    pattern /^com\.caih\.spn\.mtwm\.log$/
    tag caih-spn-mtwm-log
  </rule>
  <rule>
    key class
    pattern /^com\.caih\.spn\.vtt\.log$/
    tag caih-spn-vtt-log
  </rule>
  <rule>
    key class
    pattern /^com\.caih\.spn\.lianjia\.log$/
    tag caih-spn-lianjia-log
  </rule>
  <rule>
    key class
    pattern /^com\.caih\.spn\.lianjia\.call\.bill$/
    tag caih-spn-lianjia-call-bill
  </rule>
  <rule>
    key class
    pattern /^com\.caih\.spn\.ext\.log$/
    tag caih-spn-ext-log
  </rule>
  <rule>
    key class
    pattern /^com\.caih\.spn\.sms$/
    tag caih-spn-sms-log
  </rule>
  <rule>
    key class
    pattern /^com\.caih\.spn\.retrypush\.log$/
    tag caih-spn-retrypush-log
  </rule>
</match>
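rewrite_tag_filter silently drops any record that matches none of its rules. The grep filter above should already restrict records to these eleven classes, but if the two lists ever drift apart, records vanish here with no trace. A catch-all rule appended as the last `<rule>` inside this match block makes such records visible instead (the tag name is illustrative):

```
<rule>
  key class
  # Catch-all: anything not matched by the rules above gets an explicit tag
  # instead of being silently discarded
  pattern /.*/
  tag caih-spn-unmatched
</rule>
```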
<filter caih-*>
  @type parser
  key_name rest
  reserve_time true
  reserve_data true
  remove_key_name_field true
  <parse>
    @type json
  </parse>
</filter>
<filter caih-*>
  @type record_transformer
  enable_ruby true
  <record>
    hostname "#{Socket.gethostname}"
  </record>
</filter>
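As an aside, `"#{Socket.gethostname}"` is embedded Ruby evaluated once when the configuration is loaded, and record_transformer also provides a built-in `${hostname}` placeholder, so `enable_ruby` is not needed for this particular record. An equivalent sketch:

```
<filter caih-*>
  @type record_transformer
  <record>
    # Built-in placeholder; no enable_ruby required
    hostname ${hostname}
  </record>
</filter>
```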
<match caih-*>
  @type forward
  @log_level info
  <server>
    host 10.19.1.51
    port 24224
  </server>
  <buffer tag,time>
    @type file
    path /app/td-agent/buffer/spn-test
    timekey 60
    timekey_wait 0s
    timekey_use_utc false
    flush_mode interval
    flush_interval 5s
    chunk_limit_size 200M
  </buffer>
</match>
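Finally, records that fail inside a filter or output (for example, a `rest` field that is not valid JSON in the parser filter above) are routed to Fluentd's built-in `@ERROR` label rather than reaching the forward output. Capturing that label to a local file makes any such losses visible instead of silent (the path is illustrative):

```
<label @ERROR>
  <match **>
    # Persist failed records locally so drops can be inspected and counted
    @type file
    path /app/td-agent/error/spn-error
  </match>
</label>
```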
The logs are forwarded to Elasticsearch for searching. When I search the logs in Elasticsearch, I find that some logs are missing. I use Prometheus to monitor the log forwarders, and I found that input_status_num_records_total is less than the number of log lines in the log file. So what can I do to fix this problem? Thanks.