I looked for help on this topic in the mailing list archives before sending it to the group, but had no luck. I am trying to use fluentd to ingest multiple CSV log files that are dropped into a particular directory once a day, and to send the ingested records to Elasticsearch. I tried the tail_ex plugin with a wildcard path, but it doesn't work: the files are never appended to; they are dropped/moved into the directory, whole, by a third-party proprietary application once a day. Is there any way for fluentd to read the contents of each new file and send them to Elasticsearch? Here is my failed attempt using tail_ex:
<source>
  type tail_ex
  tag message
  format csv
  time_format %Y-%m-%d %H:%M:%S%z
  path /archived_logs/xxxxxxx/xxxx_xxxxxxx3-%Y-%m-%d-*.csv
  keys key1,key2,key3,key4,key5,key6,key7,key8,key9,key10,key11,key12,key13,key14,key15,key16,key17,key18,key19,key20,key21,key22,key23,key24
  time_key key3
  #path /var/log/jetty-*/%Y_%m_%d.stderrout.log
  pos_file /var/log/td-agent/xxxxx.log.pos
  refresh_interval 60
</source>
## match tag=debug.** and dump to console
<match debug.**>
  type stdout
</match>

<match **>
  type elasticsearch
  logstash_format true
  host elastic-host
  port 9200
  index_name maillogs
  type_name maillogs
</match>
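
For reference, one direction I was wondering about: the stock in_tail plugin has a read_from_head option that makes it read each newly discovered file from the beginning instead of only tailing appended lines, which sounds closer to my case since the files arrive whole. A rough sketch of what I mean (the path, tag, and keys here are placeholders from my setup, and I haven't verified which td-agent/plugin versions support read_from_head):

<source>
  # in_tail: read_from_head makes newly discovered files
  # be read from their first line rather than from the end
  type tail
  tag message
  format csv
  keys key1,key2,key3
  time_key key3
  time_format %Y-%m-%d %H:%M:%S%z
  path /archived_logs/xxxxxxx/*.csv
  pos_file /var/log/td-agent/xxxxx.log.pos
  read_from_head true
  # rescan the wildcard path for new files every 60s
  refresh_interval 60
</source>

Would something like this pick up the dropped files, or is a separate batch-import approach the better fit here?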