Hi,
I'm using td-agent with fluent-plugin-elasticsearch to send logs to an AWS Elasticsearch domain. I'm on a t2.micro.elasticsearch instance type, which has a 10 MB limit on the maximum HTTP request payload size.
I've tried setting buffer_chunk_limit to smaller values, thinking it might control the size of the HTTP payload sent to Elasticsearch on flush. However, I still get the following error:
2016-07-29 10:52:56 -0400 [warn]: temporarily failed to flush the buffer. next_retry=2016-07-29 10:53:28 -0400 error_class="Elasticsearch::Transport::Transport::Errors::RequestEntityTooLarge" error="[413] {\"Message\":\"Request size exceeded 10485760 bytes\"}" plugin_id="object:3f9ba418f608"
The 10485760 bytes in the error is exactly 10 MiB, so I think I am hitting the payload limit of the AWS instance.
Here's a snippet of my elasticsearch plugin configuration:
<store>
  @type elasticsearch
  port 80
  logstash_format true
  include_tag_key true
  tag_key _key
  log_level debug
  # Buffering
  buffer_type file
  buffer_path "/opt/td-agent/buffer/"
  flush_interval 60s
  reload_on_failure true
  max_retry_wait 300s
  retry_wait 15s
  disable_retry_limit
  buffer_chunk_limit 300k
  buffer_queue_limit 4096
</store>
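In case it helps diagnose this, here's a quick sanity check I can run to see whether the on-disk buffer chunks actually stay under buffer_chunk_limit, or whether something larger is being queued (the path comes from the buffer_path setting above; adjust if yours differs):

```shell
# List fluentd file-buffer chunks by size, largest first, so any chunk
# bigger than the configured buffer_chunk_limit (300k here) stands out.
BUFFER_DIR=/opt/td-agent/buffer/
ls -lhS "$BUFFER_DIR" 2>/dev/null || echo "buffer dir not found: $BUFFER_DIR"
```

If every chunk on disk is well under 10 MB but the bulk request still gets rejected, that would suggest the request the plugin builds is larger than a single chunk.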
Any help would be appreciated. Thanks