Set size of payload send to AWS Elasticsearch


TomG

Nov 21, 2018, 4:24:42 AM
to Fluentd Google Group
Hi,

I'm having issues sending data to AWS Elasticsearch because of the 10 MB request limit that comes with our instance type. Here is the relevant section of the config file:

<match test.**>
  @type elasticsearch
  @log_level warn
  time_key timestamp
  time_key_format %Y-%m-%dT%H:%M:%S.%N%z
  logstash_format true
  logstash_prefix testindex
  logstash_dateformat %Y-%m-%d
  port 443
  scheme https
  ssl_verify false
  ssl_version TLSv1_2

  buffer_type file
  buffer_path /var/log/td-agent/test.buffer
  buffer_queue_limit 1024
  chunk_size_limit 1024
  flush_at_shutdown true
  flush_interval 1s

#  slow_flush_log_threshold 40.0
  num_threads 2
  @id test_buffer
</match>

AWS Elasticsearch returns this error:

failed to flush the buffer. retry_time=0 next_retry_seconds=2018-11-21 09:12:17 +0000 chunk="57b292554695169bed0f30bc241d6172" error_class=Elasticsearch::Transport::Transport::Errors::RequestEntityTooLarge error="[413] {\"Message\":\"Request size exceeded 10485760 bytes\"}"

How can I limit the size of the data sent to the Elasticsearch instance? Is chunk_size_limit the right setting to use? All my attempts ended with the same error (...Request size exceeded...).
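[Editor's note: a hedged sketch of the usual fix, assuming td-agent with fluentd v0.12-style configuration as in the snippet above. In that syntax the parameter that caps the size of each flushed chunk (and therefore each bulk request) is buffer_chunk_limit, not chunk_size_limit; it accepts size suffixes like k/m/g. Keeping each chunk well under the 10 MB cap should avoid the 413:]

  <match test.**>
    @type elasticsearch
    # ... same connection settings as above ...
    buffer_type file
    buffer_path /var/log/td-agent/test.buffer
    # v0.12-style name for the per-chunk size cap; the bare
    # "chunk_size_limit 1024" above is not a recognized parameter.
    # 8m stays safely under the 10 MB AWS request limit.
    buffer_chunk_limit 8m
    buffer_queue_limit 1024
    flush_interval 1s
  </match>

[In the newer fluentd v1 syntax the equivalent is chunk_limit_size inside a <buffer> section.]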

Help would be appreciated!

Tom

Mr. Fiber

Nov 21, 2018, 7:03:00 AM
to flu...@googlegroups.com
