[s3 plugin] Not generating buffer file and send to s3


Genie Chon

Mar 30, 2021, 4:51:06 AM
to Fluentd Google Group
Hello, 

I am trying to run Fluentd to ship logs from another Docker container to S3.
Here is what I have done so far.

Below is my Dockerfile:

FROM fluent/fluentd@sha256:9c11b5c2d....(latest debian)

USER root
RUN gem install fluent-plugin-s3 --no-document
COPY fluent.conf /fluentd/etc/fluent.conf
USER fluent


Below is my fluent.conf:

<source>
  @type forward
  @id input1
  @label @mainstream
  port 24224
</source>

<label @mainstream>
  <match **>
    @type stdout
  </match>

  <match **>
    @type s3
    @id output_s3

    aws_key_id XXXX
    aws_sec_key XXXXX
    s3_bucket bucketname-for-test
    s3_region ap-northeast-2
    s3_object_key_format %{path}%{time_slice}%{uuid_flush}_%{index}.%{file_extension}
    path logs/%Y/%m/%d/rke/${docker_hostname}/${log_type}/${log_type}

    <buffer tag,time>
      @type file
      path /fluentd/s3_buffer
      timekey 60 # 1 min partition
      timekey_wait 1m
      timekey_use_utc true # use utc
      chunk_limit_size 256m
      flush_mode interval
      flush_interval 10s
    </buffer>
  </match>
</label>


When I build this image and run the following command:

docker run --log-driver=fluentd --log-opt fluentd-address=172.17.0.2:24224 --log-opt tag={{.ID}} ubuntu echo "Hello Fluentd"

The log does show up on stdout in the Fluentd container, but no buffer file for S3 is ever generated. When I used @type file as the output instead, it DID create a file containing the log, but whenever I try to use the s3 plugin, nothing happens.
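For reference, the @type file test I mentioned looked roughly like this (the output path here is an assumption, not my exact config); this variant did write log files:

```
<match **>
  @type file
  # assumed path for the test output
  path /fluentd/log/test
</match>
```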

Can somebody review my Dockerfile and conf and let me know if there is anything that I need to do?


Also, I granted three bucket/object permissions:

"s3:PutObject",
"s3:GetObject"

on bucket/*, and

"s3:ListBucket"

on the bucket itself.
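In IAM JSON terms, the policy I attached looks roughly like this (a sketch only; ARNs are inferred from the bucket name in my config, and the exact statement wording may differ):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Resource": "arn:aws:s3:::bucketname-for-test/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::bucketname-for-test"
    }
  ]
}
```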
