SSL error attempting to use s3 plugin to openstack ceph


Christian Hedegaard

Apr 29, 2015, 3:55:44 PM
to flu...@googlegroups.com

Hey all.. We’re deploying a new site inside openstack using radosgw and ceph as our “S3” backend.

 

I’ve been working through some difficulties getting the fluentd s3 output plugin working. Initially I was attempting an ip/port combo for “s3_endpoint”, but realized that the code tries to use DNS and request “bucket.s3_endpoint.com”. So I got wildcard DNS up and running to handle that (though, honestly, I’d still prefer being able to specify an IP).
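For illustration (hostnames hypothetical), the difference between the two addressing styles looks like this; aws-sdk-v1 also has an :s3_force_path_style option that might sidestep the wildcard-DNS requirement, though that is worth verifying against the SDK docs:

```ruby
# Sketch: aws-sdk-v1 defaults to virtual-hosted-style addressing, which
# prepends the bucket to the endpoint as a subdomain -- hence the
# wildcard-DNS requirement. Hostnames here are placeholders.
endpoint = "s3.something.com:8080"
bucket   = "bucket_name"

# What the SDK requests by default (needs *.s3.something.com in DNS):
virtual_hosted = "http://#{bucket}.#{endpoint}/object.log"

# Path-style keeps the endpoint host fixed, so a bare IP could work:
path_style = "http://#{endpoint}/#{bucket}/object.log"

puts virtual_hosted
puts path_style
```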

 

Now I’m getting SSL errors:

2015-04-29 19:48:23 +0000 [error]: failed to configure/start sub output s3: can't call S3 API. Please check your aws_key_id / aws_sec_key or s3_region configuration. error = #<OpenSSL::SSL::SSLError: SSL_connect returned=1 errno=0 state=SSLv2/v3 read server hello A: unknown protocol>

 

I know the aws keys are valid, as I’ve got s3cmd (pip install s3cmd) running and able to list buckets using that pair. It also takes a host/port combo for talking to S3 and works fine.
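For comparison, s3cmd points at the gateway through ~/.s3cfg; a sketch with placeholder values:

```
# ~/.s3cfg (sketch; host and keys are placeholders)
access_key = <redacted>
secret_key = <redacted>
host_base = s3.something.com:8080
host_bucket = %(bucket)s.s3.something.com:8080
use_https = False
```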

 

A little help would be appreciated.

 

We CAN attempt to do this using the fluentd swift plugin (radosgw will emulate swift), but would prefer not to re-implement all the work that I’ve already done using fluentd -> s3.

 

Here is the config (sanitized):
<match instrument.data.s3.**>
  type forest
  subtype s3
  <template>
    aws_key_id <redacted>
    aws_sec_key <redacted>
    s3_bucket bucket_name
    s3_object_key_format %{path}region-${tag}-guid-%{time_slice}_%{index}.log.%{file_extension}
    s3_endpoint s3.something.com:8080  # <- Yes, we need that port number in there
    path metric_logs/${tag}/
    buffer_path /data/logging/s3_buffer/${tag}
    time_slice_format %Y-%m-%d-%H
    time_slice_wait 10m
    utc
    format json
  </template>
</match>

 

 

Christian Hedegaard-Schou
Red 5 Studios - Technical Operations
Email: chede...@red5studios.com

 

Mr. Fiber

Apr 29, 2015, 4:10:32 PM
to flu...@googlegroups.com
Hmm...

First, does the bundled s3 gem work or not?
For example: /opt/td-agent/embedded/bin/ruby test_s3_gem.rb  # write a simple upload script

And does your OpenStack environment have its own certificate?
If so, setting an environment variable in /etc/sysconfig/td-agent on CentOS or
/etc/default/td-agent on Ubuntu may help:

export SSL_CERT_FILE=/etc/pki/tls/cert.pem

In td-agent, the bundled Ruby libraries use td-agent's own certificate.


Masahiro

--
You received this message because you are subscribed to the Google Groups "Fluentd Google Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email to fluentd+u...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Christian Hedegaard

Apr 29, 2015, 4:20:24 PM
to flu...@googlegroups.com

I’m a systems engineer, not a developer, and I really, really don’t like Ruby :) Do you have a sample script for testing the s3 gem?

 

We do not, as far as I know, have our own cert for our environment. I will ask the right people about that one though.

Mr. Fiber

Apr 29, 2015, 6:43:32 PM
to flu...@googlegroups.com
Test script is here:

require 'aws-sdk-v1'

options = {}
options[:access_key_id] = ENV['AWS_ACCESS_KEY_ID']
options[:secret_access_key] = ENV['AWS_SECRET_ACCESS_KEY']
#options[:s3_endpoint] = # your endpoint here
options[:use_ssl] = true

s3 = AWS::S3.new(options)
bucket = s3.buckets['test_bucket']
bucket.objects['gem_test'].write('test', :content_type => 'text/plain')

You should fill in s3_endpoint.

BTW, if that's OK with you, setting use_ssl to false is another approach for an internal connection.
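Applied to a <template> block like the one earlier in the thread, that is one extra line (sketch):

```
<template>
  # ... aws_key_id, aws_sec_key, s3_bucket, s3_endpoint as before ...
  use_ssl false
</template>
```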

Christian Hedegaard

Apr 29, 2015, 7:33:15 PM
to flu...@googlegroups.com

Yeah, I looked more closely at the source and realized that I can just disable ssl with “use_ssl false”.

 

However, now I’m getting a different error:

 

2015-04-29 21:28:52 +0000 [error]: failed to configure/start sub output s3: can't call S3 API. Please check your aws_key_id / aws_sec_key or s3_region configuration. error = #<AWS::S3::Errors::AccessDenied: <?xml version="1.0" encoding="UTF-8"?><Error><Code>AccessDenied</Code></Error>>

 

I know my keys are correct, as they work with s3cmd. One suggestion was that we might have to use S3 signature version 2, which is what boto needs in order to work. It looks like there is an option in the S3 gem called :s3_signature_version that defaults to v4.

 

How hard would it be to modify the s3 plugin to accept this as a config option for testing?

Christian Hedegaard

Apr 29, 2015, 7:41:43 PM
to flu...@googlegroups.com

Ok here we go:

root@log-01:/opt/td-agent/embedded/bin# ./ruby s3_test.rb
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/aws-sdk-v1-1.63.0/lib/aws/core/client.rb:375:in `return_or_raise': <?xml version="1.0" encoding="UTF-8"?><Error><Code>NoSuchBucket</Code></Error> (AWS::S3::Errors::NoSuchBucket)
        from /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/aws-sdk-v1-1.63.0/lib/aws/core/client.rb:476:in `client_request'
        from (eval):3:in `put_object'
        from /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/aws-sdk-v1-1.63.0/lib/aws/s3/s3_object.rb:1765:in `write_with_put_object'
        from /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/aws-sdk-v1-1.63.0/lib/aws/s3/s3_object.rb:611:in `write'
        from s3_test.rb:11:in `<main>'

root@log-01:/opt/td-agent/embedded/bin# s3cmd mb s3://log_chs
Bucket 's3://log_chs/' created

root@log-01:/opt/td-agent/embedded/bin# ./ruby s3_test.rb
[no output.. success?]

root@log-01:/opt/td-agent/embedded/bin# s3cmd ls s3://log_chs
2015-04-29 23:40         4   s3://log_chs/gem_test

 

Yup!! Success.

 

Fluentd/td-agent still doesn’t work :(

Mr. Fiber

Apr 29, 2015, 7:51:03 PM
to flu...@googlegroups.com
Hmm...

Could you add a "p OpenSSL::X509::DEFAULT_CERT_FILE" line to s3_test.rb?
For example,

p OpenSSL::X509::DEFAULT_CERT_FILE
options = {}
# ...

> One thing suggested is that we might have to use S3 signature version 2, which is what boto needs in order to work. It looks like there is an option in the S3 gem called :s3_signature_version that defaults to v4.

Ah, I see. Maybe adding a signature_version option is better.

> How hard would it be to modify the s3 plugin to accept this as a config option for testing?

I think adding one line to the s3 plugin under /opt/td-agent/embedded/lib/ruby is enough.


I'm not sure of the AWS SDK's configuration option name. Maybe like below?

options[:s3_signature_version] = 2 # or 'v2'?
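A minimal sketch of that idea on a plain options hash like the one fluent-plugin-s3 builds; whether aws-sdk-v1 accepts :s3_signature_version here, and whether it wants a symbol such as :v2, is an assumption to check against the SDK documentation:

```ruby
# Sketch (assumption): pass the signature version alongside the credentials.
# Keys and values are placeholders, not the real configuration.
options = {}
options[:access_key_id]        = 'placeholder'
options[:secret_access_key]    = 'placeholder'
options[:s3_signature_version] = :v2  # the proposed extra line

p options[:s3_signature_version]
```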


Christian Hedegaard

Apr 29, 2015, 8:06:56 PM
to flu...@googlegroups.com

So I’ve tried s3_signature_version v3 and v4… Both end up with the same error as before:

2015-04-30 00:02:07 +0000 [error]: failed to configure/start sub output s3: can't call S3 API. Please check your aws_key_id / aws_sec_key or s3_region configuration. error = #<AWS::S3::Errors::AccessDenied: <?xml version="1.0" encoding="UTF-8"?><Error><Code>AccessDenied</Code></Error>>

 

V2 doesn’t seem to be supported by the AWS SDK, but I tried it anyway:

 

2015-04-30 00:05:33 +0000 [error]: failed to configure/start sub output s3: can't call S3 API. Please check your aws_key_id / aws_sec_key or s3_region configuration. error = #<AWS::S3::Errors::AccessDenied: <?xml version="1.0" encoding="UTF-8"?><Error><Code>AccessDenied</Code></Error>>

 

Same error :(

 

It seems like something is being lost in translation between the fluentd plugin and the AWS SDK, since we now know that the AWS SDK works fine via your Ruby script. I’m not convinced it cares about the signature_version anymore. That was probably a red herring :(

Christian Hedegaard

Apr 29, 2015, 8:10:11 PM
to flu...@googlegroups.com

Here’s the entire traceback:

2015-04-30 00:05:34 +0000 [error]: failed to configure/start sub output s3: can't call S3 API. Please check your aws_key_id / aws_sec_key or s3_region configuration. error = #<AWS::S3::Errors::AccessDenied: <?xml version="1.0" encoding="UTF-8"?><Error><Code>AccessDenied</Code></Error>>
2015-04-30 00:05:34 +0000 [error]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-s3-0.5.7/lib/fluent/plugin/out_s3.rb:154:in `rescue in check_apikeys'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-s3-0.5.7/lib/fluent/plugin/out_s3.rb:150:in `check_apikeys'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-s3-0.5.7/lib/fluent/plugin/out_s3.rb:95:in `start'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-forest-0.3.0/lib/fluent/plugin/out_forest.rb:133:in `block in plant'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-forest-0.3.0/lib/fluent/plugin/out_forest.rb:128:in `synchronize'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-forest-0.3.0/lib/fluent/plugin/out_forest.rb:128:in `plant'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-forest-0.3.0/lib/fluent/plugin/out_forest.rb:168:in `emit'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/event_router.rb:88:in `emit_stream'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/plugin/in_forward.rb:142:in `on_message'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/plugin/in_forward.rb:238:in `call'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/plugin/in_forward.rb:238:in `block in on_read_msgpack'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/plugin/in_forward.rb:237:in `feed_each'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/plugin/in_forward.rb:237:in `on_read_msgpack'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/plugin/in_forward.rb:223:in `call'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/plugin/in_forward.rb:223:in `on_read'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/cool.io-1.3.0/lib/cool.io/io.rb:123:in `on_readable'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/cool.io-1.3.0/lib/cool.io/io.rb:186:in `on_readable'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/cool.io-1.3.0/lib/cool.io/loop.rb:88:in `run_once'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/cool.io-1.3.0/lib/cool.io/loop.rb:88:in `run'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/plugin/in_forward.rb:91:in `run'

Mr. Fiber

Apr 30, 2015, 1:51:46 PM
to flu...@googlegroups.com
Could you test the "check_apikey_on_start false" configuration?
My script tests only the write method of the AWS SDK.

If that also fails, could you paste the result of "p OpenSSL::X509::DEFAULT_CERT_FILE" from s3_test.rb?
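For reference, that toggle sits alongside the other options in the <template> block (sketch):

```
<template>
  # ... existing settings ...
  check_apikey_on_start false
</template>
```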


Christian Hedegaard

Apr 30, 2015, 2:17:43 PM
to flu...@googlegroups.com

Tried that. Now we get a new error :)

 

2015-04-30 18:15:38 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2015-04-30 18:15:38 +0000 error_class="AWS::S3::Errors::Forbidden" error="AWS::S3::Errors::Forbidden" plugin_id="object:3f9dd2918340"
  2015-04-30 18:15:38 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/aws-sdk-v1-1.63.0/lib/aws/core/client.rb:375:in `return_or_raise'
  2015-04-30 18:15:38 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/aws-sdk-v1-1.63.0/lib/aws/core/client.rb:476:in `client_request'
  2015-04-30 18:15:38 +0000 [warn]: (eval):3:in `head_object'
  2015-04-30 18:15:38 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/aws-sdk-v1-1.63.0/lib/aws/s3/s3_object.rb:296:in `head'
  2015-04-30 18:15:38 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/aws-sdk-v1-1.63.0/lib/aws/s3/s3_object.rb:273:in `exists?'
  2015-04-30 18:15:38 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-s3-0.5.7/lib/fluent/plugin/out_s3.rb:122:in `write'
  2015-04-30 18:15:38 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/buffer.rb:325:in `write_chunk'
  2015-04-30 18:15:38 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/buffer.rb:304:in `pop'
  2015-04-30 18:15:38 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/output.rb:321:in `try_flush'
  2015-04-30 18:15:38 +0000 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.7/lib/fluent/output.rb:140:in `run'

 

Also, regarding the SSL part: I’ve gone ahead and set “use_ssl” to false, and we’re OK not using SSL entirely.

 

At this point I guess I’m more concerned about just getting the S3 plugin to work with OpenStack. Your test script worked, so I see no reason why the plugin wouldn’t... Hrm.

Christian Hedegaard

Apr 30, 2015, 2:25:48 PM
to flu...@googlegroups.com

Just for trouble-shooting purposes, I went ahead and created the bucket that this would be trying to write objects to.

 

Same error as below after a restart. FYI.

Mr. Fiber

Apr 30, 2015, 2:51:13 PM
to flu...@googlegroups.com
Hmm... very strange.
Could you add a bucket.objects['gem_test_2'].exists? call to s3_test.rb?
The code is like below:

bucket.objects['gem_test_2'].exists?
bucket.objects['gem_test_2'].write('test', :content_type => 'text/plain')

I want to know whether this API works or not, because the error is raised by this API call.


Christian Hedegaard

Apr 30, 2015, 4:36:29 PM
to flu...@googlegroups.com

Here we go:

 

root@log-01:/opt/td-agent/embedded/bin# ./ruby s3_test.rb
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/aws-sdk-v1-1.63.0/lib/aws/core/client.rb:375:in `return_or_raise': <?xml version="1.0" encoding="UTF-8"?><Error><Code>NoSuchBucket</Code></Error> (AWS::S3::Errors::NoSuchBucket)
        from /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/aws-sdk-v1-1.63.0/lib/aws/core/client.rb:476:in `client_request'
        from (eval):3:in `put_object'
        from /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/aws-sdk-v1-1.63.0/lib/aws/s3/s3_object.rb:1765:in `write_with_put_object'
        from /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/aws-sdk-v1-1.63.0/lib/aws/s3/s3_object.rb:611:in `write'
        from s3_test.rb:12:in `<main>'

root@log-01:/opt/td-agent/embedded/bin# s3cmd mb s3://log_chs
Bucket 's3://log_chs/' created

root@log-01:/opt/td-agent/embedded/bin# ./ruby s3_test.rb
[no output]

root@log-01:/opt/td-agent/embedded/bin# s3cmd ls s3://log_chs
2015-04-30 19:04         4   s3://log_chs/gem_test_2

root@log-01:/opt/td-agent/embedded/bin# cat s3_test.rb
require 'aws-sdk-v1'

options = {}
options[:access_key_id] = ''
options[:secret_access_key] = ''
options[:s3_endpoint] = 's3-endpoint:8080'
options[:use_ssl] = false

s3 = AWS::S3.new(options)
bucket = s3.buckets['log_chs']
bucket.objects['gem_test_2'].exists?
bucket.objects['gem_test_2'].write('test', :content_type => 'text/plain')

 

Mr. Fiber

Apr 30, 2015, 4:53:02 PM
to flu...@googlegroups.com
Toooo strange...

The remaining difference is the bucket and the created object path.
Could you change these values to your actual configuration values?
I want to know whether fluentd has permission or not.
In S3, users sometimes forget to add a permission or to update an IAM role setting.

BTW, are there any useful logs on your OpenStack side?

