Difficulties using aws_s3 to upload files

Alan Sparks

Oct 9, 2017, 2:09:03 PM
to Ansible Project
ansible 2.4.0.0
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/home/sparksa/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/dist-packages/ansible
  executable location = /usr/bin/ansible
  python version = 2.7.12 (default, Nov 19 2016, 06:48:10) [GCC 5.4.0 20160609]

Also, botocore-1.7.25, boto-2.48.0, boto3-1.4.4-py2.7

I'm having a lot of trouble trying to use the "aws_s3" module, getting a 403 error every time I run it.   For instance:
    - aws_s3:
        profile: "{{ aws_profile }}"
        region: "{{ aws_region }}"
        bucket: "{{ aws_profile }}-{{ cluster_name }}-bigiq"
        object: "credentials.txt"
        #src: "{{ tempfile.path }}"
        src: foo.txt
        mode: put
        ignore_nonexistent_bucket: True
      register: s3object
    - debug: var=s3object

The task fails with:
The full traceback is:
Traceback (most recent call last):
  File "/tmp/ansible_b8Si5z/ansible_module_aws_s3.py", line 792, in <module>
    main()
  File "/tmp/ansible_b8Si5z/ansible_module_aws_s3.py", line 671, in main
    if module.md5(src) == keysum(module, s3, bucket, obj):
  File "/tmp/ansible_b8Si5z/ansible_module_aws_s3.py", line 277, in keysum
    key_check = s3.head_object(Bucket=bucket, Key=obj)
  File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 310, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 599, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden

Without the "ignore_nonexistent_bucket" option, it also fails with a 403, but a little differently:
The full traceback is:
Traceback (most recent call last):
  File "/tmp/ansible_TnLiij/ansible_module_aws_s3.py", line 289, in bucket_check
    s3.head_bucket(Bucket=bucket)
  File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 310, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 599, in _make_api_call
    raise error_class(parsed_response, operation_name)
ClientError: An error occurred (403) when calling the HeadBucket operation: Forbidden

I've tried creating the bucket both with Ansible (s3_bucket) and by hand, with the same result. I've also tried uploading the file with awscli, and I can upload it successfully using the same profile.

Can anyone offer any advice on what I may be doing wrong? Since I can upload files and list buckets using awscli, it wouldn't seem to be a real permissions problem...
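To rule the credentials themselves in or out, one quick check is to drive the same head calls through boto3 directly with the same profile. A minimal sketch (the profile, region and bucket names below are placeholders):

    import boto3

    # Placeholders: substitute the real profile, region and bucket names.
    session = boto3.Session(profile_name='myprofile', region_name='us-west-2')
    s3 = session.client('s3')

    # Same calls the aws_s3 module makes before the upload. head_object raises
    # a ClientError with 404 if the key doesn't exist yet, or 403 if the
    # credentials can't see the bucket/object at all.
    print(s3.head_bucket(Bucket='myprofile-mycluster-bigiq'))
    print(s3.head_object(Bucket='myprofile-mycluster-bigiq', Key='credentials.txt'))

If that succeeds, the profile itself is fine and the problem is in how the module builds its connection.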


Alan Sparks

Oct 9, 2017, 2:19:14 PM
to Ansible Project
Also, I should mention that the profile I am using in this task has Administrator-level access.

Alan Sparks

Oct 9, 2017, 5:37:04 PM
to Ansible Project
Some more testing... I've found that if I use the "old" auth options (aws_access_key, aws_secret_key, security_token) it works. It throws the 403 if I remove those and put back the "profile" option, which is advertised as supported and which works fine with the s3_bucket module.

Is this right? Is the "aws_s3" module broken such that it doesn't support "profile"?
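For reference, the two auth paths reduce to roughly this at the boto3 level (a sketch with placeholder values, not the module's actual code):

    import boto3

    # "Old" options (aws_access_key / aws_secret_key / security_token):
    # the credentials are handed straight to the client. This path works.
    s3_keys = boto3.client(
        's3',
        aws_access_key_id='AKIA...',        # placeholder
        aws_secret_access_key='...',        # placeholder
        aws_session_token=None,             # or an STS token if one is in use
    )

    # "profile" option: boto3 should build a session from the named profile
    # in ~/.aws/credentials. This is the path that 403s via aws_s3.
    s3_profile = boto3.Session(profile_name='myprofile').client('s3')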

Alan Sparks

Oct 9, 2017, 6:47:31 PM
to Ansible Project
Pretty sure I've found a bug in the aws_s3 module. Looking at modules/cloud/amazon/aws_s3.py, around line 602:

    # Look at s3_url and tweak connection settings
    # if connecting to RGW, Walrus or fakes3
    for key in ['validate_certs', 'security_token', 'profile_name']:
        aws_connect_kwargs.pop(key, None)

The comment implies these settings should only be tweaked when connecting to RGW, Walrus or fakes3, but there's no conditional for that: the keys, including profile_name, are popped unconditionally. If I copy the module and comment out this block, profiles work. Seems like a mistake.
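If the intent is what the comment says, the pop would need to be guarded on the alternate-endpoint case, something like the sketch below. This is only meant to show the shape of the fix; I'm assuming the surrounding code has s3_url and rgw in scope, so it isn't an exact patch:

    # Sketch: only strip these connection kwargs when an alternate endpoint
    # (RGW / Walrus / fakes3) is in play, so plain-S3 runs keep profile_name.
    if s3_url is not None or rgw:
        for key in ['validate_certs', 'security_token', 'profile_name']:
            aws_connect_kwargs.pop(key, None)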
-Alan

Alan Sparks

Oct 9, 2017, 6:57:05 PM
to Ansible Project
Someone beat me to finding this:
