gaierror with S3


Eric Montgomery

Mar 16, 2009, 1:26:46 PM
to boto-users
I am using boto to handle S3 storage for a couple of Django sites.
One site is complete and working; the other is in development and
has started giving me a lot of trouble.

When I try to start the development server, the django-based
S3BotoStorage code calls the boto code to set up the S3 connection
with connection.create_bucket(). This makes its way down to the _mexe
function in boto/connection.py, which is where the code is failing.
The connection attempt fails with an exception from the socket layer:
"socket.gaierror: (8, 'nodename nor servname provided, or
not known')". Boto retries the request a few times before giving me
that error, but it has failed every time.
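For context, socket.gaierror is the exception getaddrinfo() raises when a DNS lookup fails, which means the hostname boto is trying to contact is not resolving. A quick way to check resolution outside of boto is a standalone sketch like this (the hostnames in the comments are illustrative; the one to test depends on your bucket and calling format):

```python
import socket

def can_resolve(hostname, port=443):
    """Return True if DNS can resolve hostname.

    socket.gaierror is raised by getaddrinfo() when the lookup
    fails -- the same error appearing in the traceback above.
    """
    try:
        socket.getaddrinfo(hostname, port)
        return True
    except socket.gaierror:
        return False

# The hostname to test depends on the calling format, e.g.:
# can_resolve('s3.amazonaws.com')
# can_resolve('<your-bucket>.s3.amazonaws.com')
```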

The confusing thing is that the already completed site works fine,
with almost the exact same settings - it just uses a different S3
account.

I have absolutely no idea what this error is, and can't seem to find
much help with a google search.
Any help would be really appreciated.

-Eric

mitch

Mar 16, 2009, 1:31:50 PM
to boto-users
Hi -

Would it be possible to turn on the httplib.py debugging? One way to
do that is to place the following in your ~/.boto file:

[Boto]
debug = 2

Also, configuring the boto logging channel to the DEBUG level might
produce some useful info. I think there is something wrong with the
request being made.
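A minimal stdlib-logging setup for that would look something like the following (a sketch; 'boto' is the logging channel name the library logs under, so adjust if your version differs):

```python
import logging

# Route log output to stderr and raise the 'boto' channel to
# DEBUG so the request/response details show up.
logging.basicConfig(level=logging.DEBUG)
logging.getLogger('boto').setLevel(logging.DEBUG)
```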

Mitch

Eric Montgomery

Mar 16, 2009, 1:44:34 PM
to boto-users
I assume if ~/.boto does not exist, I should just create it...
Where will the log file be?

Just to do a little debugging on my own, I put a print statement
inside the _mexe function so that I could see the arguments it was
receiving.
It prints out the method, path, data, headers, host and sender:
PUT, /, , {'Date': 'Mon, 16 Mar 2009 17:37:31 GMT', 'Content-Length':
0, 'Authorization': 'AWS <my_access_key>:<my_encrypted_secret_key>',
'User-Agent': 'Boto/1.3a (darwin)'}, <my_bucket_name>, None

For what it is worth, those values are the same as the ones that get
passed to _mexe when I run it on the working site (except of the
course access key, secret key and bucket name are different).

-Eric

Eric Montgomery

Mar 16, 2009, 2:02:03 PM
to boto-users
Well, I just realized I wasn't using the latest version of boto, and
of course, once I updated, everything worked.
Not sure why, but I honestly don't really care for now.

Thanks for your help, Mitch.

-Eric

Eric Montgomery

Apr 3, 2009, 8:58:30 PM
to boto-users
Well, I seem to have somehow reverted back to the same problem with
the gaierror when trying to connect despite being on the latest
version of boto (1.6b).

As I mentioned, I still don't have any problem with the existing site
which uses a different S3 account. I even plugged the Access Key,
Secret Key and Bucket from the working site into the settings.py file
for the new site, and things worked fine. Conversely, when I plugged
the Access Key, Secret Key and Bucket from the new site into the
settings.py file for the working site, it gave me the same gaierror.

I've narrowed down the error to the line "connection.request(method,
path, data, headers)" which is just inside the main while loop in the
function _mexe() in the file boto/connection.py. When I add some print
statements to see what's going on, I can see that the arguments to
request() are exactly the same for both the old and new sites except
for the Access Key and Secret Key, so it's almost as if something is
wrong with the new S3 account itself. However, I can easily connect to
both S3 accounts through an app like Transmit.app, so I'm not sure
that's it.

Is there something I have to do to allow boto to access a "new" S3
account that I either did not have to when I set up the old one or
that I have just forgotten? If that's not it, does boto save some S3
account information somewhere that prevents me from going back and
forth between S3 accounts?

mitch

Apr 3, 2009, 9:48:57 PM
to boto-users
Just out of curiosity, what is the name of the problematic bucket?
Or, if you would prefer not to provide the name, does the bucket in
question follow the more restrictive DNS-based rules for bucket names?
See:

http://docs.amazonwebservices.com/AmazonS3/latest/BucketRestrictions.html

for details on the more restrictive names.
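For reference, the stricter DNS-based rules boil down to roughly the following check (a rough sketch of the rules on that page, not an official validator):

```python
import re

def is_dns_compliant(bucket):
    """Rough check of the stricter DNS-based bucket naming rules:
    3-63 characters, lowercase letters, digits, hyphens and dots,
    each dot-separated label starting and ending with a letter or
    digit, and not formatted like an IP address."""
    if not 3 <= len(bucket) <= 63:
        return False
    if re.fullmatch(r'\d{1,3}(\.\d{1,3}){3}', bucket):
        return False  # looks like an IP address
    label = r'[a-z0-9]([a-z0-9-]*[a-z0-9])?'
    return re.fullmatch(r'{0}(\.{0})*'.format(label), bucket) is not None
```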

Mitch

Eric Montgomery

Apr 4, 2009, 12:28:40 AM
to boto-users
My bucket names are both pretty plain (in fact, the non-working one is
the same as the working one except for the addition of two letters).

I also narrowed the problem down a bit more: I was trying to use
VHostCallingFormat, but when I go back to the default
SubdomainCallingFormat, it works. VHostCallingFormat does work with
the old bucket, though, so it's still acting weird.
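That pattern would fit how the two calling formats build the hostname. Roughly (a simplified sketch of the idea in boto/s3/connection.py, not the library's actual code): SubdomainCallingFormat connects to <bucket>.s3.amazonaws.com, which Amazon's wildcard DNS resolves for any bucket, while VHostCallingFormat uses the bucket name itself as the hostname, which only resolves if DNS has a record for that name, and otherwise getaddrinfo fails with exactly this gaierror.

```python
S3_SERVER = 's3.amazonaws.com'

def subdomain_host(bucket):
    # SubdomainCallingFormat: Amazon's wildcard DNS resolves
    # <bucket>.s3.amazonaws.com for any bucket name.
    return '%s.%s' % (bucket, S3_SERVER)

def vhost_host(bucket):
    # VHostCallingFormat: the bucket name itself is the hostname,
    # so it only resolves if a CNAME for that name points at S3.
    return bucket

print(subdomain_host('mybucket'))        # mybucket.s3.amazonaws.com
print(vhost_host('photos.example.com'))  # photos.example.com
```

If the old bucket has a CNAME record set up and the new one does not, that would explain why VHostCallingFormat works for one and raises gaierror for the other.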

-Eric

On Apr 3, 9:48 pm, mitch <Mitch.Garn...@gmail.com> wrote:
> Just out of curiosity, what is the name of the problematic bucket?
> Or, if you would prefer not to provide the name, does the bucket in
> question follow the more restrictive DNS-based rules for bucket names?
> See:
>
> http://docs.amazonwebservices.com/AmazonS3/latest/BucketRestrictions....