Re: [gs-discussion] Resumable Upload Exception


Google Cloud Storage Team

Oct 31, 2012, 9:14:15 AM
to gs-dis...@googlegroups.com
Hi Cathy,

I assume you ran 'gsutil config' to set up your access credentials. A 403 error usually indicates an access control problem. Just to rule out a permissions mismatch between the credentials used by your code and by gsutil, I'd suggest using gsutil to create a new bucket and then retrying the gsutil upload to an object in that new bucket. If that doesn't help, please send a trace of the session (minus any sensitive data, like OAuth tokens), captured with the gsutil -D option, to gs-...@google.com and we'll take a look.
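For example (the bucket and file names here are placeholders, not anything from your account):

gsutil config
gsutil mb gs://some-new-bucket
gsutil cp yourfile.zip gs://some-new-bucket
gsutil -D cp yourfile.zip gs://some-new-bucket 2> gsutil-debug.txt

The last command writes the debug trace (which includes the full HTTP exchange) to gsutil-debug.txt, so you can review and redact it before mailing it to us.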

Thanks,

Marc
Google Cloud Storage Team

On Mon, Oct 29, 2012 at 11:54 AM, Cathy <itdevtea...@gmail.com> wrote:
I'm trying to upload a 30GB zipped file so that I can insert the rows into a Google BigQuery table. I've attempted the upload a few times from the API interface, but the file never loaded and I did not receive an error message. So I downloaded the gsutil tool and installed it. Now I am receiving the error message "ResumableUploadException: Got status 403 from attempt to start resumable upload. Aborting."

Cathy

Nov 2, 2012, 12:12:21 PM
to gs-dis...@googlegroups.com, gs-...@google.com
I'm not writing an app. I'm running the cp command from the gsutil command line. I'll try gsutil config. My company does not want to invest too much time and money in BigQuery and Cloud Storage until we can upload and query 1B rows.

Cathy

Nov 2, 2012, 1:59:06 PM
to gs-dis...@googlegroups.com, gs-...@google.com
OK, I ran gsutil config and that worked! Thanks a bunch!

Cathy

Nov 16, 2012, 10:39:21 AM
to gs-dis...@googlegroups.com, gs-...@google.com
I was able to start the upload, but it ran for over a week and finally aborted with the error "ResumableUploadException: Too many resumable upload attempts failed without progress. You might try this upload again later." I have a 279GB file that I am compressing. This is the command I used:

python gsutil cp -z txt \\filepath gs://url
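(A note on that flag for later readers: -z tells gsutil to gzip-compress files whose extension matches the listed ones on the fly and to upload them with Content-Encoding: gzip set on the object. The path and bucket below are made-up placeholders, just like the ones above:

python gsutil cp -z txt C:\data\export.txt gs://example-bucket

Files whose extension is not in the -z list are uploaded uncompressed.)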

Mike Schwartz (Google Storage Team)

Nov 16, 2012, 11:27:35 AM
to gs-dis...@googlegroups.com
Hi Cathy,

Can you please tell us the real command line you used (i.e., with the real file name and bucket/object name)? If you'd prefer not to have that show up on gs-discussion, you can send it to gs-...@google.com. Thanks,

Mike

Cathy

Nov 26, 2012, 9:20:45 AM
to gs-dis...@googlegroups.com, gs-...@google.com
C:\gsutil>python gsutil cp -z txt \\crpssdw01\g$\BigQuery\BigQuery.csv gs://susser

Copying file://\\crpssdw01\g$\BigQuery\BigQuery.csv [Content-Type=application/octet-stream]...
Catching up md5 for resumed upload
Catching up md5 for resumed upload
Catching up md5 for resumed upload
Catching up md5 for resumed upload
Catching up md5 for resumed upload
ResumableUploadException: Too many resumable upload attempts failed without progress. You might try this upload again later.

The upload runs for several days and shows progress (it gets nearly complete) before failing.
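(For anyone who hits this later: each "Catching up md5" line means gsutil is re-reading the bytes that were already uploaded so it can recompute the checksum before resuming. As far as I know, re-running the same cp command shown above resumes from where the last attempt stopped, because gsutil keeps a resumable-upload tracker file under the user's home directory (e.g. ~/.gsutil); deleting that tracker file makes the upload start over from byte zero.)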

Cathy

Dec 11, 2012, 10:02:56 AM
to gs-dis...@googlegroups.com, gs-...@google.com
Mike, I posted the command I was using on Nov 26. Do you have any input or answers that would help me complete the upload?

Thanks,
Cathy