Re: [gs-discussion] Importing data from S3?


Mike Schwartz (Google Storage Team)

Sep 30, 2012, 4:57:15 PM
to gs-dis...@googlegroups.com
Hi Evan,

You can do it using gsutil with a command like:

gsutil -m cp -r s3://your_s3_bucket gs://your_gs_bucket

Note that this copies all the data via the machine where you're running gsutil, but at least it doesn't require that all the data be downloaded to local disk: it streams down from S3 and back up to Google Cloud Storage.
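For reference, gsutil picks up the S3 side's credentials from the [Credentials] section of your ~/.boto config file. A minimal sketch of that setup (the key values and bucket names are placeholders, not real credentials):

```shell
# ~/.boto -- gsutil/boto reads AWS credentials from this section
# (shown here as comments; put these lines in the actual config file)
# [Credentials]
# aws_access_key_id = YOUR_AWS_ACCESS_KEY_ID
# aws_secret_access_key = YOUR_AWS_SECRET_ACCESS_KEY

# Parallel (-m), recursive (-r) copy from the S3 bucket to the GCS bucket.
# Data flows S3 -> this machine -> GCS, without touching local disk.
gsutil -m cp -r s3://your_s3_bucket gs://your_gs_bucket
```

The Google Cloud Storage credentials come from running "gsutil config" beforehand, which writes the GCS side of the same ~/.boto file.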

If you want to do it without copying through your local machine you could try writing an App Engine app to do it.

Mike



On Sun, Sep 30, 2012 at 11:28 AM, Evan Charlton <evanch...@gmail.com> wrote:
I've been searching around and it seems like this isn't possible, but I figured I'd ask here just to be sure:

Is there a way to transfer a bunch of accumulated data from Amazon S3 to Google Cloud Storage? I'd be fine if I had to specify HTTP URLs or something. It doesn't need to be clean, it just needs to work.

Thoughts on how I can do this without a download/reupload flow?

Thanks!
Evan


Evan Charlton

Sep 30, 2012, 10:05:43 PM
to gs-dis...@googlegroups.com, gs-...@google.com
Mike,

Thanks for the response! If I understand that correctly, it means that it's going to be using my local machine's bandwidth, yeah? If so, that's unfortunate because I'm assuming a direct S3 pull from Google would be much faster than my residential internet :-)

I considered writing an App Engine app to do it, but I was worried about the bandwidth cost. I'll look into it, though!

Evan

Mike Schwartz (Google Storage Team)

Sep 30, 2012, 10:08:49 PM
to gs-dis...@googlegroups.com
Hi Evan,

Yes, that's correct - using gsutil to do the transfer will use your local machine's bandwidth. Writing an App Engine app would avoid that.

Mike



