Hi,
We have a fix for this problem now in gsutil release 3.18. You can get it
by running:
gsutil update
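To confirm the update took effect, you can check the installed release
afterward (the exact output format varies between versions):

    # report the installed gsutil release
    gsutil version

which should show 3.18 or later.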
Thank you for your patience. Please let us know if you continue to have
problems after applying this update.
Mike
Original Message Follows:
------------------------
From: sgm <sogma...@gmail.com>
Subject: Re: [gs-discussion] Large Uploads
Date: Sun, 16 Sep 2012 01:50:35 -0700 (PDT)
> Thanks Mike! btw, I've been doing some more tests and it seems to get
> stuck at "Catching up md5 for resumed upload" (on small files). Feels
> like a leak.
>
> On Saturday, September 15, 2012 11:43:25 AM UTC-7, Mike Schwartz (Google
> Storage Team) wrote:
> >
> > Hi,
> >
> > We are investigating this problem and will follow up once we have
> > information about it.
> >
> > Thanks,
> >
> > Mike
> >
> >
> >
> > On Sat, Sep 15, 2012 at 10:12 AM, sgm <sogma...@gmail.com> wrote:
> >
> >> I'm trying to upload a big directory (1 TB) with thousands of 'small'
> >> files (nohup gsutil -m cp -R -q /dir/* gs://bucket_name &). But after a
> >> couple of hours I can see the CPU usage skyrocketing to 100% and the
> >> network traffic dropping to 0 (for 24 hrs). My guess is that gsutil is
> >> trying to compute an md5... but none of the files is bigger than 20 MB.
> >> There is nothing useful in nohup.out. Any ideas or suggestions?
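For anyone narrowing down a similar hang, one approach (a sketch, not an
official procedure; the file and bucket names below are placeholders) is to
copy a single affected file with gsutil's global debug option and capture
the output:

    # copy one affected file with debug logging (names are placeholders)
    gsutil -D cp /dir/one_file gs://bucket_name 2> gsutil_debug.log

The -D option logs the HTTP requests gsutil issues, so a stall in the
"Catching up md5 for resumed upload" phase shows up as a long gap with no
request activity. Note also that whether a file is uploaded resumably is
governed by the resumable_threshold setting in the [GSUtil] section of the
.boto config, which may explain why even small files hit the resume path.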