Re: [gs-discussion] ls command getting killed


Mike Schwartz (Google Storage Team)

Mar 4, 2013, 12:08:50 PM
to gs-dis...@googlegroups.com
Hi Sachin,

What version of gsutil are you running? Some time back we changed the implementation to stream bucket listings and process results on the fly. Prior to that, it would build a complete listing in memory, which could result in running out of memory for large bucket listings.
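To illustrate the difference (this is only a sketch, not gsutil's actual code — `list_pages` is a hypothetical stand-in for a paginated bucket-listing API): streaming processes each page of results as it arrives, so memory use is bounded by the page size rather than the total number of objects.

```python
# Illustrative sketch of streaming a large listing page by page,
# rather than accumulating the whole listing in memory first.
# list_pages() is a hypothetical stand-in for a paginated list API.

def list_pages(total_objects, page_size=1000):
    """Yield successive pages of fake object names."""
    for start in range(0, total_objects, page_size):
        end = min(start + page_size, total_objects)
        yield [f"object-{i}" for i in range(start, end)]

def count_streaming(total_objects):
    # Streaming: handle each page on the fly; at most one page
    # (page_size entries) is held in memory at a time.
    count = 0
    for page in list_pages(total_objects):
        count += len(page)  # e.g., print or write each entry here instead
    return count

print(count_streaming(400_000))
```

With the older accumulate-then-print approach, a 400k-object listing would all be resident in memory at once, which is what exhausts RAM on a small instance.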

Please run gsutil update, then try your command again, and let us know if the problem persists.

Thanks,

Mike



On Mon, Mar 4, 2013 at 3:45 AM, sachin kale <brigh...@gmail.com> wrote:
hi,

I am facing a situation where a bucket contains more than 400k objects, and gsutil ls -l is getting killed by the system because it consumes too much memory. I am on an m1.small instance (1.7 GB RAM).
Can I list such a large set of objects using gsutil?

Or can I ask gsutil to append entries to a file in batches of 1000, instead of collecting the full listing in memory and then writing it to stdout?

If modifying gsutil is the only way to achieve this, let me know which area of the gsutil code to look at.

Thanks.
-Sachin

--
You received this message because you are subscribed to the Google Groups "Google Cloud Storage" group.
To unsubscribe from this group and stop receiving emails from it, send an email to gs-discussio...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.
