Hi,

I am facing a situation where the bucket's object listing is more than 400k entries, and gsutil ls -l is getting killed by the system because it consumes too much memory. I am on an m1.small instance (1.7 GB RAM).

Can I accomplish the listing with gsutil for such a large number of objects? Or can I ask gsutil to append 1000 entries at a time to a file, instead of collecting the whole list in memory and then writing it to stdout?

If modifying gsutil is the only way to achieve this, please let me know which area of the gsutil code I should look at.

Thanks,
-Sachin
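(For context, a minimal sketch of the kind of workaround being asked about, using boto, the library gsutil itself is built on. This assumes boto is installed and that your ~/.boto file contains Google Storage interoperability (HMAC) credentials; the bucket name and output filename are placeholders. bucket.list() pages through keys lazily, roughly 1000 per request, so only one page of metadata is held in memory at a time while each entry is appended to the file.)

import boto

# Uses Google Storage credentials from ~/.boto (interoperability/HMAC keys assumed).
conn = boto.connect_gs()
bucket = conn.get_bucket('my-bucket')  # placeholder bucket name

with open('object_listing.txt', 'w') as out:
    # bucket.list() is a lazy, paginated iterator, so the full 400k-object
    # list is never materialized in memory at once.
    for key in bucket.list():
        out.write('%12d  %s\n' % (key.size, key.name))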