Maximum file size limit


Ankit Shah

May 23, 2012, 3:06:04 PM
to Google Cloud Storage
Hi,

I am uploading files from one of my .NET-based applications, and when I
upload anything larger than about 200 MB I get the error "object
reference not set to an instance of an object". I am getting this error
from the Google Storage API while adding the bucket.

So kindly tell me whether this is a file size issue or something else,
and also what the maximum file size is that I can upload to Google Cloud
Storage through an application.

Thanks,
Ankit.

Navneet (Google)

May 23, 2012, 3:22:56 PM
to gs-dis...@googlegroups.com
Hi,

The maximum file size you can upload to Google Cloud Storage is 5 TB, although your upload must complete within a week so it may be smaller if you have a slower internet connection. It doesn't sound like either of those is the problem in your case.
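(Back-of-the-envelope: 5 TB is roughly 4 × 10^13 bits and a week is 604,800 seconds, so finishing a full 5 TB upload inside the one-week window takes a sustained upload rate on the order of 66 Mbit/s.)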

From the information in your email, it doesn't sound like the error is coming from our servers. Please send the request and response headers to gs-...@google.com if you'd like us to take a look.

Thanks,
- Navneet




Ankit Shah

May 31, 2012, 8:49:47 AM
to Google Cloud Storage
Hi again,

Now I am able to upload somewhat larger files, up to about 700 MB. But
when I tried a file above 1 GB it threw an "Out of memory" exception,
which makes sense, since a file that big can't be held in memory all at
once.

So now I have decided to read the bigger files in chunks and write them
out while uploading. My question is: will Google Storage allow me to
upload a file in chunks?
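
Something like this rough sketch is what I have in mind (shown in Python
just to illustrate the idea; my actual code is .NET, and the chunk size
is arbitrary):

    # read a large file in fixed-size chunks instead of loading it whole
    def read_in_chunks(path, chunk_size=8 * 1024 * 1024):  # 8 MB per chunk
        with open(path, "rb") as f:
            while True:
                chunk = f.read(chunk_size)
                if not chunk:  # end of file reached
                    break
                yield chunk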

Kindly help me with this.
Thanks in advance.


Sheng Torres

May 31, 2012, 12:47:34 PM
to gs-dis...@googlegroups.com
Hi,

In your web.config, do you set these limits?

<httpRuntime maxRequestLength="value" executionTimeout="value" />

<security>
    <requestFiltering>
        <requestLimits maxAllowedContentLength="value" />
    </requestFiltering>
</security>
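
(Note: maxRequestLength is given in kilobytes and defaults to 4096, i.e. 4 MB, while maxAllowedContentLength is given in bytes and defaults to 30000000, about 28.6 MB, so both typically need to be raised for uploads of several hundred megabytes.)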



Navneet (Google)

May 31, 2012, 1:16:28 PM
to gs-dis...@googlegroups.com
As I mentioned, we support files up to 5 TB, so uploading a 1 GB file will not break due to constraints on our side. You might want to try either our resumable upload mechanism (https://developers.google.com/storage/docs/developer-guide#resumable), or if you're running out of memory, perhaps a change to your code so that you use a streaming upload (HTTP Chunked Transfer Encoding). In general, we recommend resumable uploads for larger files.
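
To sketch what that resumable flow looks like on the wire, here is a
minimal Python illustration; the bucket, object, file name, and token
below are placeholders invented for the example, not values from this
thread:

    import os
    import requests

    ACCESS_TOKEN = "..."  # placeholder OAuth access token
    url = "https://storage.googleapis.com/example-bucket/big.bin"

    # Step 1: open a resumable session. A zero-length POST with the
    # x-goog-resumable header returns the session URI in Location.
    start = requests.post(url, headers={
        "Authorization": "Bearer " + ACCESS_TOKEN,
        "x-goog-resumable": "start",
    })
    session_uri = start.headers["Location"]

    # Step 2: PUT the file piece by piece, using Content-Range to say
    # where each piece belongs; intermediate pieces are acknowledged
    # with 308 (Resume Incomplete), the final one with 200.
    total = os.path.getsize("big.bin")
    chunk_size = 8 * 1024 * 1024  # a multiple of 256 KB (except the last piece)
    with open("big.bin", "rb") as f:
        offset = 0
        while offset < total:
            chunk = f.read(chunk_size)
            end = offset + len(chunk) - 1
            requests.put(session_uri, data=chunk, headers={
                "Content-Range": "bytes %d-%d/%d" % (offset, end, total),
            })
            offset = end + 1

A production client would also check each response and, after a failure,
ask the server how much it already has (a PUT with "Content-Range:
bytes */TOTAL") before continuing; that status query is what makes the
upload resumable. Tools like gsutil wrap this protocol for you.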

zackp

Jun 2, 2012, 6:11:50 PM
to Google Cloud Storage
Hi,

Is there a way to programmatically determine the current maximum object
size supported by GCS? With AWS S3, something like the following would
work:

s3curl.pl --id=s3 --put=large -- http://a-test-bucket.s3.amazonaws.com/noupload -v

where "large" is a sparse file, say 6 GB in size, on the local HDD,
which can be generated almost instantly. But 5 TB is too big for my
computer's HDD :-(
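
For reference, such a sparse file can be created almost instantly, e.g.
in Python:

    # create a 6 GB sparse file named "large": only one byte is actually
    # written, the rest is a hole, so it takes almost no disk space
    with open("large", "wb") as f:
        f.seek(6 * 1024**3 - 1)  # position at the last byte of 6 GB
        f.write(b"\0")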

The returned error is like the following:
<?xml version="1.0" encoding="UTF-8"?>
<Error>
  <Code>EntityTooLarge</Code>
  <Message>Your proposed upload exceeds the maximum allowed size</Message>
  <ProposedSize>6442450944</ProposedSize>
  <RequestId>9B8BCF31341C1F50</RequestId>
  <HostId>IObGkbu2dvvhfxOGooPflX0pJ69pwMeFrpWD1zMzP7oCHOXBRYtO9DLPOyOoBxif</HostId>
  <MaxSizeAllowed>5368709120</MaxSizeAllowed>
</Error>

Would appreciate a quick hint as to how to do it.

Regards,

Zack


Navneet (Google)

Jun 5, 2012, 1:58:48 PM
to gs-dis...@googlegroups.com
The current maximum object size supported by GCS is 5 TB. However, if you're uploading something that large, you may want to use the resumable upload protocol rather than a straight PUT. What drives the need to determine this programmatically?

zackp

Jun 5, 2012, 2:25:30 PM
to Google Cloud Storage
Hi Navneet,

> The current maximum object size supported by GCS is 5 TB. However, if
> you're uploading something that large, you may want to use the resumable
> upload protocol rather than a straight PUT.

Yes. The 5 TB figure has been posted by you and others from Google quite
a few times, and I am aware of it. The limit does seem to change over
time, though: from 100 GB in the Labs days, to 1 TB, and now to 5 TB.

> What drives the need to determine this programmatically?

Two reasons:

0. Personal curiosity (can this be done?)
1. Smarter clients (ah, GCS now supports this limit, let's take
advantage of it)

Regards,

Zack

Navneet (Google)

Jun 11, 2012, 1:15:15 PM
to gs-dis...@googlegroups.com
Hi Zack,

Unfortunately, I can't think of a good way to do this programmatically.

Thanks,
Navneet
