Hi,
Is there a way to programmatically determine the current maximum object
size supported by GCS? With AWS S3, something like the following
would work:
s3curl.pl --id=s3 --put=large -- http://a-test-bucket.s3.amazonaws.com/noupload -v
where large is a sparse file of, say, 6 GB on the local HDD, which can be
generated almost instantly. But 5 TB is too big for my computer's HDD :-(
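For concreteness, the sparse file itself takes only a fraction of a second to
create, e.g. with GNU coreutils:

truncate -s 6G large    # nominal 6 GB; allocates essentially no real disk blocks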
The error S3 returns looks like the following:
<?xml version="1.0" encoding="UTF-8"?>
<Error>
<Code>EntityTooLarge</Code>
<Message>Your proposed upload exceeds the maximum allowed size</Message>
<ProposedSize>6442450944</ProposedSize>
<RequestId>9B8BCF31341C1F50</RequestId>
<HostId>IObGkbu2dvvhfxOGooPflX0pJ69pwMeFrpWD1zMzP7oCHOXBRYtO9DLPOyOoBxif</HostId>
<MaxSizeAllowed>5368709120</MaxSizeAllowed>
</Error>
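The MaxSizeAllowed field in that response is what makes the check
programmatic; assuming the error body was saved to error.xml and xmllint is
available, something like this pulls it out:

xmllint --xpath 'string(//MaxSizeAllowed)' error.xml    # prints 5368709120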
Would appreciate a quick hint as to how to do it.
Regards,
Zack
> As I mentioned, we support files up to 5 TB, so uploading a 1 GB file will
> not break due to constraints on our side. You might want to try either our
> resumable upload mechanism (https://developers.google.com/storage/docs/developer-guide#resumable), or
> if you're running out of memory, perhaps a change to your code so that you
> use a streaming upload (HTTP Chunked Transfer Encoding). In general, we
> recommend resumable uploads for larger files.
>
[...]
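A rough sketch of the streaming variant suggested above, against the GCS XML
API endpoint; the bucket, object name, OAuth token, and data producer are all
placeholders, not a tested recipe. Reading the body from stdin leaves the size
unknown, so curl sends it with chunked transfer encoding:

# Placeholders: my_data_producer, a-test-bucket, streamed-object, $OAUTH_TOKEN.
my_data_producer | curl -v -T - \
  -H "Authorization: Bearer $OAUTH_TOKEN" \
  -H "Transfer-Encoding: chunked" \
  "https://storage.googleapis.com/a-test-bucket/streamed-object"

For larger files, the resumable upload mechanism linked above is the
recommended route.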