Hi,
I'm using the Java SDK 1.3.2.
My app includes an import stage where a large data file (7-10 MB) is
uploaded using the Blobstore service. Processing of this file is
broken up into a series of Tasks, which are executed sequentially
(each task queues the next upon completion).
The data file is not organized for random access, so each Task has to
read through the entire file, no matter how small I make each
individual task. This seems to work, but the dashboard says I'm
maxing out the "Blobstore Bytes Read" quota at 100%, with 86,400 of
86,400 used. I can't find this quota documented anywhere.
Again, the tasks are running fine, but the big red bar makes me a bit
nervous.
Are there limits on reading bytes from blobs? Is it really bytes, or
is it the number of reads?
I've written an InputStream wrapper for BlobstoreService that buffers
in 512 KB chunks. Is this value obviously too high or too low? Any
advice on an optimal buffer size?
Many thanks,
Alex