unzipping images into blobstore


atarno

Dec 7, 2011, 7:31:39 AM12/7/11
to Google App Engine
In my app I need to do the following: 1. a zip file with images (JPGs
only right now) and other stuff is uploaded into the Blobstore. 2. an
App Engine backend should read the entries from the uploaded zip and
save every image found inside to the Blobstore as a stand-alone file.

I successfully upload, unzip and save the files in the Blobstore, but
the images seem to be broken. When I download them from the Blobstore
(simply blobstoreService.serve them), the images have wrong colors,
are displayed partially, or are broken in other ways. An attempt to
use ImagesService also throws an exception. I checked the size of the
images before they are zipped against the size of the files unzipped
while written into the Blobstore, and they look the same. Here is my
code:

ZipInputStream zis = ...;
ZipEntry entry;
while ((entry = zis.getNextEntry()) != null)
{
    String fileName = entry.getName().toLowerCase();
    if (fileName.indexOf(".jpg") != -1 || fileName.indexOf(".jpeg") != -1)
    {
        FileService fileService = FileServiceFactory.getFileService();
        String mime = ctx.getMimeType(fileName); // getting mime from servlet context
        AppEngineFile file = fileService.createNewBlobFile(mime, fileName);
        boolean lock = true;
        FileWriteChannel writeChannel = fileService.openWriteChannel(file, lock);
        byte[] buffer = new byte[BlobstoreService.MAX_BLOB_FETCH_SIZE];
        while (zis.read(buffer) >= 0)
        {
            ByteBuffer bb = ByteBuffer.wrap(buffer);
            writeChannel.write(bb);
        }
        writeChannel.closeFinally();
        BlobKey coverKey = fileService.getBlobKey(file);
        ....
    }
}

thanks a lot for your time!

André Pankraz

Dec 7, 2011, 7:59:15 AM12/7/11
to google-a...@googlegroups.com
What if you read fewer than BlobstoreService.MAX_BLOB_FETCH_SIZE bytes in the last iteration?

Does ByteBuffer bb = ByteBuffer.wrap(buffer); writeChannel.write(bb); really handle this correctly?

On the other hand... the JPG format shouldn't care about some extra bytes? I have forgotten...
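A quick in-memory check (plain JDK, nothing App Engine-specific; the buffer size and byte values here are made up for illustration) shows that ByteBuffer.wrap(buffer) exposes the whole array no matter how many bytes the last read() returned, while the three-argument overload limits the buffer to the bytes actually read:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.ByteBuffer;

public class WrapDemo {
    public static void main(String[] args) throws Exception {
        // A 5-byte "file" read through an 8-byte buffer, as in the original loop.
        InputStream in = new ByteArrayInputStream(new byte[] {1, 2, 3, 4, 5});
        byte[] buffer = new byte[8];
        int read = in.read(buffer); // read == 5; buffer[5..7] are never filled

        // Wrapping the whole array offers all 8 bytes to the writer...
        ByteBuffer whole = ByteBuffer.wrap(buffer);
        // ...while wrapping with the read count offers only the 5 valid bytes.
        ByteBuffer valid = ByteBuffer.wrap(buffer, 0, read);

        System.out.println(whole.remaining()); // 8
        System.out.println(valid.remaining()); // 5
    }
}
```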

atarno

Dec 7, 2011, 8:20:24 AM12/7/11
to Google App Engine
I tried with smaller chunks too (1024 bytes), but this didn't help.
Meanwhile I wrote a work-around that seems to work, but I still
don't understand why the solution above does not.

int read;
ByteArrayOutputStream baos = new ByteArrayOutputStream();

while ((read = zis.read()) >= 0)
{
    baos.write(read);
    if (baos.size() == BlobstoreService.MAX_BLOB_FETCH_SIZE)
    {
        ByteBuffer bb = ByteBuffer.wrap(baos.toByteArray());
        writeChannel.write(bb);
        baos = new ByteArrayOutputStream();
    }
}
if (baos.size() > 0)
{
    ByteBuffer bb = ByteBuffer.wrap(baos.toByteArray());
    writeChannel.write(bb);
}

André Pankraz

Dec 7, 2011, 8:43:31 AM12/7/11
to google-a...@googlegroups.com
Like I wrote... in the last iteration you read fewer than MAX_BLOB_FETCH_SIZE bytes, but you write the whole buffer to the channel.
The trailing bytes are not zeroed (they hold stale data from the previous iteration), and there is no bytes-read end marker or anything like that.

So your resulting blobs are larger than the unzipped content, and always of length MAX_BLOB_FETCH_SIZE * iterations.

The first example should work with ByteBuffer bb = ByteBuffer.wrap(buffer, 0, read); where read captures the return value of zis.read(buffer).

JPG itself has some kind of EOF marker... but who knows what happens with its many subformats.
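For reference, here is the corrected loop as a runnable sketch. It uses a plain java.nio channel backed by a ByteArrayOutputStream, since FileWriteChannel can't run outside App Engine, and a buffer size of 4096 stands in for MAX_BLOB_FETCH_SIZE; the data is synthetic:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.channels.Channels;
import java.nio.channels.WritableByteChannel;

public class CopyLoop {
    // Copies a stream into a channel, wrapping only the bytes actually read
    // in each iteration -- the fix suggested for the original loop.
    static void copy(InputStream in, WritableByteChannel out, int bufSize) throws Exception {
        byte[] buffer = new byte[bufSize];
        int read;
        while ((read = in.read(buffer)) >= 0) {
            ByteBuffer bb = ByteBuffer.wrap(buffer, 0, read);
            while (bb.hasRemaining()) {
                out.write(bb); // a channel may accept fewer bytes than offered
            }
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] data = new byte[10000]; // deliberately not a multiple of 4096
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;

        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        copy(new ByteArrayInputStream(data), Channels.newChannel(sink), 4096);

        System.out.println(sink.size()); // 10000, not rounded up to 12288
    }
}
```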