--
You received this message because you are subscribed to the Google Groups "qi4j-dev" group.
To unsubscribe from this group and stop receiving emails from it, send an email to qi4j-dev+u...@googlegroups.com.
To post to this group, send email to qi4j...@googlegroups.com.
Visit this group at http://groups.google.com/group/qi4j-dev.
For more options, visit https://groups.google.com/d/optout.
Thanks for the reply. The reason I thought it might be due to the existing bug is that in my ManyAssociation, when I don't use a byte[], it works perfectly and all the associated entities are stored correctly. But once I start adding a byte array to an associated entity, it throws the said exception. However, if I associate only one entity with a byte[], it works fine; it is at the moment I associate a second entity with a byte array that it throws the decoding exception from Base64Encoder. But as you wrote, maybe Paul can help figure out the cause of the problem. And thanks a lot for finding some time for the issue I am facing.
Thanking you,
Jaydatt
I dug into this issue in some more detail and found that it is not due to ManyAssociation: whenever we create an entity with binary data and then try to read that entity back, this deserialization exception occurs.
Sample code:
@Test
public void failStoringBinaryData() throws Exception
{
    UnitOfWork uow = module.newUnitOfWork();
    TestEntity2 testEntity2 = uow.newEntity( TestEntity2.class );
    String entityId = testEntity2.identity().get();
    testEntity2.binaryProperty().set( "test".getBytes() ); // we are setting binary data here
    uow.complete();

    uow = module.newUnitOfWork();
    TestEntity2 testEntity2Other = uow.get( TestEntity2.class, entityId );
    testEntity2Other.binaryProperty().get(); // This is where the exception is generated.
    uow.complete();
}

public interface TestEntity2
    extends EntityComposite
{
    @Optional
    Property<String> property();

    @Optional
    Property<byte[]> binaryProperty();
}
I have tried List<Byte> instead of byte[], as you mentioned in one of your previous emails, and it seems to work. So should I use List<Byte> instead of byte[]?
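If List<Byte> works around the bug for now, a small pair of conversion helpers keeps byte[] at the edges of the code. This is just a sketch; the class and method names are mine, not Qi4j API:

```java
import java.util.ArrayList;
import java.util.List;

public class ByteListConversion
{
    // Box a byte[] into a List<Byte> before storing it in a Property<List<Byte>>.
    public static List<Byte> toByteList( byte[] data )
    {
        List<Byte> list = new ArrayList<>( data.length );
        for( byte b : data )
        {
            list.add( b );
        }
        return list;
    }

    // Unbox back to byte[] after reading the property.
    public static byte[] toByteArray( List<Byte> list )
    {
        byte[] data = new byte[ list.size() ];
        for( int i = 0; i < list.size(); i++ )
        {
            data[ i ] = list.get( i );
        }
        return data;
    }
}
```

Each byte becomes a separate boxed object and list entry, so this is only reasonable for small payloads.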
Thank you,
Jaydatt
I am sorry, but I didn't understand what you mean by "32 bytes overhead"; can you please explain?
Also, what do you think would be a proper solution in this case, as I want to store the binary data?
Niclas, thanks a lot again for your time for answering me.
Thank you,
Jaydatt
A List<Byte> property will be stored as a JSON array, one value per entry in the List. Hence the "32 bytes overhead per byte". So that's not a good idea for storing binary data that exceeds a few bytes.
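To make the overhead concrete, here is a rough illustration (not Qi4j's actual serializer) comparing a numeric JSON array against Base64 for the same bytes:

```java
import java.util.Base64;

public class OverheadDemo
{
    // A List<Byte> serialized as a JSON array spends several characters
    // (digits, sign, comma) per byte of payload.
    public static String asJsonArray( byte[] data )
    {
        StringBuilder sb = new StringBuilder( "[" );
        for( int i = 0; i < data.length; i++ )
        {
            if( i > 0 )
            {
                sb.append( ',' );
            }
            sb.append( data[ i ] );
        }
        return sb.append( ']' ).toString();
    }

    // Base64 spends roughly 4 characters per 3 bytes of payload.
    public static String asBase64( byte[] data )
    {
        return Base64.getEncoder().encodeToString( data );
    }
}
```

For 1 KiB of random bytes, the JSON-array form comes out several times longer than the Base64 form, before any per-object overhead inside the store is even counted.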
Thanks a lot for your time and for the suggested solution of storing the data as a Base64 String. It also makes sense to store bigger files on the file system and keep references to them in an entity in the entity store, while small binary chunks can be stored in entities as Base64-encoded strings. The one downside I can see in storing on the file system is increased manageability and maintenance effort: we need to take care when moving the data to, for example, another machine, and we need a strict rule that the files must not be deleted by a garbage collector or similar cleanup process. But it is also true that we should not store bigger files in the database.
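For the small-chunk case, producing the Base64 String to keep in a Property<String> is straightforward with the JDK's java.util.Base64. A minimal sketch (the helper class and method names are mine):

```java
import java.util.Base64;

public class BinaryAsBase64
{
    // Encode binary data to a Base64 String suitable for a Property<String>.
    public static String encode( byte[] data )
    {
        return Base64.getEncoder().encodeToString( data );
    }

    // Decode the stored String back to the original bytes.
    public static byte[] decode( String stored )
    {
        return Base64.getDecoder().decode( stored );
    }
}
```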
Thanks again for your time,
Jaydatt
Thanks for the suggestions here about storing the big binaries; it really makes sense to store large binaries in a dedicated SQL table or a MongoDB collection. We are also thinking of creating a kind of Java plug-in to deal with these big binaries, so that wherever we need to store binary data we can simply use this plug-in. With the plug-in, as you suggested, we can use the file system, a SQL table, or any other option to store the big binaries.
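Such a plug-in could be as small as one interface, with the backend chosen per deployment. A sketch under those assumptions (all names hypothetical; a filesystem backend is shown, and a SQL table or MongoDB collection would implement the same interface):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.UUID;

// Entities keep only the returned reference string; the store keeps the bytes.
interface BinaryStore
{
    String store( byte[] data ) throws IOException;

    byte[] retrieve( String reference ) throws IOException;
}

class FileSystemBinaryStore implements BinaryStore
{
    private final Path baseDir;

    FileSystemBinaryStore( Path baseDir ) throws IOException
    {
        this.baseDir = Files.createDirectories( baseDir );
    }

    @Override
    public String store( byte[] data ) throws IOException
    {
        // One file per binary, named by a generated reference.
        String reference = UUID.randomUUID().toString();
        Files.write( baseDir.resolve( reference ), data );
        return reference;
    }

    @Override
    public byte[] retrieve( String reference ) throws IOException
    {
        return Files.readAllBytes( baseDir.resolve( reference ) );
    }
}
```

The entity then only stores the reference String as a normal property, sidestepping the byte[] serialization issue entirely.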
Thank you,
Jaydatt