Performance of big data sizes


Winspear

Oct 10, 2013, 5:24:06 PM10/10/13
to jour...@googlegroups.com
Appreciate the guys behind journal.io :-)

Our use case involves journaling records up to ~30 MB in size. Has anybody used journal.io with record sizes like these, or does anybody know how it behaves?

Thanks


Sergio Bossa

Oct 11, 2013, 4:56:38 AM10/11/13
to jour...@googlegroups.com
Thanks for your kind words :)

The main problem with such a big record size is that record data is kept
in memory before being batched to disk, and again in the case of
speculative reads.
Speculative reads shouldn't actually be a huge problem, unless you
keep a lot of Location objects in memory.
So in the end, it's all about testing with your write volume and batch
size, and seeing if it works for you.
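To make the memory concern concrete, here is a back-of-envelope sketch: since record bytes stay on the heap until the batch is flushed to disk, the in-flight working set is roughly the record size times the number of records per batch (plus any retained Location objects). The record size comes from this thread; the batch count is a purely illustrative assumption, not a journal.io default.

```java
// Rough heap estimate for batched journal writes with large records.
// RECORDS_PER_BATCH is a hypothetical number chosen for illustration.
public class BatchMemoryEstimate {

    static final long RECORD_SIZE_BYTES = 30L * 1024 * 1024; // ~30 MB, from the thread
    static final int RECORDS_PER_BATCH = 8;                  // assumed, tune to your workload

    // In-flight heap is approximately recordSize * recordsPerBatch,
    // ignoring per-record overhead such as Location objects.
    static long estimateBatchHeapBytes(long recordSize, int recordsPerBatch) {
        return recordSize * recordsPerBatch;
    }

    public static void main(String[] args) {
        long heap = estimateBatchHeapBytes(RECORD_SIZE_BYTES, RECORDS_PER_BATCH);
        System.out.println("Estimated in-flight heap: " + (heap / (1024 * 1024)) + " MB");
    }
}
```

With these assumed numbers, eight in-flight 30 MB records already pin around 240 MB of heap, which is why testing against your real write volume and batch size matters.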

Let us know how it goes ;)

Cheers,

Sergio B.



--
Sergio Bossa
http://www.linkedin.com/in/sergiob