Using MongoDB to store immutable data?


Johan Haleby

Jan 17, 2016, 5:29:32 PM
to mongodb-user
Hi, 

This is essentially a complementary question to my Stack Overflow question. We're investigating options for storing and reading a large amount of immutable data (events), and I'd like some feedback on whether MongoDB would be a good fit.

Requirements:
  1. We'll need to store about 10 events per second (but the rate will increase). Each event is small, about 1 KB.
  2. A really important requirement is that we need to be able to replay all events in the order they were inserted (see the sketch below for roughly what I have in mind).
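
To make that concrete, this is roughly the write/replay pattern I have in mind, sketched with pymongo. The database/collection names and the event shape are just placeholders, and whether a $natural scan really equals insertion order is part of what I'm asking below:

# Rough sketch of the intended write/replay pattern, assuming pymongo.
# Database/collection names and the event document shape are placeholders.
from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")
events = client.eventstore.events

# Write path: roughly 10 small (~1 KB) immutable event documents per second.
events.insert_one({"type": "OrderPlaced", "payload": {"orderId": 42}})

# Replay path: walk the collection in storage ($natural) order; no in-memory
# sort is involved, but I'm unsure whether this is guaranteed to match
# insertion order for a regular (non-capped) collection.
for event in events.find().sort("$natural", ASCENDING):
    print(event["type"])  # stand-in for applying the event to rebuild state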
Questions:
  1. Is MongoDB a good fit for this, i.e. for storing an ever-growing set of immutable data?
  2. I've read here that MongoDB has a limit of 32 MB when sorting documents using cursors. For us it would be fine to read all data in insertion order (like a table scan), so an explicit sort might not be necessary?
  3. It looks like capped collections would fit this very well if it weren't for the fact that they are capped :) Would capped collections still be the way to go in MongoDB if we make sure never to hit the size limit (and create a new capped collection before doing so)? See the sketch below for what I mean.
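
For question 3, this is roughly what I mean by a capped collection plus an insertion-order read, again sketched with pymongo (the 1 GB size and all names are arbitrary placeholders):

# Rough sketch of the capped-collection idea, assuming pymongo.
# The 1 GB size and the collection/database names are arbitrary placeholders.
import time

from pymongo import CursorType, MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client.eventstore

# Capped collections preserve insertion order on reads, but they silently
# overwrite the oldest documents once the size limit is reached; hence the
# idea of rolling over to a fresh capped collection before hitting the cap.
if "events_capped" not in db.list_collection_names():
    db.create_collection("events_capped", capped=True, size=1024 ** 3)

capped = db.events_capped
capped.insert_one({"type": "OrderPlaced", "payload": {"orderId": 42}})

# Tailable cursors are only available on capped collections: they return
# documents in insertion order and can keep waiting for new ones to arrive.
cursor = capped.find(cursor_type=CursorType.TAILABLE_AWAIT)
while cursor.alive:
    try:
        event = cursor.next()
        print(event["type"])  # stand-in for applying the event
    except StopIteration:
        time.sleep(1)  # caught up; wait for more events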
A big benefit of using MongoDB is that we might also be able to query the data (thanks to Mongo's schema-less nature), which would be very attractive. But this is not a requirement, and if it negatively impacts performance or the design decisions for the two requirements above, we're happy to use another database (perhaps a separate MongoDB instance/cluster) for this.

Regards,
/Johan

