to mongodb-user
Hi,
This is essentially a complementary question to my Stack Overflow question. We're investigating options to store and read a lot of immutable data (events), and I'd like some feedback on whether MongoDB would be a good fit.
Requirements:
We'll need to store about 10 events per second (but the rate will increase). Each event is small, about 1 KB.
A really important requirement is that we need to be able to replay all events in the order they were inserted.
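To make the replay requirement concrete, here's a plain-Python sketch (no MongoDB involved; the `seq` field, `make_event` helper, and in-memory `store` list are all made up for illustration) of what we mean: every event carries a monotonically increasing sequence number, and "replay" just means iterating in ascending order of that number:

```python
import itertools
import time

_seq = itertools.count()  # hypothetical process-local sequence counter

def make_event(payload):
    # Each event carries its own monotonically increasing sequence
    # number, so replay order is explicit rather than relying on
    # storage internals.
    return {"seq": next(_seq), "ts": time.time(), "payload": payload}

store = []  # stand-in for the events collection

for p in ("created", "renamed", "archived"):
    store.append(make_event(p))

def replay(events):
    # Replay = iterate in ascending sequence order.
    return [e["payload"] for e in sorted(events, key=lambda e: e["seq"])]
```

Whether we'd actually need an explicit field like this, or could rely on insertion order on disk, is exactly what we're unsure about.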
Questions:
Is MongoDB a good fit for this, i.e. storing an ever-growing set of immutable data?
I've read here that MongoDB has a limit of 32 MB when sorting documents using cursors. For us it would be fine to read all data in insertion order (like a table scan), so an explicit sort might not be necessary?
It looks like capped collections would fit this very well if it weren't for the fact that they are capped :) Would capped collections still be the way to go in MongoDB if we make sure never to hit the cap (and create a new capped collection before we do)?
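The roll-over idea we have in mind looks roughly like this (a pure-Python sketch, not real MongoDB code; `RollingStore`, `ROLL_AT`, and the 4 GB figure are all assumptions we made up): keep appending to the current capped collection, and switch to a fresh one well before the size limit, so nothing is ever overwritten:

```python
CAP_BYTES = 4 * 1024 ** 3          # hypothetical size of each capped collection
DEFAULT_ROLL_AT = int(CAP_BYTES * 0.9)  # roll over well before the hard cap

class RollingStore:
    """Sketch of the roll-over scheme: a sequence of capped
    'collections', each abandoned before it can wrap and overwrite
    old events."""

    def __init__(self, roll_at=DEFAULT_ROLL_AT):
        self.collections = [[]]    # stand-ins for capped collections
        self.bytes_in_current = 0
        self.roll_at = roll_at

    def append(self, event_bytes):
        if self.bytes_in_current + len(event_bytes) > self.roll_at:
            # "Create a new capped collection" before hitting the cap.
            self.collections.append([])
            self.bytes_in_current = 0
        self.collections[-1].append(event_bytes)
        self.bytes_in_current += len(event_bytes)

    def replay(self):
        # Full replay walks the collections oldest-first, each in
        # insertion order (which a capped collection preserves).
        for coll in self.collections:
            yield from coll
```

The question is whether this is a sane pattern in practice, or whether people solve this differently (e.g. with a normal collection and an indexed sequence field).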
A big benefit of using MongoDB is that we might also be able to query the data (thanks to Mongo's schema-less nature), which would be very attractive. But this is not a requirement, and if it negatively impacts performance or the design decisions for the two requirements above, we're happy to use another database (perhaps a separate MongoDB instance/cluster) for querying.