Hi,
I am looking for the most efficient way to store multiple large 2D arrays that must support a high rate of per-element updates (increments) from multiple processes and servers.
The arrays would be 100k x 100k or larger. A new instance of the array would be created every 15 minutes, as we need to record this information per time interval.
Up to 10 of these arrays could be receiving updates at any one time.
1. Keep each 2D array in application/server memory and write it out to the DB every X period (15 minutes or so). Each server would do the same, and the data would be merged at query time (map/reduce, aggregation framework, or similar).
Pros: Simple DB writes; no 2D array updates in the DB.
Cons: If a server crashes, its in-memory data is lost; more application memory is used.
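For what it's worth, option 1 can be sketched roughly as below. This is a minimal illustration, not a production design: the schema (`interval`/`row` keys, a `c` map of column counters) and the function names are my own assumptions. The flush builds plain dicts here so the snippet stands alone; with pymongo you would wrap each spec in `UpdateOne(...)` and send them via `collection.bulk_write(...)`.

```python
from collections import defaultdict

# In-memory accumulator for one 15-minute interval.
# Keys are (row, col) tuples; values are pending increment totals.
counts = defaultdict(int)

def record(row, col, delta=1):
    """Accumulate an increment in application memory (cheap, lock-free per process)."""
    counts[(row, col)] += delta

def build_flush_ops(interval_id, counts):
    """Turn pending counters into per-row upsert specs for the periodic flush.

    Plain dicts stand in for pymongo UpdateOne objects; the $inc paths use
    dot notation into a per-row map of column counters ("c").
    """
    per_row = defaultdict(dict)
    for (row, col), delta in counts.items():
        per_row[row][f"c.{col}"] = delta
    return [
        {"filter": {"interval": interval_id, "row": row},
         "update": {"$inc": incs},
         "upsert": True}
        for row, incs in sorted(per_row.items())
    ]

record(0, 5)
record(0, 5)
record(2, 7, 3)
ops = build_flush_ops("2024-01-01T00:00", counts)
```

Because the flush uses `$inc` upserts rather than overwrites, several servers can flush their partial counts into the same interval's documents and the totals merge naturally.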
2. Store the data as a grid in the DB: 100k documents per 2D array, each holding an array of 100k elements.
Pros: Easy to implement and update values; data is non-volatile.
Cons: Huge number of documents accumulating over time (100k new rows every 15 minutes). Potential write/update lock contention?
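In case it helps others comment, here is the shape of the per-element update I have in mind for option 2, again with an assumed schema (`interval`, `row`, and a 100k-element array `v`). MongoDB's dot notation addresses array positions, so each increment is a single `$inc`:

```python
def make_increment(interval_id, row, col, delta=1):
    """Filter and update spec for incrementing one cell of a row document.

    Assumed schema: {"interval": ..., "row": ..., "v": [0, 0, ...]}
    where v is the row's array of counters; "v.<col>" is dot notation
    into that array.
    """
    filter_doc = {"interval": interval_id, "row": row}
    update_doc = {"$inc": {f"v.{col}": delta}}
    return filter_doc, update_doc

flt, upd = make_increment("t0", 3, 42)
```

With pymongo this pair would go straight into `collection.update_one(flt, upd)`; every process and server can issue these independently, which is where the lock-contention question above comes from.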
3. Each array is stored as a document as follows:
{
  "2DArray": [
    {
      "0": [0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,...],
      "1": [0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,...],
      "2": [0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,...],
      ...
    }
  ]
}
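For option 3, updates would again be single `$inc` operations, with the dot path going through the outer array (index 0) and then the row key, as sketched below. One caveat I'm aware of: MongoDB caps a single document at 16 MB, so a full 100k x 100k array almost certainly cannot fit in one document; this layout would only work for much smaller or sparse arrays.

```python
def cell_inc(row, col, delta=1):
    """$inc spec for one cell in the single-document layout above.

    "2DArray" is an array holding one map of rows, so the dot path is
    2DArray.<array index>.<row key>.<col index>, i.e. 2DArray.0.<row>.<col>.
    """
    return {"$inc": {f"2DArray.0.{row}.{col}": delta}}

spec = cell_inc(2, 15)
```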