We are preparing to store about 2,000 million entries in Voldemort
(growing by about 2 million entries daily). Keys are 6 bytes, values
16 ~ 96 bytes. In a test with 3 nodes and 2 replicas, loading 1 million
entries produced about 150 MB on each node, so the whole dataset would
be roughly 300 GB per node.
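The 300 GB per-node figure follows from scaling the test result linearly; a quick Python check (using the numbers from the test above, and assuming growth stays linear):

```python
# Scale the measured test footprint up to the full dataset.
entries_total = 2000e6          # ~2000 million entries overall
mb_per_million_per_node = 150   # measured: 1M entries -> ~150 MB per node (3 nodes, 2 replicas)

per_node_gb = (entries_total / 1e6) * mb_per_million_per_node / 1024
print(round(per_node_gb))       # ~293, i.e. roughly 300 GB per node
```

At 2 million new entries a day, that also implies on the order of 300 MB of additional data per node per day.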
Does anyone have performance data for BDB JE holding that much
data? ( 2000*2/3 million entries in one BDB JE environment. )
Should I shard the data across several stores (bdb.one.env.per.store=true)?
How many entries per store is efficient?
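For reference, this is the kind of server.properties fragment I mean — a sketch, not a tested configuration; `bdb.cache.size` is just an example of a setting that would interact with having one JE environment per store:

```
# One BDB JE environment per store, so each store gets its own
# log files and cleaner and can be sized independently.
bdb.one.env.per.store=true

# (example value) JE cache budget -- with one env per store this
# budget matters per environment, so it would need retuning.
bdb.cache.size=1G
```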
Any suggestion is appreciated. Thank you.