Hi All.
I am running a benchmark of the memory consumption of sorted sets in Redis.
My goal is to optimize memory usage for 1,000,000 sorted sets, each holding 300 entries.
Sorted set layout:
all:speed:year:1380210701349   (key)
50:1380210701349               (entry)
80:1380210701349               (entry)
I calculated that each sorted set should take:
28 bytes (key) + (4 bytes * 300) scores + (13 bytes * 300) members = 5128 bytes
1,000,000 * 5128 bytes ≈ 4.7 GB
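For clarity, here is the back-of-envelope arithmetic as a small Python sketch. The byte sizes are my own estimates of the raw payload (they ignore any per-object overhead Redis adds):

```python
# Rough payload-size estimate for the benchmark described above.
# These sizes are assumptions, not measured Redis internals.
KEY_BYTES = 28       # len("all:speed:year:1380210701349")
SCORE_BYTES = 4      # assumed bytes per score
MEMBER_BYTES = 13    # assumed bytes per member string
ENTRIES = 300        # entries per sorted set
SETS = 1_000_000     # number of sorted sets

per_set = KEY_BYTES + ENTRIES * (SCORE_BYTES + MEMBER_BYTES)
total_gb = SETS * per_set / 1024**3

print(per_set)              # 5128 bytes per set
print(round(total_gb, 2))   # roughly 4.7-4.8 GB expected
```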
But after inserting the 1,000,000 sorted sets, I saw memory usage of 40 GB!
Questions:
1) Why is the memory overhead so high? It is close to 10x.
2) I changed these parameters in the conf file:
zset-max-ziplist-entries 310
zset-max-ziplist-value 6000
but the memory consumption didn't change.
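One check I could run (assuming CONFIG GET/SET accept these parameters at runtime, which I believe they do in 2.6) is to confirm the server actually loaded the values from the conf file, rather than still running with the defaults:

```
> CONFIG GET zset-max-ziplist-entries
> CONFIG GET zset-max-ziplist-value
> CONFIG SET zset-max-ziplist-entries 310
> CONFIG SET zset-max-ziplist-value 6000
```

As I understand it, a CONFIG SET would only affect sorted sets created or modified afterward, so the data would need to be reinserted to see any effect.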
When I inspect the object:
> DEBUG OBJECT ALL:speed:year:1380210701349
I get this result:
Value at:0x7f3c2ef22d10 refcount:1 encoding:skiplist serializedlength:5292 lru:1706190 lru_seconds_idle:50
Does this mean that my sorted sets are still not ziplist-encoded after the change? If so, why do I still not get ziplist-encoded sorted sets even after changing the conf parameters and restarting Redis? What should I do to resolve or debug this behavior?
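If it helps, I believe the encoding can also be queried directly with OBJECT ENCODING (which should be available in 2.6); given the DEBUG OBJECT output above, it would presumably report:

```
> OBJECT ENCODING all:speed:year:1380210701349
"skiplist"
```

whereas I would expect "ziplist" if the thresholds were being applied.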
I am using Redis 2.6.9 on CentOS, running in a VM.
Thanks
Oleg