Partitioning Large Sorted Sets in Redis Cluster

hh90

Dec 21, 2017, 5:16:54 PM
to Redis DB
The Redis documentation states that "partitioning granularity is the key, so it is not possible to shard a dataset with a single huge key like a very big sorted set."

Given this, what should someone do if they wish to distribute a large sorted set across the nodes of a Redis cluster?

One possible solution we have thought of is to apply hash tags to the sorted set. For example, the key "my_set" would be split into "{x}my_set" and "{y}my_set", where the tags hash to different nodes, and the set's members would be distributed between the two keys. However, it's unclear how to choose hash tags that map evenly onto different nodes in the cluster.
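
Roughly what we have in mind (an untested sketch in Python with redis-py; the tag list and helper names are placeholders, not a worked-out design):

import binascii

import redis

# Placeholder tags; ideally chosen so that "{tag}my_set" keys land on
# different cluster nodes.
PARTITION_TAGS = ["x", "y"]

def partition_key(base_key, member):
    # Route a member to one of the partitioned sorted sets by hashing
    # the member itself (CRC32 here, purely to spread members around).
    idx = binascii.crc32(member.encode()) % len(PARTITION_TAGS)
    return "{%s}%s" % (PARTITION_TAGS[idx], base_key)

r = redis.Redis()  # or a cluster-aware client

def zadd_partitioned(base_key, member, score):
    r.zadd(partition_key(base_key, member), {member: score})

def zscore_partitioned(base_key, member):
    return r.zscore(partition_key(base_key, member))

Point operations like ZADD/ZSCORE route to a single partition, but range queries such as ZRANGEBYSCORE would have to be run against every partition and merged client-side.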

Dvir Volk

Dec 25, 2017, 8:51:07 AM
to redi...@googlegroups.com
Hi. 

I've done something similar for the distributed version of RediSearch.

Disregarding the other details, here's the trick: if you know roughly how many servers you have, you can pick slot numbers spread evenly across the cluster's 16384 hash slots (0..16383) to make the "partitions" more likely to be evenly spread.
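
For example (a small Python sketch; the partition count is whatever you choose):

NUM_SLOTS = 16384  # Redis Cluster hash slots, numbered 0..16383

def spread_slots(num_partitions):
    # Evenly spaced target slots, e.g. 4 partitions -> 0, 4096, 8192, 12288.
    return [i * NUM_SLOTS // num_partitions for i in range(num_partitions)]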

I've generated a table of the shortest alphanumeric strings that, when put in the curly braces, hash to each possible slot; you can work with that (a sketch of how to build such a table is below). It won't allow you to resize, so you might want to have more "partitions" than nodes so you can grow.
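
Roughly how such a table can be generated (an untested Python sketch; it reimplements the CRC16 variant Redis Cluster uses for key hashing):

import itertools
import string

def crc16(data):
    # CRC16-CCITT (XMODEM variant), the checksum Redis Cluster uses to
    # map keys to hash slots: polynomial 0x1021, initial value 0.
    crc = 0
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if crc & 0x8000 else (crc << 1)
            crc &= 0xFFFF
    return crc

def key_slot(tag):
    return crc16(tag.encode()) % 16384

def shortest_tags():
    # For every slot, remember the shortest lowercase-alphanumeric string
    # that hashes to it; the search stops once every slot has a tag
    # (short strings suffice in practice).
    table = {}
    alphabet = string.ascii_lowercase + string.digits
    for length in itertools.count(1):
        for chars in itertools.product(alphabet, repeat=length):
            table.setdefault(key_slot("".join(chars)), "".join(chars))
        if len(table) == 16384:
            return table

Combined with the evenly spread slots above, the tags for N partitions are just table[slot] for each target slot.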

