--
You received this message because you are subscribed to the Google Groups "Redis DB" group.
To unsubscribe from this group and stop receiving emails from it, send an email to redis-db+unsubscribe@googlegroups.com.
To post to this group, send email to redi...@googlegroups.com.
Visit this group at https://groups.google.com/group/redis-db.
For more options, visit https://groups.google.com/d/optout.
Hello Tuco,

HSCAN is the answer, although it already appears as part of your inquiry. HKEYS, as you've noted, can become an expensive operation once a Hash exceeds a certain size. The *SCAN family of commands is designed to circumvent this by employing a stateless, client-side cursor to iterate over the elements.

Itamar
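Each HSCAN call returns a cursor plus one page of field/value pairs, so the client can walk a multi-million-field Hash without any single blocking command. A minimal Python sketch of the batched-iteration idea from the thread (redis-py's hscan_iter wraps the cursor loop; the helper function, key name, and handler below are illustrative assumptions, not from the original messages):

```python
# Sketch only: group (field, value) pairs into fixed-size batches, so a large
# Hash can be consumed incrementally via HSCAN instead of HKEYS + HGET.
def chunked(pairs, batch_size=1000):
    """Yield lists of at most batch_size items from any iterable."""
    batch = []
    for item in pairs:
        batch.append(item)
        if len(batch) >= batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

# With redis-py (assumptions: the redis package is installed, a server is
# reachable, and "myhash" / handle() are hypothetical names):
#
#   import redis
#   r = redis.Redis()
#   for batch in chunked(r.hscan_iter("myhash", count=1000)):
#       handle(batch)  # process up to 1000 (field, value) pairs at a time
```

Note that COUNT is only a hint to the server about page size; the guarantee that matters here is that each round trip is small, so no call blocks the server the way HKEYS on a huge Hash does.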
On Mon, Jan 23, 2017 at 2:46 PM, Tuco <rahul....@gmail.com> wrote:
Hi,

We have stored our data in Redis Hash data structures, and the Hashes roughly represent what is stored in the database. Some Hashes have a couple of million fields. When we have to get all the data for a particular Hash, instead of doing HGETALL (since HGETALL returns a few GBs of data), we do HKEYS and then iterate over the fields, calling HGET in batches of 1000.

The problem is that HKEYS itself takes a couple of seconds, which blocks the Redis server, and we are keen to avoid that. Is there a way to iterate over the fields, like HSCAN but for the fields of a single key, or any other solution to this?

Thanks