Bulk Insert with hiredis


Mahm131

Aug 11, 2015, 7:05:00 AM
to Redis DB
Hi,
I am trying to do a bulk insert using the hiredis C driver.
For this, I use redisAppendCommand to append a number of keys. My calls look like this:
             redisAppendCommand(context, "SET %d %d", key, value);

After appending all the keys, the program calls redisGetReply:
             redisGetReply(context, NULL);
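
For reference, here is a rough sketch of how the two calls fit together (variable names and the key count are illustrative, not the original code); hiredis queues one pending reply per appended command, so redisGetReply is normally called once for each appended command:

    /* Sketch only: append the commands, then read one reply per command. */
    for (int key = 0; key < 30000; key++) {
        int value = key;    /* illustrative value */
        redisAppendCommand(context, "SET %d %d", key, value);
    }

    void *reply = NULL;
    for (int key = 0; key < 30000; key++) {
        if (redisGetReply(context, &reply) != REDIS_OK)
            break;          /* context->errstr describes the failure */
        freeReplyObject(reply);
    }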

The problem is that only some of the inserts succeed: the first keys are added, but not all of them, and the behavior is not always the same. Inserting 30K (30,000) keys triggers the problem, while inserting 10K keys works as expected.

I think this might be a memory issue and that I need to allocate more memory to my server.
Is that right, or could this be another problem? If it is the problem, how can I fix it?
Thanks
Thanks

Didier Spezia

Aug 11, 2015, 11:04:39 AM
to Redis DB

You cannot run very large pipelines if you read the replies only at the end.
Otherwise data accumulates in the communication buffers until Redis
decides the buffer is too big and closes the connection.
It is unrelated to the memory allocated to the server.

Ideally, you would read the first replies as soon as possible, but since you
use the synchronous API, that is not convenient. The easy way to fix it is
to loop over batches of n items (for instance n = 1000): instead of running
1 round trip of 30K commands, run 30 round trips of 1K.
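
For example, a minimal sketch of that batching approach, assuming a local Redis instance; TOTAL_KEYS, BATCH_SIZE, and the values written are placeholders, not your original code:

    #include <stdio.h>
    #include <hiredis/hiredis.h>

    #define TOTAL_KEYS 30000   /* number of keys mentioned above */
    #define BATCH_SIZE 1000    /* suggested batch size n */

    int main(void) {
        redisContext *c = redisConnect("127.0.0.1", 6379);
        if (c == NULL || c->err) {
            fprintf(stderr, "connect failed: %s\n", c ? c->errstr : "out of memory");
            return 1;
        }

        for (int start = 0; start < TOTAL_KEYS; start += BATCH_SIZE) {
            int n = (TOTAL_KEYS - start < BATCH_SIZE) ? TOTAL_KEYS - start : BATCH_SIZE;

            /* Append one batch of SET commands to the output buffer. */
            for (int i = 0; i < n; i++) {
                int key = start + i;
                int value = key;    /* illustrative value */
                redisAppendCommand(c, "SET %d %d", key, value);
            }

            /* Read the replies for this batch before appending the next one,
               so the communication buffers never grow unbounded. */
            for (int i = 0; i < n; i++) {
                redisReply *reply = NULL;
                if (redisGetReply(c, (void **)&reply) != REDIS_OK) {
                    fprintf(stderr, "redisGetReply failed: %s\n", c->errstr);
                    redisFree(c);
                    return 1;
                }
                if (reply->type == REDIS_REPLY_ERROR)
                    fprintf(stderr, "SET failed: %s\n", reply->str);
                freeReplyObject(reply);
            }
        }

        redisFree(c);
        return 0;
    }

Each round trip now carries at most 1,000 pending replies, so neither the client nor the server buffers can grow without bound.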

Regards,
Didier.
 