Hi guys,
Have you encountered this case? The gevent tutorial says that multiprocessing.Queue will hang when monkey patched. Is there a better solution for this case? Thanks!
Details about the question:
It's a producer/worker workflow combining multiprocessing and gevent. I want to share some data between processes via multiprocessing's Queue, while at the same time gevent producers and workers get data from and put tasks onto those queues.

task1_producer generates some data and puts it into q1; task1_worker consumes the data from q1 and puts the generated data into q2 and q3.

Then task2 does the same for the next stage.

The problem is that data has been inserted into q2 and q3, but nothing happens in task2. If you add some logging in task2, you will find that q3 is empty. Why does this happen? What is the best way to share data between processes?
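To show the intended topology, here is a minimal, self-contained sketch of the two-stage pipeline described above, using plain multiprocessing only (no gevent monkey patching). All function names, queue contents, and the sentinel convention are illustrative assumptions, not taken from the original code; the point is that the q1 -> (q2, q3) -> task2 fan-out itself works when the queues are used from unpatched processes.

```python
# Two-stage pipeline sketch: producer -> q1 -> worker -> q2/q3 -> task2.
# Illustrative only; names and data are made up for this example.
import multiprocessing as mp

def task1_producer(q1):
    # Generate some data, push it into q1, then a sentinel marking the end.
    for i in range(5):
        q1.put(i)
    q1.put(None)

def task1_worker(q1, q2, q3):
    # Consume q1 and fan derived data out to q2 and q3.
    while True:
        item = q1.get()
        if item is None:
            q2.put(None)
            q3.put(None)
            return
        q2.put(item * 2)
        q3.put(item * 3)

def task2_worker(q3, results):
    # Second stage: consume q3 and report derived results.
    while True:
        item = q3.get()
        if item is None:
            return
        results.put(item + 1)

def run_pipeline():
    q1, q2, q3, results = (mp.Queue() for _ in range(4))
    procs = [
        mp.Process(target=task1_producer, args=(q1,)),
        mp.Process(target=task1_worker, args=(q1, q2, q3)),
        mp.Process(target=task2_worker, args=(q3, results)),
    ]
    for p in procs:
        p.start()
    # Drain the output (5 items) before joining so no feeder thread blocks.
    out = sorted(results.get() for _ in range(5))
    for p in procs:
        p.join()
    # Drain q2 as well; its consumer is omitted in this sketch.
    while q2.get() is not None:
        pass
    return out

if __name__ == "__main__":
    print(run_pipeline())  # prints [1, 4, 7, 10, 13]
```

Once monkey patching enters the picture, the blocking q.get() calls above stop cooperating with gevent's event loop, which is why the gevent docs steer people toward gipc or gevent-native primitives instead.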
myPipe, yourPipe = msgpipe.CreatePipeEndpoints(msgpipe.T_ANON)
messagePipe = msgpipe.MessagePipe(myPipe)
messagePipe.AddHandlers({'done_obj': done_obj})
helper = gipc.start_process(helper_main, args=(yourPipe,))
def main(_mypipe):
    messagePipe = msgpipe.MessagePipe(GV.myPipe)
    messagePipe.AddHandlers({"add_obj": add_obj})
Comparison:

Queue consuming time:      4.76837158203125e-06 s
redis rpop consuming time: 0.046524763107299805 s

* The Redis server runs on the same machine.