User Processes not releasing Memory


eom...@anaconda.com

Feb 20, 2019, 12:12:15 PM
to Project Jupyter
I keep seeing processes called "ZMQbg/1" over and over that show no CPU
activity. I'm trying to understand what these processes are and how they
can be managed, other than killing them.

I have seen these processes for multiple Notebook users, and some users' jobs error out
because they run out of RAM. From the information I've found, these look like ZMQ processes. I believe Jupyter uses ZMQ in the background to communicate, hence the processes.

Can anyone confirm this, and if so, what's the recommended way to manage these processes?
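In case it helps with triage: on Linux, entries like "ZMQbg/1" in ps output are usually threads of a larger process rather than standalone processes, and you can check a process's thread names through /proc. A minimal sketch (Linux-only; `zmq_threads` is just an illustrative helper name, not anything from Jupyter):

```python
import os

def zmq_threads(pid):
    """Return the thread names of a process, read from /proc/<pid>/task/*/comm.

    libzmq names its background I/O threads "ZMQbg/<n>" on Linux, so if those
    names show up here, the ps entries are libzmq threads, not extra processes.
    """
    names = []
    task_dir = f"/proc/{pid}/task"
    for tid in os.listdir(task_dir):
        try:
            with open(f"{task_dir}/{tid}/comm") as f:
                names.append(f.read().strip())
        except OSError:
            pass  # the thread exited while we were scanning
    return names

# Inspect the current process; run against a Jupyter server PID or a kernel
# PID to see whether any ZMQbg/* threads appear.
print(zmq_threads(os.getpid()))
```

Running this against the PID that owns the "ZMQbg/1" entries should make it clear whether killing them would actually kill the whole server or kernel process.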

Roland Weber

Feb 21, 2019, 2:00:36 AM
to Project Jupyter
Those are background threads of libzmq, as mentioned here:

Jupyter uses ZMQ to connect to the running kernels on the local machine. The threads have little to do besides passing messages, so they won't accumulate much CPU time. Restarting the kernel(s) will close the ZMQ connections and open new ones; maybe that has an effect on the background threads and/or their memory consumption? That was suggested here:

I don't know why these threads should consume much memory... maybe the kernel or Jupyter is unresponsive, and ZMQ messages get buffered beyond reasonable limits? Jupyter does act as a bridge between the kernels (via ZMQ) and frontends (via WebSockets). I think some message buffering was added months ago for cases where the WebSocket connection is lost and gets re-established later, so the browser can still retrieve the cell output. But I wouldn't expect cell output to be big enough to cause OOM problems. Also, the Spyder issue linked above mentions there was no significant kernel activity.
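To illustrate the buffering point: if messages for a disconnected frontend were queued without a cap, memory would grow for as long as the kernel kept producing output. A bounded buffer avoids that by discarding the oldest entries. This is not Jupyter's actual implementation, just a sketch of the idea with an assumed cap:

```python
from collections import deque

# Hypothetical cap on buffered kernel messages; Jupyter's real buffering
# lives inside the server and is configured differently.
MAX_BUFFERED = 1000

buffer = deque(maxlen=MAX_BUFFERED)

# Simulate a kernel producing output while the WebSocket is down: the deque
# silently drops the oldest message once the cap is reached.
for i in range(5000):
    buffer.append(f"msg {i}")

print(len(buffer))   # bounded at MAX_BUFFERED, however much was produced
print(buffer[0])     # oldest message still retained
```

With an unbounded list in place of the deque, the same loop would hold all 5000 messages, which is the failure mode that could plausibly lead to OOM if output is large and a frontend stays disconnected.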

Hope that helps,
  Roland

