Newbie: does the server clean up if a user exits the browser instead of shutting the notebook down?


Andy Davidson

Mar 15, 2016, 3:28:59 PM
to jup...@googlegroups.com
I started using IPython notebook about 2 years ago with Spark (Spark now uses Jupyter). Notebooks ROCK!

I now need to create an interactive dashboard for my customers. It will make it easy for them to select data and do some graphing. I currently use the standard notebook server and am starting to look into the multiuser server.

In general, Spark can consume huge amounts of resources. I want to make sure users do not accidentally create a “denial of service attack” by forgetting to shut down notebooks.

Does the server do any kind of automatic clean up?

Kind regards

Andy


Jonathan Frederic

Mar 15, 2016, 10:56:19 PM
to jup...@googlegroups.com

I don't think the server does this itself, but there may be a user-contributed extension for it. Tmpnb closes entire notebook servers after a given span of idleness.


Andy Davidson

Mar 16, 2016, 1:25:19 PM
to jup...@googlegroups.com
Thanks Jon

I imagine this is a common problem for anyone who runs a multiuser notebook server. Spark jobs may run for a long time and use an enormous amount of cluster resources.

Andy

I’ll cross-post on the Spark mailing list and see what they have to say.

Dan Allan

Mar 17, 2016, 2:54:36 PM
to Project Jupyter
We have had this problem in the JupyterHub at Brookhaven National Lab as well. Users come and go, and many of them don't figure out that they need to shut down their kernels.

As you may know, there is a JupyterHub configuration setting that allows admins to enter other users' servers and shut down kernels selectively without shutting down their servers entirely. Depending on the ratio of users to admins, this might be a better solution than an automated shutdown, which risks interrupting important work.

Dan
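[For reference, the setting Dan alludes to lives in `jupyterhub_config.py`. A minimal sketch; option names follow the JupyterHub documentation, and the user name "alice" is a placeholder — check these against your JupyterHub version:]

```python
# jupyterhub_config.py -- grant admins access to other users' servers.
c = get_config()

# Users who may administer the hub ("alice" is a placeholder).
c.Authenticator.admin_users = {"alice"}

# Allow admins to access other users' single-user servers,
# so they can shut down stray kernels selectively.
c.JupyterHub.admin_access = True
```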

Andy Davidson

Mar 17, 2016, 6:05:59 PM
to jup...@googlegroups.com
Hi Dan

Thanks. Do you think it would be possible to automate this somehow? I would hate to have to hire someone to deal with this issue.

Andy

P.S. I did not receive a reply from the Spark mailing list.


Thomas Kluyver

Mar 17, 2016, 6:38:26 PM
to Project Jupyter
JupyterHub has an API, so scripts can do things like this. I know there's an example somewhere of a daemon/cron job that talks to JupyterHub and kills notebook servers that have been inactive for more than a set time. One of Jess, Kyle or Min should be able to point you to this.

Thomas
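[The core of such a script is just "list users, keep the ones whose server has been idle too long." A minimal sketch of that selection logic; the record fields follow the JupyterHub users API (`GET /hub/api/users`), while the function name and sample data are illustrative, and the actual `DELETE /hub/api/users/:name/server` call plus API-token handling are omitted:]

```python
from datetime import datetime, timedelta, timezone

def servers_to_cull(users, max_idle_seconds, now=None):
    """Return names of users whose running server has been idle too long.

    `users` is the JSON list from GET /hub/api/users; each record carries
    "name", "server" (URL, or null if none running), and "last_activity".
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(seconds=max_idle_seconds)
    idle = []
    for user in users:
        if not user.get("server"):  # no server running for this user
            continue
        # Timestamps are ISO 8601; normalize a trailing "Z" to an offset.
        last = datetime.fromisoformat(user["last_activity"].replace("Z", "+00:00"))
        if last < cutoff:
            idle.append(user["name"])
    return idle
```

[A cron job would call this on the API response and then issue an authenticated DELETE for each returned user's server.]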

MinRK

Mar 18, 2016, 5:49:16 AM
to jup...@googlegroups.com

The cull-idle-servers script is in the JupyterHub examples. If you want to cull idle kernels of single-user servers, that’s more work since this information is not tracked at a kernel level.

-MinRK
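[Note: later releases of the notebook server (5.1 and up, so after this 2016 exchange) did add built-in kernel culling. A sketch of the relevant `jupyter_notebook_config.py` options; the timeout values are examples, and the option names should be checked against your notebook version:]

```python
# jupyter_notebook_config.py -- kernel culling (notebook 5.1+ only).
c = get_config()

# Shut down kernels after one hour with no activity.
c.MappingKernelManager.cull_idle_timeout = 3600

# How often (seconds) to check for idle kernels.
c.MappingKernelManager.cull_interval = 300

# Cull kernels even if a browser tab is still connected to them;
# without this, an abandoned open tab keeps the kernel alive.
c.MappingKernelManager.cull_connected = True
```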


Andy Davidson

Mar 18, 2016, 1:02:09 PM
to jup...@googlegroups.com

Jafar Sharif

Feb 6, 2019, 5:39:48 AM
to Project Jupyter

Hi Min,
     I have created a config file to cull kernels that are idle for more than one hour, but it is not killing notebooks that have been idle that long. Attaching the config file.
jupyter_notebook_config (1).py

Jafar Sharif

Feb 13, 2019, 4:41:53 AM
to Project Jupyter
Hi Andy,
        Has your problem been solved? Please let me know how; I am facing the same issue.