Before MKL update:

$ jupyter-kernelspec list
Available kernels:
  python2    /home/rpatel/miniconda2/lib/python2.7/site-packages/ipykernel/resources

After MKL update:

$ jupyter-kernelspec list
Available kernels:
  python2    /home/rpatel/.local/share/jupyter/kernels/python2
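In case it is useful, here is a small sketch of how to check which spec is actually being picked up, using jupyter_client (the library Jupyter itself uses to resolve kernels). A user-level spec under ~/.local/share/jupyter/kernels takes precedence over the one shipped inside the conda environment, which would explain the change above.

from jupyter_client.kernelspec import KernelSpecManager

ksm = KernelSpecManager()
# Maps kernel name -> resource directory; user-level kernel dirs
# shadow specs installed inside the Python environment.
print(ksm.find_kernel_specs())
# Show the exact interpreter command the winning 'python2' spec launches.
print(ksm.get_kernel_spec('python2').argv)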
I also changed the following in the jupyterhub_config.py file, but I am still seeing the issue.
...
#c.Spawner.mem_guarantee = '8G'
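For reference, a minimal sketch of how these settings are usually written once uncommented (the value must be a quoted string, otherwise the config file raises a SyntaxError; the '16G' limit below is only illustrative, and the default LocalProcessSpawner does not actually enforce either setting):

# jupyterhub_config.py -- values are illustrative
c.Spawner.mem_guarantee = '8G'   # minimum memory; honored only by spawners that support it
c.Spawner.mem_limit = '16G'      # hard cap; LocalProcessSpawner ignores both settings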
Argh. Thank you, Thomas. It did run longer this time after I uncommented those lines (17 minutes as opposed to 10), but I still saw the same issue. Is there some limitation in Jupyter such that it cannot handle more than a certain number of GB of data, or query more than a certain number of million or billion rows from a PostgreSQL DB?
It's strange, because when I run the SQL script from SQL Workbench locally on my machine (which is less powerful than the server Jupyter is running on), I do get the resulting rows.
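As far as I know, Jupyter itself imposes no row or GB limit; what usually happens is that the kernel process gets killed when the fully materialized result set exhausts the memory the spawner or OS allows it. One work-around (a sketch, assuming pandas and SQLAlchemy are available; the connection string, table name, and chunk size are placeholders) is to stream the query in chunks instead of fetching all rows at once:

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string and query -- substitute your own.
engine = create_engine('postgresql://user:password@dbhost:5432/mydb')

total = 0
# With chunksize set, read_sql_query returns an iterator of DataFrames,
# so only one chunk is held in kernel memory at a time.
for chunk in pd.read_sql_query('SELECT * FROM big_table', engine, chunksize=100000):
    total += len(chunk)  # replace with real per-chunk processing
print(total)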