'SBOX_FATAL_MEMORY_EXCEEDED' in Chrome

Austin Hernandez

Apr 24, 2021, 2:09:34 PM
to Project Jupyter
I was running high-throughput computations over thousands of items in a Jupyter notebook. At some point the program crashed, and now I am unable to open the Jupyter notebook again; I get the 'SBOX_FATAL_MEMORY_EXCEEDED' error in Chrome. I have cleared all outputs, restarted the kernel, restarted the server, and made copies / downloaded and re-uploaded the notebook, all to no avail. Upon downloading the notebook, I see that it is 143,000 KB! Obviously that is far too large, but I have no idea how it ballooned in size after the crash... It was working fine up until now. Any help with getting this notebook back is appreciated, as I cannot even access the code, which is pretty important.

Anandraj Jaganathan

Apr 24, 2021, 2:54:31 PM
to jup...@googlegroups.com
Hi friends,
I need some help: I need to install pyspark for Jupyter Notebook. Can someone please help me?

Robert Schroll

Apr 26, 2021, 4:51:38 PM
to jup...@googlegroups.com, Project Jupyter
On Apr 24, 2021, at 11:09 am, Austin Hernandez <amh5...@gmail.com> wrote:
At some point the program crashed, and now I am unable to open the Jupyter notebook again; I get the 'SBOX_FATAL_MEMORY_EXCEEDED' error in Chrome. [...] Upon downloading the notebook, I see that it is 143,000 KB!

I know you said that you cleared the outputs, but a huge file size is generally a sign that some sort of output has been saved. If you haven't done so already, I'd try nbconvert to strip the output at the command line: https://mindtrove.info/jupyter-tidbit-clear-outputs/
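Concretely, that should be a one-liner; something like the following rewrites the file in place with every output removed ("notebook.ipynb" here is just a placeholder for your actual filename):

    jupyter nbconvert --clear-output --inplace notebook.ipynb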
If that doesn't work, you could try using nbconvert to convert the notebook to a script, which I think should leave you with only your code: https://nbconvert.readthedocs.io/en/latest/usage.html#convert-script
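That would look something like this (again with a placeholder filename); for a Python notebook it should write the code out as notebook.py:

    jupyter nbconvert --to script notebook.ipynb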
Alternatively, a .ipynb file is just a JSON file, so you can pull out your favorite JSON parsing tools to explore it and extract the code you need saved.
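If it comes to that, here's a minimal sketch in Python (the filenames are placeholders, and it assumes a current nbformat-4 notebook, where the cells live in a top-level "cells" list):

    import json

    # A .ipynb file is plain JSON; load it and pull out just the code cells.
    with open("notebook.ipynb") as f:  # placeholder filename
        nb = json.load(f)

    with open("recovered_code.py", "w") as out:
        for cell in nb["cells"]:
            if cell["cell_type"] == "code":
                # "source" may be a list of lines or a single string;
                # join() handles both.
                out.write("".join(cell["source"]) + "\n\n")

That drops all the giant outputs on the floor and leaves you a plain .py file with just your code.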

Good luck!
Robert
