This deficiency in Jupyter is not going to free up the resources needed to maintain the Sage notebook, so I'm afraid the reality is that we're going to have to do without it. You could check whether CoCalc does a better job: it has its own notebook, and I think they wrote/adapted their own ipynb frontend too.
Otherwise, a workaround would be to structure long-running computations a little differently: write state to a separate file rather than relying on output logged to the notebook. The kernel doesn't seem to lose its state; it's just the output that's lost. So once the process has completed (which you could then see from the file), the notebook is ready for interaction again.
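To sketch what I mean, here is a minimal example of that pattern: each step of the computation dumps its progress to a file on disk, so you can check on it (and recover the results) even if the browser session was disconnected and the notebook output is gone. The file name, step count, and the dummy workload are all placeholders for your actual computation.

```python
import json

STATE_FILE = "progress.json"  # hypothetical path; pick anywhere writable

def long_computation(n_steps=5):
    """Toy long-running job that checkpoints progress to disk."""
    results = []
    for step in range(n_steps):
        results.append(step * step)  # stand-in for the real work
        # Persist progress after every step so it survives a lost
        # browser connection: the notebook output may vanish, but
        # this file will not.
        with open(STATE_FILE, "w") as f:
            json.dump({"step": step + 1,
                       "total": n_steps,
                       "results": results}, f)
    return results

long_computation()
```

You can then inspect `progress.json` from a terminal (or a fresh notebook cell) at any time to see how far the job has gotten.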
Another workaround is not to close the browser at all, but instead to use something like VNC or xpra to disconnect from and reconnect to your running browser remotely.
Obviously, changing Jupyter's behaviour would be the better solution, but if the deficiency has been around for a long time and you don't have definite plans to fix it yourself (I wouldn't know how to), the reality is that this is unlikely to change.