Jupyter on VR


Guillermo Valle

Sep 11, 2020, 1:39:47 PM
to Teaching with Jupyter Notebooks
Hi,

I just wanted to ask the community here if there would be interest in an idea I have.

I am thinking of creating a Jupyter frontend in a social VR game, NeosVR in particular. This game offers a lot of functionality for collaborating in a VR/immersive environment, and it's a dream of mine to enable scientists and educators to use this tool to improve how they do remote teaching and collaboration.


Here is an outdated video, but one which I think shows the feeling of using Neos quite well: https://www.youtube.com/watch?v=vUBaZQNn3UY

So I am curious: if you could use Jupyter kernels and notebooks from this environment, in a way that allows for visualization and interactivity, would you? In either case, why or why not?

Thanks!
Guillermo

Wes Turner

Sep 11, 2020, 5:14:16 PM
to Guillermo Valle, Teaching with Jupyter Notebooks
Cool idea! I'm interested to learn more about your use cases.

## RISE, Thebe

> RISE allows you to instantly turn your Jupyter Notebooks into a slideshow. No out-of-band conversion is needed, switch from jupyter notebook to a live reveal.js-based slideshow in a single keystroke, and back.

RISE lets you execute code within the live reveal.js slides, whereas exporting to reveal.js saves static slides.

"Option to save notebook as reveal slides (.html)"

> What is possible with static RISE reveal.js slide outputs? Could cells be run with a configurable kernel w/ e.g. thebe? And/or, Would it be necessary to pass a repo URL & nb path to a binderhub server such that a container with the appropriate dependencies could be provisioned?

^^ "Running code against a local kernel connection #199" (w/ Thebe)

Thebe

> Thebe turns your static HTML pages into interactive ones, powered by a kernel. It is the evolution of the original Thebe project with javascript APIs provided by JupyterLab.
>
> [...]
> It is static for now. You can activate Thebe by pressing the button below. This will ask mybinder.org for a Python kernel, and turn the code cell into an interactive one with outputs!


Thebe + a BinderHub instance is one way to avoid sharing shell access through a shared Jupyter Notebook / JupyterLab session.
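
For reference, here is a rough Python sketch of what Thebe does under the hood when it asks a BinderHub for a kernel: it hits the build endpoint and follows the server-sent events until the server is ready. The endpoint shape and the event fields ("phase", "url", "token") are my reading of the public BinderHub API, so treat this as an illustration rather than a recipe:

```python
# Hedged sketch: request a mybinder.org launch for a repo and wait for "ready".
# The repo/ref below are placeholders; error handling is omitted.
import json
import requests

def launch_binder(owner="binder-examples", repo="requirements", ref="master"):
    url = f"https://mybinder.org/build/gh/{owner}/{repo}/{ref}"
    with requests.get(url, stream=True) as resp:
        for line in resp.iter_lines(decode_unicode=True):
            if not line or not line.startswith("data:"):
                continue  # the endpoint streams server-sent events
            event = json.loads(line[len("data:"):])
            print(event.get("phase"), event.get("message", "").strip())
            if event.get("phase") == "ready":
                # The notebook server URL and token can then be handed to a
                # Jupyter kernel/services client (which is what Thebe does).
                return event["url"], event["token"]

server_url, token = launch_binder()
```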


## Generic VNC Support

OpenCobalt VR, amongst many innovative features, has in-world VNC remote desktop support for any application.

## Multi-user collaborative real-time Jupyter notebooks

There are a number of solutions for multi-user collaborative Jupyter notebooks:

- CoCalc

- jupyter-rtc

  > This Real Time Collaboration monorepo contains current work on Real Time collaboration for use in JupyterLab and other Jupyter applications.


- Are Jupyter notebooks supported in VS Code Live Share?
  "Collaboration (multi user) support"

- FWIU, Colab doesn't support collaborative multi-user real-time features due to the shutdown of the Realtime API (and nobody has reimplemented support on top of e.g. Firebase)


- Audit logs are essential when users are sharing shell access, with or without network access.


## VR rich text output

Is there a need for something like _repr_vrml_?

VRML has likely been superseded by newer open standards (e.g. X3D, glTF)?

Jupyter supports e.g. _repr_html_, _repr_svg_, _repr_json_, and _repr_mimebundle_; so it should be possible to add a 3D object renderer widget with something like three.js without adding any new _repr_ methods.
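
For example (a toy sketch; the class, the to_gltf helper, and the choice of "model/gltf+json" as the mime type are mine, not an existing Jupyter API), an object can advertise a 3D payload through the existing _repr_mimebundle_ hook and let any frontend that understands that mime type render it:

```python
class PointCloud:
    """Toy object whose rich display goes through _repr_mimebundle_."""

    def __init__(self, points):
        self.points = points  # list of (x, y, z) tuples

    def _repr_mimebundle_(self, include=None, exclude=None):
        # text/plain stays as the fallback for plain consoles; a VR or
        # three.js-based frontend could register a renderer for the 3D entry.
        return {
            "text/plain": f"PointCloud with {len(self.points)} points",
            "model/gltf+json": self.to_gltf(),
        }

    def to_gltf(self):
        # Placeholder: a real implementation would serialize the points
        # into an actual glTF document.
        return {"asset": {"version": "2.0"}, "points": self.points}
```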





Brian Granger

Sep 14, 2020, 2:58:32 PM
to Wes Turner, Guillermo Valle, Teaching with Jupyter Notebooks
As a regular user of VR (for sim racing) I am interested to see where this goes, but I don't have any particular insights or ideas, and I'm not sure how I would use Jupyter in a VR context.



--
Brian E. Granger

Principal Technical Program Manager, AWS AI Platform (brgr...@amazon.com)
On Leave - Professor of Physics and Data Science, Cal Poly
@ellisonbg on GitHub

Guillermo Valle

Sep 22, 2020, 4:34:06 PM
to Teaching with Jupyter Notebooks
Hi, 

thank you for your replies!

I am not sure of all the ways it could be used, in part because of my limited imagination ^^, but also because VR is very much a developing technology. I can, though, think of some ways it could already be used today:

1) Teaching/presentations. Some people are starting to do these in VR due to the increased immersion and the ability to use shared-space tools like boards, 3D models, spaces, etc. Having Jupyter embedded in these would add extra opportunities for teaching and live coding.
2) Collaborative coding. Personally, some of my most fun experiences in VR by far have been collaboratively working on projects inside NeosVR, using its in-game building and scripting tools: from building worlds, to prototyping a neural net, to serendipitously playing with 2D planar curves, which we then turned into a music visualizer, etc. To explain how it feels: it's basically like pair programming, but where your program can be laid out around a space however you want, and you can change size, comment and sketch on the program, etc. Supercharging the current in-VR programming with a full scientific tool would allow doing this as part of existing projects that use Python/Jupyter, as well as using the more powerful tools in the Jupyter ecosystem.
3) Packaging pieces of Python code to create more advanced in-VR game objects that can interact with each other. A "library" could be "imported" by spawning a gem in the world that gives other objects a special functionality, which runs in a Python kernel. This falls under a more experimental "using Jupyter for VR" rather than the "VR for Jupyter" of the above two ideas, but the two concepts mix as well.
4) Using the power of 3D game engines to create new types of visualizations, and the increasing set of VR sensors/peripherals to explore new types of interactivity too.

Currently, there are several limitations too: the cost of VR systems, the lack of certain tools (for example, there are not many good systems for typing in VR), as well as more quantitative improvements that are needed, like lower audio latency and more comfortable headsets. But VR manufacturers are working hard on these, as well as on bringing costs down (cloud VR could be a big help here, and some people are already using it with Shadow + Quest).

I have already begun working on the system. Here are a couple of videos giving an idea of its current state: https://www.youtube.com/watch?v=W2_1HyXo18Y and https://www.youtube.com/watch?v=wRavYJ6F1JM

Brian (or anyone), I could show you around in VR if you want to see it and perhaps get a better idea of the possible uses. Neos also has a desktop mode, by the way.
There are a couple of other VR platforms I also want to explore and compare.

--------------------------

By the way, a longer-term thing I am working on is integrating ML tools into VR, for example to learn from human movement data, and I would like to use the Jupyter integration for that project, which I will most likely be working on part-time during my postdoc. My dream is to make the best tools to allow scientists, educators, and other people to collaborate, and then work on making/researching things that are only possible with these new infrastructures (e.g. social VR) ^^. So if anyone is interested in these things, I want to meet you!

Wes Turner

Sep 22, 2020, 9:59:14 PM
to Guillermo Valle, Teaching with Jupyter Notebooks
Yellowbrick Visualizers may be something to integrate with for learning exercises. It may be that visual intuition is more useful than well-defined criteria for e.g. evolutionary AutoML.
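
For instance, a standard Yellowbrick usage sketch with a scikit-learn toy dataset (nothing VR-specific; dataset and estimator are arbitrary):

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from yellowbrick.classifier import ConfusionMatrix

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# The visualizer wraps the estimator and draws the evaluation as a matplotlib figure.
viz = ConfusionMatrix(LogisticRegression(max_iter=1000))
viz.fit(X_train, y_train)
viz.score(X_test, y_test)
viz.show()
```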



You're likely already familiar with ipywidgets. 
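
e.g. a minimal interact example (the parameter name and the callback body are arbitrary):

```python
from ipywidgets import FloatSlider, interact

@interact(freq=FloatSlider(min=0.1, max=5.0, step=0.1, value=1.0))
def update(freq):
    # Re-runs on every slider change; in practice this would redraw a plot
    # or push updated parameters to the VR side.
    print(f"frequency = {freq}")
```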

CadQuery is a declarative parametric CAD library written in Python on top of OpenCASCADE (it was formerly based on FreeCAD).

> An extension to render cadquery objects in JupyterLab via pythreejs
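
A typical CadQuery snippet looks like this (adapted from the plate example in the CadQuery docs; the jupyter_cadquery import path varies between releases, so treat it as approximate):

```python
import cadquery as cq
from jupyter_cadquery import show  # import location differs across jupyter-cadquery versions

# A parametric plate with a counterbored hole at each corner of a construction rectangle.
result = (
    cq.Workplane("XY")
    .box(80, 60, 10)
    .faces(">Z")
    .workplane()
    .rect(60, 40, forConstruction=True)
    .vertices()
    .cboreHole(2.4, 4.4, 2.1)
)

show(result)  # rendered in JupyterLab via pythreejs
```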


Slicer is a "free cross-platform open-source medical image processing and visualization system"
https://github.com/Slicer/SlicerJupyter

> Extension for 3D Slicer that allows the application to be used from Jupyter notebook

PyViz lists a number of visualization tools:

> SciVis Libraries
> Most of the libraries listed at PyViz.org fall into the InfoVis (Information Visualization) category of tools, visualizing arbitrary and potentially abstract types of information, typically in 2D or 2D+time plots with axes and numerical scales. Tools in the separate SciVis (Scientific Visualization) category focus on visualizing physically situated gridded data in 3D and 3D+time, often without spatial axes and instead providing an immersive visual experience of real-world physical datasets (see Weiskopf et al for a comparison). Desktop-GUI targeted SciVis tools build on the OpenGL graphics standard, while browser-based web applications usually leverage the related WebGL graphics standard.

> [List of tools]

> itk-jupyter-widgets, based on the Visualization Toolkit for JavaScript vtk.js and the Insight Toolkit (ITK), provides interactive 3D widgets for Jupyter to visualize and analyze images, point sets, and meshes

> Interactive Jupyter widgets to visualize images, point sets, and meshes in 2D and 3D
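
Minimal usage sketch (a synthetic NumPy volume; itkwidgets also accepts ITK/VTK images and meshes):

```python
import numpy as np
from itkwidgets import view

volume = np.random.rand(64, 64, 64).astype(np.float32)  # placeholder data
view(volume)  # interactive 3D volume/slice viewer widget
```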

 ...

Again, jupyter-rtc:


Wes Turner

Sep 22, 2020, 10:11:24 PM
to Guillermo Valle, Teaching with Jupyter Notebooks
Mayavi mlab (VTK, ...)
https://docs.enthought.com/mayavi/mayavi/mlab.html


- looks like there's not yet a JupyterLab extension for mayavi2 mlab
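
In the classic notebook, mlab can at least render inline if you switch backends; a rough sketch (the 'x3d' backend is one option, and this does not amount to a JupyterLab extension):

```python
import numpy as np
from mayavi import mlab

mlab.init_notebook('x3d')  # render inline instead of opening a separate VTK window

x, y = np.mgrid[-3:3:100j, -3:3:100j]
z = np.sin(x * y)
mlab.surf(x, y, z)  # the returned object displays in the notebook
```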

...

JupyterLab extension development

"Extension Developer Guide"

"JupyterLab Extensions by Examples"

...

AFAIU, Google Cardboard (or similar DIY patterns) is the lowest cost VR headset (given a decent phone)?


...

These docs are way out of date by now, but the Wikipedia concept URIs are there:

Wes Turner

Sep 29, 2020, 3:52:53 PM
to Guillermo Valle, Teaching with Jupyter Notebooks
https://github.com/QuantStack/ipygany

> 3-D Scientific Visualization in the Jupyter Notebook

- GPU support
- VTK support by way of https://docs.pyvista.org/
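
Rough usage sketch (assuming the PolyMesh.from_vtk loader shown in ipygany's examples; the sphere and filename are placeholders):

```python
import pyvista as pv
from ipygany import PolyMesh, Scene

# Build a simple surface with PyVista (VTK under the hood) and save it to disk.
sphere = pv.Sphere()
sphere.save("sphere.vtk")

# Load the VTK file into an ipygany mesh and display it as a GPU-backed widget.
mesh = PolyMesh.from_vtk("sphere.vtk")
Scene([mesh])
```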

Wes Turner

Nov 3, 2020, 1:56:27 PM
to Guillermo Valle, Teaching with Jupyter Notebooks
K3D-jupyter:
> K3D lets you create 3D plots backed by WebGL with high-level API (surfaces, isosurfaces, voxels, mesh, cloud points, vtk objects, volume renderer, colormaps, etc). The primary aim of K3D-jupyter is to be easy for use as stand alone package like matplotlib, but also to allow interoperation with existing libraries as VTK
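
Minimal point-cloud sketch (random data; K3D expects float32 positions):

```python
import numpy as np
import k3d

positions = np.random.random_sample((1000, 3)).astype(np.float32)

plot = k3d.plot()
plot += k3d.points(positions, point_size=0.02)
plot.display()  # WebGL widget in the notebook
```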

Aaron Watters

Nov 5, 2020, 9:46:16 AM
to Wes Turner, Guillermo Valle, Teaching with Jupyter Notebooks
Hi folks.  Also please see https://github.com/AaronWatters/feedWebGL2
which provides WebGL2 isosurfaces and other 3D features directly in the browser interface, with no external binaries required.
Unfortunately, the "demos" page is quite old and needs to be updated.
Please launch a Binder using the binder link and look at the example notebooks.

Thanks, and please post an issue if you have any questions or concerns,

-- Aaron Watters

