OpenVR/XR Integration (Prototyping)


Gareth Francis

May 18, 2021, 3:32:48 PM
to vsg-users : VulkanSceneGraph Developer Discussion Group

Hi all,

Quite a while ago now I tried to put together an OpenVR example (Linux, HTC Vive), made some progress and ended up getting stuck.

I've managed to dig up what I had; sadly it crashes during vkCreateInstance now, so I expect I might need to go back to a clean slate. The extremely prototype-y code is here: https://github.com/geefr/vsg-vr-prototype

The bit of it that did work is reading HMD & controller positions, and updating camera transforms/controller models within the scene. Other than migrating off my 'vrhelp' classes there's nothing too complicated on the OpenVR side of things; I imagine that would want to be in a vsg utility library of some kind.

The point I was stuck on before was how to present images to the headset - basically the render path needs the two views rendered into VkImages, which are then handed off to the VR compositor. I think I was trying to add a SubmitOpenVRCommand to perform the hand-off.
(More detail on the vulkan-openvr side of things: https://github.com/ValveSoftware/openvr/tree/master/samples)
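
For reference, the hand-off in Valve's hellovr_vulkan sample boils down to roughly the following - a sketch only, with the image/device handles as placeholders for whatever the render path produces:

```
#include <openvr.h>

vr::VRVulkanTextureData_t vulkanData{};
vulkanData.m_nImage = (uint64_t)leftEyeImage; // VkImage the left view was rendered into
vulkanData.m_pDevice = device;
vulkanData.m_pPhysicalDevice = physicalDevice;
vulkanData.m_pInstance = instance;
vulkanData.m_pQueue = queue;
vulkanData.m_nQueueFamilyIndex = queueFamilyIndex;
vulkanData.m_nWidth = eyeWidth;
vulkanData.m_nHeight = eyeHeight;
vulkanData.m_nFormat = VK_FORMAT_R8G8B8A8_SRGB;
vulkanData.m_nSampleCount = 1;

vr::Texture_t texture{&vulkanData, vr::TextureType_Vulkan, vr::ColorSpace_Auto};
vr::VRCompositor()->Submit(vr::Eye_Left, &texture); // and again for Eye_Right
```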


So, that's where I got to; if I can find the time I'll be spending more time on this. Any tips/input would be great, specifically on rendering the scene into the images (raw access to the Vulkan pointers/IDs is needed), or design thoughts on how it would all be laid out.

In general I think targeting OpenXR would be the thing to do, as I understand Valve aren't spending time on OpenVR any more (OpenVR is just what I had at the time/had some easy wrappers for).

Robert Osfield

May 18, 2021, 4:20:15 PM
to vsg-...@googlegroups.com
Thanks for the link to your work on OpenVR.  I haven't got a modern headset unfortunately, just an early Oculus dev system.  Will need to find a good excuse to get a modern headset :-)

I have some work to complete tomorrow, once that's finished I'll have a look at your code.

Cheers,
Robert.


Gareth Francis

May 18, 2021, 4:31:50 PM
to vsg-...@googlegroups.com
No rush, as it really doesn't work yet :) I was around 300 commits behind on vsg, and so far I've done just enough to make it compile again. I think it'll work with the latest OpenVR runtime, but I haven't tested it yet.

You might find the note about steamvr.vrsettings useful - installing SteamVR (which provides the OpenVR/XR runtime) and forcing it to use a null HMD driver should give a pure software view.
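
For reference, the relevant settings are roughly the following (from memory, so treat the exact keys as approximate) - in steamvr.vrsettings:

```
"steamvr" : {
    "requireHmd" : false,
    "forcedDriver" : "null",
    "activateMultipleDrivers" : true
}
```

plus "enable" : true in the null driver's own default.vrsettings.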

Not sure if that's related to my crash at all, or what values it gives for HMD positioning. I think there are some input emulators out there that could be useful for development, as OpenVR has an API for implementing tracked devices (custom trackers, light guns, gloves, whatever you fancy).


Gareth Francis

May 22, 2021, 5:30:17 PM
to vsg-...@googlegroups.com
Okay spent some time on it, got a little update and something working:

[screenshot]
Basic presentation to the headset is working - surprisingly simple in the end.
Testing should be possible on any SteamVR/OpenVR system, and in theory it'll build on Windows. The null driver for SteamVR works for viewing, but I couldn't find anything simple to feed in mock device positions.

There's a bunch of bugs/issues listed on the repo, but it will certainly be possible to integrate. The vsg-specific bits are really just rendering into 2 VkImages, and converting device matrices through to scene elements.
I've definitely got my matrices wrong in some way as there are lighting quirks, and I had to fiddle axes around in Blender to get things the right way up.
Tracking is a bit delayed and rendering is laggy. At the moment I've set it up so the headset is a sub-graph of the desktop render, but they shouldn't be bound together (say, if vsync is on, the headset should still run at 90+ Hz).

I'll come back to this in a week or so; for now I'm calling it a proof of concept. The code is messy, but if anyone's interested feel free to pick through it (https://github.com/geefr/vsg-vr-prototype).

Gareth Francis

Jun 5, 2021, 11:32:15 AM
to vsg-...@googlegroups.com
Hi all,

There's been some progress on this - the latest state is a little less messy, can display basic scenes the same as a desktop application, and I'm fairly sure I've got the world space set up correctly.
The current setup for loading models is creation in Blender, export to glTF, then running through vsgconv. All seems to work well there.

It should build & run on Windows now with a little fiddling; I'm mainly developing on Linux for now.
The example should demonstrate the basic needs for a real integration - I think it could probably be wrapped up in a vsg::VRViewer or similar class, providing the HMD view and an optional desktop mirror window.


The big issue remaining is how my view matrix interacts with the lighting:
To make the tracking work I added a rotation to map vsg world -> OpenVR world, and a y-flip, as part of the view matrix.
But in the shader the light direction is hardcoded in (I assume) vsg world space, so if I want the view to be correct the 'sun' is now horizontal to the scene.

So either I'll need the light positions to be taken from the scene (where I can transform them before shading), or I somehow need to move the axis swap to after shading... Not sure if that's even possible to do.
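
For illustration, the kind of correction I mean is something like this (a sketch, assuming a z-up vsg world and OpenVR's y-up tracking space; hmdViewMatrix is a placeholder):

```
// Rotate the z-up vsg world into OpenVR's y-up convention
auto vsgToVr = vsg::rotate(vsg::radians(-90.0), 1.0, 0.0, 0.0);
// Composed into the view matrix - so any light defined in vsg world space
// would need the same transform applied before shading
auto viewMatrix = hmdViewMatrix * vsgToVr;
```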

I'm fairly stumped on the last bit, so any suggestions would be helpful. Not sure what the status of lighting is yet, but it seems to be mostly hardcoded in vsgXchange?


Adrian Przekwas

Jun 5, 2021, 2:02:01 PM
to vsg-users : VulkanSceneGraph Developer Discussion Group
Hi,

Could be "laggy HMD tracking" issue caused by lack of asynchronous reprojection on Linux and Nvidia?
You could check it:
grep -i async ~/.steam/steam/logs/vrcompositor.txt

Btw, I'm the same guy who submitted the issue https://github.com/geefr/vsg-vr-prototype/issues/1

Gareth Francis

Jun 5, 2021, 3:54:07 PM
to vsg-...@googlegroups.com
Hiya,

I don't think so, but it's possible - I'll have to compare to a Windows build. Performance is great so far, so it shouldn't be reprojecting at all (it was a bit earlier, but it's now a steady 2 ms/frame for me).

When I had issues before it was caused by taking too long between waiting for poses and presenting the rendered images. The solution would be moving waitGetPoses around in the rendering sequence, or using the more advanced timing system in OpenVR (I think it puts a timer on the GPU and lets it do some pose prediction based on average frame time).

For the lighting thing, I suspect it could be reproduced without VR - it could be an issue when just loading a model and rotating it (like the teapot model?) if the shaders contain hardcoded positions/spaces... I'm wondering if that issue is specific to converted glTF/similar, something that hits that codepath in vsgXchange.



Robert Osfield

Jun 5, 2021, 4:33:11 PM
to vsg-...@googlegroups.com
Hi Gareth,

The lighting issue is likely to be something resolvable once I've completed the positional state work, which I've discussed in the vsg-users thread on this topic. Once there is positional state support in the core, we should be able to update the vsgXchange/Assimp and vsgXchange/OSG integrations to support light sources positioned in the world, or relative to the camera position.

To wrap up the OpenVR integration I would suggest something like a vsgVR library which would depend upon OpenVR and the VSG, providing the viewer setup/configuration, or a dedicated viewer. I will need to review your code to know how best this might be done; alas I'm stretched a bit thin right now so won't be able to look at it right away.

Cheers,
Robert.


Gareth Francis

Jun 5, 2021, 5:10:54 PM
to vsg-users : VulkanSceneGraph Developer Discussion Group
Hi Robert,

Ah okay, I won't worry about the lighting then.
The current code isn't especially clean (all in the application rather than classes/etc). No rush on the review as I'm also quite busy.

I think the breakdown would be the camera setup, some container nodes in the scene (bound to controllers/tracked devices), maybe events for the controllers, and presentation calls in the render loop.
I think a viewer makes sense for that, but maybe one that's primarily VR, rather than desktop window + VR as an addition.

We'll probably want to abstract the OpenVR calls away, for future OpenXR/whatever the Oculus SDK is. Currently that's an old wrapper I wrote, so it needs porting, removal of GLM and such.


...Tracking performance on Windows is good, no lag. So yeah, quite possibly that's down to Linux/Nvidia quirks.

Thanks
Gareth

Adrian Przekwas

Jun 5, 2021, 5:19:38 PM
to vsg-users : VulkanSceneGraph Developer Discussion Group
Gareth,

I think waitGetPoses is in the right position, since it blocks the loop until 3 ms before vblank and then applies rotational reprojection (IIRC). Have you tried getting the positions of the HMD and controllers directly from WaitGetPoses?
Instead of:

vr->waitGetPoses();

and asking for poses in a different part of the code, do something like this:

vr::TrackedDevicePose_t m_rTrackedDevicePose[vr::k_unMaxTrackedDeviceCount];
vr->WaitGetPoses(m_rTrackedDevicePose, vr::k_unMaxTrackedDeviceCount, nullptr, 0); // the last two args are an optional 'game pose' array and its count
if (m_rTrackedDevicePose[vr::k_unTrackedDeviceIndex_Hmd].bPoseIsValid)
{
    vr::TrackedDevicePose_t mPose = m_rTrackedDevicePose[vr::k_unTrackedDeviceIndex_Hmd];
    auto p = mPose.mDeviceToAbsoluteTracking.m; // row-major 3x4 device-to-tracking-space matrix
}
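
Mapping that pose into the scene is then a row-major to column-major conversion, something like this (a sketch; vsg::dmat4 is column-major, indexed as m[column][row]):

```
#include <vsg/maths/mat4.h>
#include <openvr.h>

// Convert OpenVR's row-major 3x4 pose into a vsg::dmat4
vsg::dmat4 toVsg(const vr::HmdMatrix34_t& pose)
{
    vsg::dmat4 m; // default-constructed as identity, so the (0,0,0,1) bottom row is in place
    for (int row = 0; row < 3; ++row)
        for (int col = 0; col < 4; ++col)
            m[col][row] = pose.m[row][col];
    return m;
}
```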

Robert,
Gareth mentioned the idea of using OpenXR eventually, and something like vsgXR (instead of vsgVR) would be great IMO. Especially since OpenXR needs a lot more boilerplate than OpenVR, and in many aspects seems similar to Vulkan (good specification, truly vendor independent, validation layers etc.).

Gareth,
Are you able to run any XR code on Nvidia/Linux? A few months ago that was impossible due to lacking driver features.

Gareth Francis

Jun 5, 2021, 5:41:28 PM
to vsg-users : VulkanSceneGraph Developer Discussion Group
Hi Adrian,

That should already be happening - the 'vr' pointer in this case is an (ugly) wrapper class; internally it gets poses for all connected devices (HMD + controllers) during waitGetPoses. I don't have any extra trackers/peripherals, but it should pick those up as well (the example ignores them for now).

I haven't tried XR at all yet, but it's on my list to learn at some point. SteamVR is set as the OpenXR runtime, if that means anything.
Worst case I'll just develop on Windows until the Linux situation is ready; it makes very little difference for me.

Thanks
Gareth

Robert Osfield

Jun 6, 2021, 6:40:45 AM
to vsg-...@googlegroups.com
Chipping in from a rather disconnected viewpoint - I haven't been on the VR side for a few years, and just did a quick search online to refresh myself. OpenXR, being a Khronos API and now a standard, would be the natural route for VR and the VSG. However, it does require the hardware/OS support combinations to be in place to make it work - I can't comment on these practicalities as I haven't done hands-on testing since the early days of Oculus.

It may be possible to create a vsgXR library that can have either OpenXR or OpenVR backends, but it may just be easier to have separate vsgXR and vsgVR libraries. If OpenVR is no longer being developed and is going to be fully replaced by OpenXR, then the OpenVR integration can just be seen as a stepping stone to/proof of concept for vsgXR, and left in an unpolished state.

On the lagginess front, with Vulkan you can select double or triple buffering. You can select this via vsg::WindowTraits; from the vsgviewer.cpp example:

  if (arguments.read("--double-buffer")) windowTraits->swapchainPreferences.imageCount = 2;
  if (arguments.read("--triple-buffer")) windowTraits->swapchainPreferences.imageCount = 3; // default

With triple buffering you can see high fps but will have higher latency.

The other thing to look out for is that it's possible to update uniform data on the GPU before the previous frame has completed rendering, so you can end up with the next frame's updates appearing in the previous frame's rendering. Buffering the copying of data is one way of avoiding this. This is a topic that the positional state work touches upon and has to resolve - the vsgclip example correctly handles this issue.
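
As a minimal illustration of the buffering idea (not the vsgclip implementation, just the concept):

```
#include <algorithm>
#include <array>
#include <cstdint>

// One CPU-side copy of the uniform data per frame in flight, so updates for
// the next frame never touch the copy the GPU may still be reading.
struct ViewUniforms { float viewMatrix[16]; };

constexpr uint32_t framesInFlight = 3;
std::array<ViewUniforms, framesInFlight> uniformCopies{};
uint32_t frameIndex = 0;

void updateForNextFrame(const float (&m)[16])
{
    std::copy(std::begin(m), std::end(m), uniformCopies[frameIndex].viewMatrix);
    // ... record the transfer of uniformCopies[frameIndex] into the GPU buffer
    //     used by this frame ...
    frameIndex = (frameIndex + 1) % framesInFlight;
}
```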

Push constants by contrast don't have this issue as they are stored directly in the command buffer.

Cheers,
Robert.

Gareth Francis

Jul 9, 2021, 12:35:35 PM
to vsg-...@googlegroups.com
Hi all,

I had a bit more time on this today, and it's now a semi-workable library with a vsg-like interface - it should be a suitable base for any API discussion/comment.
Repository has been renamed, so it's here now: https://github.com/geefr/vsgvr

The only backend for now is OpenVR, but there's a 'VRContext' concept for later OpenXR support, and the tracked device/controller classes aren't bound to any specific implementation.
On the lag aspect: I haven't looked any further, but SteamVR/Nvidia have just gained async reprojection on Linux; in theory that'll reduce any issues, as would the suggested buffering setup (OpenVR presentation isn't overly similar to desktop setups though, with limited control of the swapchain etc.).

The important info - the usage:
- Make a scene as normal
- Create a vsgvr::VRViewer instead of a normal Viewer
- The viewer makes its own internal Window (for now)
- Call VRViewer::createCommandGraphsForView, then assignRecordAndSubmit...
Everything else should be the same as desktop applications (or at least, I managed to quickly port a basic sample over) - see the sketch below.
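
In sketch form that's something like the following (method names as above; create() and the frame loop are assumptions based on vsg::Viewer, and assignRecordAndSubmit... is assumed to be the usual assignRecordAndSubmitTaskAndPresentation):

```
auto scene = vsg::read_cast<vsg::Node>("world.vsgt");

auto viewer = vsgvr::VRViewer::create(); // hypothetical signature
auto commandGraphs = viewer->createCommandGraphsForView(scene);
viewer->assignRecordAndSubmitTaskAndPresentation(commandGraphs);

// The usual vsg frame loop
while (viewer->advanceToNextFrame())
{
    viewer->handleEvents();
    viewer->update();
    viewer->recordAndSubmit();
    viewer->present();
}
```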

Thanks
Gareth

Adrian

Jul 10, 2021, 5:39:40 PM
to vsg-users : VulkanSceneGraph Developer Discussion Group
Hi Gareth,

I did some basic testing of the new release, and it looks really good.
It does not crash on RADV/Linux anymore. Also, I cannot reproduce most of the issues listed on GitHub: HMD tracking is not laggy, and there is no segfault on exit. I can only confirm the lighting issue.
Performance is quite good. I replaced world.vsgt with the NASA DIY Mars Rover model (https://github.com/nasa-jpl/open-source-rover) as a stress test, and I was able to get 10-11 ms per frame (Radeon 5700 XT and Valve Index).

Gareth Francis

Feb 5, 2022, 9:17:26 AM
to vsg-users : VulkanSceneGraph Developer Discussion Group

Spent some time on vsgvr today, for the following:
- Fixing the build for recent vsg updates
- Updating to reflect the inverted depth buffer - vsgvr will likely only work with the default setup, so may need to revisit this later
- Testing out the vsgXchange lighting changes - as long as lights are defined in the models and exported through glTF, it seems to work perfectly now

Overall it's working great - tested on Windows, but it should be fine on Linux as well (my SteamVR is misbehaving again).
The way it's integrated probably isn't great, so best to call it a functional prototype for the moment.

@robert: In case you hadn't spotted it yet, assimp_pbr_frag.cpp in vsgXchange needs splitting into shorter raw-strings. Not sure if there's an easy way to catch those without doing a Windows build.

Robert Osfield

Feb 5, 2022, 11:17:53 AM
to vsg-...@googlegroups.com
Hi Gareth,



Thanks for the update, results are looking good. I haven't reviewed the code so can't yet provide any insights into how best to take it beyond the prototype stage. In the last few days I've been working on vsg::RenderPass, adding support for the Vulkan RenderPass2 API, which has multiview support - something that would be an interesting addition on the VR side.

In the next couple of months I'd like to get myself an HMD; as I just have a Linux dev system it'll need decent Linux support. Happy to take suggestions.
 

Thanks for the note. I've spotted this VisualStudio-specific problem via the VulkanSceneGraph automated builds, as vsg::Builder uses the same shaders as vsgXchange::Assimp. I had to add handling of long strings to the vsgXchange::cpp writer, so it now breaks the string up into chunks to work around this VisualStudio issue. I've now checked in an update of the shader. The way I do it is:

    vsgconv ~/Dev/vsgExamples/data/shaders/assimp_pbr.frag assimp_pbr_frag.cpp --nc

The --nc is a 'no shader compilation' hint, so that the #pragma(tic) shader composition is able to work.

This month my plan is to unify these shaders, so the local ones in vsgXchange::Assimp won't be needed.

Cheers,
Robert.

Gareth Francis

Feb 5, 2022, 11:45:31 AM
to vsg-...@googlegroups.com
Gotcha, I think a move to OpenXR would need a rework of my render pass setup anyhow. With OpenVR we render to images/textures and hand off to the runtime, but it looks like OpenXR provides a swapchain for each display (eye).
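
From the OpenXR spec, creating one of those per-view swapchains looks roughly like this (a sketch; viewConfig would come from xrEnumerateViewConfigurationViews):

```
XrSwapchainCreateInfo info{XR_TYPE_SWAPCHAIN_CREATE_INFO};
info.usageFlags = XR_SWAPCHAIN_USAGE_COLOR_ATTACHMENT_BIT | XR_SWAPCHAIN_USAGE_SAMPLED_BIT;
info.format = VK_FORMAT_R8G8B8A8_SRGB; // must be a format the runtime enumerates
info.sampleCount = 1;
info.width = viewConfig.recommendedImageRectWidth;
info.height = viewConfig.recommendedImageRectHeight;
info.faceCount = 1;
info.arraySize = 1;
info.mipCount = 1;

XrSwapchain swapchain{XR_NULL_HANDLE};
xrCreateSwapchain(session, &info, &swapchain);
// Per frame it's xrAcquireSwapchainImage / xrWaitSwapchainImage /
// xrReleaseSwapchainImage, rather than the desktop acquire/present calls.
```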

My current setup is a viewer/window for the HMD plus a render graph going into some images... A 'viewer' is probably a sensible abstraction of an HMD, but then there are also 'tracked' objects, input bindings, etc. to think about.

As for Linux VR recommendations - I'm no expert these days, but the original Vive or Valve Index are the go-to options; in theory any SteamVR headset works.
Do have a look at the SteamVR/Linux known issues though - while it works, I've found it to be quirky, doubly so on laptops/systems with hybrid graphics etc. There are some features that work on Windows but not Linux, and various ways the setup can just not recognise the headset.

(No rush at all from me on this, just playing around with things, and don't have much time for projects)


--
You received this message because you are subscribed to a topic in the Google Groups "vsg-users : VulkanSceneGraph Developer Discussion Group" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/vsg-users/BAXxwuPpbgc/unsubscribe.
To unsubscribe from this group and all its topics, send an email to vsg-users+...@googlegroups.com.
To view this discussion on the web, visit https://groups.google.com/d/msgid/vsg-users/CAFN7Y%2BWio6uq617Mo4g35aAEQwaFfmftH9wVM6HkNVcp_47vqQ%40mail.gmail.com.

Tim Moore

Feb 7, 2022, 6:24:58 AM
to vsg-...@googlegroups.com
On Sat, Feb 5, 2022 at 3:17 PM Gareth Francis <gfranc...@gmail.com> wrote:

- Updating to reflect the inverted depth buffer - vsgvr will likely only work with the default setup, so may need to revisit this later
I'm not an expert on OpenXR, but I see that Microsoft recommends using reverse depth in their "OpenXR app best practices" (https://docs.microsoft.com/en-us/windows/mixed-reality/develop/native/openxr-best-practices), so it should be possible to use a reverse-mapped depth buffer directly.

Tim

Gareth Francis

May 13, 2022, 2:19:18 PM
to vsg-users : VulkanSceneGraph Developer Discussion Group

I've just updated vsgvr to work with the latest vsg, and updated my test models via vsgXchange.
I didn't have any issues, but there might be something off with the lighting positions or spotlight angles - unsure.

The OpenVR implementation is still the default - I made a few fixes, but otherwise there are no functional changes there.

For the OpenXR implementation I've hit a bit of a wall - I keep managing to crash SteamVR, with very little debug information available. Bisecting indicates it's related to my Vulkan initialisation, but nothing more specific than that.
I think what's happening is that I was originally trying to re-use the Vulkan setup in vsg, but I'll actually need to initialise Vulkan through OpenXR (XR_KHR_vulkan_enable2) and attach the scene graph to that.

I'll probably need to re-implement some of the vsg::vk classes, implementing things like an XRVulkanInstance and XRPhysicalDevice (OpenXR requires a specific device, so things like the device selection logic would be bypassed).
Then with those I should be able to complete an XRViewer or XRWindow class, which can be used in roughly the same way as the desktop equivalents, and as I did the OpenVR viewer.
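
For reference, the XR_KHR_vulkan_enable2 path looks roughly like this (a sketch from the OpenXR spec; xrInstance and systemId are assumed to already exist, error handling is omitted, and openxr_platform.h with XR_USE_GRAPHICS_API_VULKAN is required):

```
// The extension entry points are loaded via xrGetInstanceProcAddr
PFN_xrCreateVulkanInstanceKHR pfnCreateVulkanInstance = nullptr;
xrGetInstanceProcAddr(xrInstance, "xrCreateVulkanInstanceKHR",
                      reinterpret_cast<PFN_xrVoidFunction*>(&pfnCreateVulkanInstance));

VkInstanceCreateInfo vkInfo{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
// ... the usual application info / layers / extensions ...

XrVulkanInstanceCreateInfoKHR xrVkInfo{XR_TYPE_VULKAN_INSTANCE_CREATE_INFO_KHR};
xrVkInfo.systemId = systemId;
xrVkInfo.pfnGetInstanceProcAddr = vkGetInstanceProcAddr;
xrVkInfo.vulkanCreateInfo = &vkInfo;

VkInstance vkInstance = VK_NULL_HANDLE;
VkResult vkResult = VK_SUCCESS;
pfnCreateVulkanInstance(xrInstance, &xrVkInfo, &vkInstance, &vkResult);
// xrGetVulkanGraphicsDevice2KHR then selects the required VkPhysicalDevice
```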

The overall status is that the OpenVR path is usable as a limited prototype, with the OpenXR rewrite progressing slowly as time permits. It's tricky, but mainly just a steep learning curve.

Gareth Francis

May 15, 2022, 7:46:34 AM
to vsg-users : VulkanSceneGraph Developer Discussion Group

A bit more OpenXR prototyping today; I'll need to change approach somewhat I think, but the ideas are slowly forming.

A point for the core VSG - when rendering to OpenXR, and maybe for offscreen rendering, there's no VkSurface, and the swapchain/presentation approach is different (wait for OpenXR image, update, render, hand off frames, etc.).
For vsgvr that mostly means I can't directly inherit from the Viewer classes, but perhaps the concept of 'a viewer' will need to be more generic at some point in the future.

As it stands I'm aiming for a vsg::Viewer equivalent that's similar enough to port applications easily, but is quite different internally - it handles the XR render loop itself, plus action sets with some interface to the vsg input system.

(That's it for now, hoping for progress in a few weeks)

Robert Osfield

May 15, 2022, 10:25:01 AM
to vsg-...@googlegroups.com
Thanks for the update Gareth.

In principle one should be able to do HMD work using vsg::Viewer, but... and it's a big but... VR APIs don't always make this easy, because they like to control more than they really need to/should.

So, if it works better to do without vsg::Viewer in order to fit the VR APIs' way of doing things, then so be it. vsg::Viewer isn't meant to be the only way to set up rendering, just one way that covers the most common usage cases.

The rendering in the VSG is done via RecordAndSubmitTask/CommandGraph/RenderGraph; these are mostly set up for you by vsg::Viewer for convenience, but there shouldn't be any reason why they can't be used with another type of viewer/application-level class.

In an ideal world I'd like VR APIs to just provide specs for the hardware: the position/attitude of the HMD, the direction the eyes are looking, the position/attitude of the controllers, etc., then let the rendering API decide how to create windows, do distortion correction, etc.

If this were possible then it would be relatively straightforward to create a vsg::Viewer configuration for an HMD.

Robert Osfield

May 16, 2022, 5:10:57 AM
to vsg-users : VulkanSceneGraph Developer Discussion Group
Hi Gareth,

To help others find vsgvr I've added it to the list of companion projects in the VulkanSceneGraph/README.md.

Cheers,
Robert.

Gareth Francis

Jun 9, 2022, 7:12:18 AM
to vsg-users : VulkanSceneGraph Developer Discussion Group

I've spent a decent chunk of the week on OpenXR integration, and this morning reached a milestone - the OpenXR code path is now on par with the initial OpenVR code path, with the exception of a desktop mirror window.
In the end a lot of rework was needed, and OpenXR does things quite differently. As such, I'm dropping the original OpenVR support (preserved in the openvr-old branch, but I don't see any reason to use it).

The interface is somewhat similar to the vsg::Viewer / Window classes; for now there's quite a lot I just copied over to the vsgvr::OpenXRViewer class, which will need some rework later on.
See https://github.com/geefr/vsgvr/blob/main/example_vr.cpp for a rundown of the basics. If anyone wants to play, feel free - I'm reasonably confident it'll work on other headsets/runtimes, but I don't have the hardware to test.

In terms of how the vsgvr::Viewer works - at the moment it's quite heavily tied into the OpenXR lifecycle, and I think it has to be to an extent. That also extends to the application-level render loop, as there are cases where the render loop needs to happen but no rendering should be performed (or the runtime asks the application to go to sleep).
Later on I'm sure a certain amount of the viewer logic could be moved out - in the end it boils down to making vsg render to a series of ImageViews, with some different functions to call when it comes to presentation.
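
For context, that lifecycle is roughly the following (standard OpenXR calls; the loop must keep running even when frameState.shouldRender is false):

```
XrFrameState frameState{XR_TYPE_FRAME_STATE};
XrFrameWaitInfo waitInfo{XR_TYPE_FRAME_WAIT_INFO};
xrWaitFrame(session, &waitInfo, &frameState);

XrFrameBeginInfo beginInfo{XR_TYPE_FRAME_BEGIN_INFO};
xrBeginFrame(session, &beginInfo);

std::vector<XrCompositionLayerBaseHeader*> layers;
if (frameState.shouldRender)
{
    // record & submit the vsg command graphs into the acquired swapchain
    // images, then append a projection layer for the compositor
}

XrFrameEndInfo endInfo{XR_TYPE_FRAME_END_INFO};
endInfo.displayTime = frameState.predictedDisplayTime;
endInfo.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE;
endInfo.layerCount = static_cast<uint32_t>(layers.size());
endInfo.layers = layers.data();
xrEndFrame(session, &endInfo);
```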

Working:
* Presentation to a VR headset, tested so far with an HTC Vive in SteamVR, on Windows and Ubuntu 20.04, with an Nvidia RTX 3060
* Theoretically presentation to a single display / mobile-type setup should work - might be fairly easy to get an Android build going, once base vsg issues are resolved
* View tracking - There's an awkward matrix rotate/invert in there, but it's tracking well, better performance than it used to be
* Controller / XR device tracking - I've added initial support for tracking OpenXR controllers, headsets, etc, essentially a direct mapping of the OpenXR action system

Not working / next steps:
* Lots of bugfixes / cleanups, likely class renames
* Input and output actions (buttons, haptics, etc)
* Ability to move the 'player' - At the moment you're locked to {0,0,0}
* Example actions for moving / teleportation
* Desktop mirror window

And a small test video in the Sponza glTF model - essentially the same as the old functionality, but this time through OpenXR:
https://youtu.be/0DY0IeodHyU

Robert Osfield

Jun 9, 2022, 7:46:56 AM
to vsg-...@googlegroups.com
Congrats on reaching your milestone. I'm still busy with the DynamicSceneGraph work so can't yet do a deeper review of the work or attempt to test it out. I did a quick browse of the GitHub repo and came away with the sense that there may be further opportunities on the VSG side to make replicating vsg::Viewer less essential.

The way things are evolving, it feels like OpenXR wants to dominate how the application is managed, rather than just taking on specific tasks and then getting out of the way. This will make it awkward for app developers who have an application that normally works as a desktop app, or an IG, but has an option for rendering to a VR device.

For me, my ideal would be that users could create their applications largely agnostic of the display and input devices, then configure the viewer depending upon the display target - be that a single desktop window, full screen, multiple displays and graphics cards, a VR device, or a mixture of all of these. In this idealized world the VR support would be a RenderGraph/View(s) graph that you add to the viewer to render to the HMD device. For instance, we have a vsg::createRenderGraphForView(..) method that helps generate the required setup to render a view; might it be possible to have a vsgXR::createRenderGraphForHMD(...) that can be dropped in in place of the usual vsg::createRenderGraphForView() call?

I don't know OpenXR so can't say how possible it might be to keep things as decoupled as this - from your description it sounds like this isn't possible, or that making it possible is not obvious. I'm open to changes to core VSG to make it easier to extend/integrate.

--

As an aside, is it now possible to use modern HMDs under Linux again? If so, which devices are working under Linux?

Cheers,
Robert.

Gareth Francis

Jun 9, 2022, 8:38:46 AM
to vsg-users : VulkanSceneGraph Developer Discussion Group

I'd largely agree - there's definitely scope for rework and simplification; I don't think it would be easy to do, but there are certainly common concepts to build on. Possibly the first thing is to abstract out the VkSurface, Swapchain, and Framebuffer setup in vsg, as there's no guarantee that the display will follow desktop conventions. (Or perhaps rendering into ImageViews as part of a larger application would be similar to rendering to a headset.)

Certainly the intention of OpenXR is to be tightly involved with the application lifecycle, but it could be abstracted away, and OpenXR is very flexible - At the moment I've exposed the lifecycle to the application, but if aspects like session management and the runtime going to sleep were hidden the application could just have the main render loop of [poll events, advance frame, render, present].
I have effectively got a createRenderGraphForHmd function in there somewhere, and it's mostly the same as createRenderGraphForView. The differences are all down to the swapchain/framebuffer setup, as with OpenXR there's a different approach to the 'present' part of the render loop.

The input and interaction side of things is very different however, so as a minimum an application would need to know that the rendering is to XR hardware, rather than the user clicking on UI buttons / etc.
The plan so far is that I'll make it possible to bind vsg nodes to tracked spaces (controllers, headset, body trackers, etc.), and hopefully be able to forward input/output actions to the event system. Fortunately the action system in OpenXR is very flexible, so past knowing that the input button is '/user/hand/right/button_1' or similar, I think that could be abstracted fairly well, and a default set of XR actions -> vsg events could be provided (e.g. Button 1 pressed, with an associated MatrixTransform carrying the pose information).
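
As a flavour of what that looks like on the OpenXR side (standard action-set calls; the forwarding into vsg events at the end is the hypothetical part):

```
#include <cstring>

XrActionSetCreateInfo setInfo{XR_TYPE_ACTION_SET_CREATE_INFO};
std::strcpy(setInfo.actionSetName, "gameplay");
std::strcpy(setInfo.localizedActionSetName, "Gameplay");
XrActionSet actionSet{XR_NULL_HANDLE};
xrCreateActionSet(instance, &setInfo, &actionSet);

XrActionCreateInfo actionInfo{XR_TYPE_ACTION_CREATE_INFO};
actionInfo.actionType = XR_ACTION_TYPE_BOOLEAN_INPUT;
std::strcpy(actionInfo.actionName, "select");
std::strcpy(actionInfo.localizedActionName, "Select");
XrAction selectAction{XR_NULL_HANDLE};
xrCreateAction(actionSet, &actionInfo, &selectAction);

// Bindings tie the action to concrete paths (e.g.
// /user/hand/right/input/trigger/click) via
// xrSuggestInteractionProfileBindings; then each frame, after xrSyncActions:
XrActionStateGetInfo getInfo{XR_TYPE_ACTION_STATE_GET_INFO};
getInfo.action = selectAction;
XrActionStateBoolean state{XR_TYPE_ACTION_STATE_BOOLEAN};
xrGetActionStateBoolean(session, &getInfo, &state);
if (state.changedSinceLastSync && state.currentState)
{
    // forward as a vsg event, along with the pose/MatrixTransform
}
```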

Overall I think it's very possible to integrate better, but working it out would take quite a lot of effort - best guess would be weeks rather than days, if working on it full time.
Some fairly extensive changes would be needed to vsg, but I couldn't say what they are yet, past the observation that things currently assume the presence of a Surface and rendering to a desktop swapchain.
Assuming that rework isn't desired for vsg 1.0, it's probably best to consider vsgvr a VR system which uses vsg, and I'll try to steer things towards a better integration as I go.

---
In short yes - it's not quite as good as the Windows experience, but there are options.
SteamVR supports the original HTC Vive and Valve Index; I think a few of the other HTC headsets work too, but I'm not sure.
There are a few people doing VR gaming on Linux, so this is probably the safest bet - the Index itself is a high-end headset, and Valve are somewhat responsive to Linux-specific bugs.
The overall experience is roughly plug and play, assuming you have a Steam account and space to set up the base stations.

It looks like Monado is the open source equivalent, but I haven't looked into it yet. A quick glance at their feature list shows a wider variety of potential hardware, but it looks like early levels of support so far.
It's probably a fair assumption that anything from Oculus (Facebook/Meta) or Windows Mixed Reality won't work on Linux - I think Monado are working on the WMR / HoloLens headsets though; those seem somewhat popular for industrial use cases.

Gareth Francis

Jun 10, 2022, 11:31:46 AM
to vsg-users : VulkanSceneGraph Developer Discussion Group

A quick note on the testing front: I took a look at Monado on a spare laptop (Intel i915, Debian 11), and it looks pretty great as a dev/test environment. No other updates for today, but using Monado did identify a bug in my render loop.

Setup was the following (roughly):
* `apt install libopenxr-loader1 libopenxr-dev libopenxr1-monado xr-hardware libopenxr-utils openxr-layer-corevalidation openxr-layer-apidump monado-cli monado-gui`
* Installed the Vulkan libs, built vsg and vsgvr
* Started Monado with the script below
* Started the vsgvr example

```
export QWERTY_ENABLE=1
export OXR_DEBUG_GUI=1
export XRT_COMPOSITOR_FORCE_XCB=1
monado-service
```

[Screenshot: the vsgvr example running under Monado]

Robert Osfield

Jun 10, 2022, 12:13:44 PM
to vsg-...@googlegroups.com
Thanks for the Monado testing and suggestions, looks like a solid way for me to try it out.

Cheers,
Robert.