I'd largely agree; there's definitely scope for rework and simplification. I don't think it would be easy to do, but there are certainly common concepts to build on. Possibly the first thing is to abstract out the VkSurface, swapchain, and framebuffer setup in vsg, as there's no guarantee that the display will follow desktop conventions. (Or perhaps rendering into ImageViews as part of a larger application, which would be similar to rendering to a headset.)
Certainly the intention of OpenXR is to be tightly involved with the application lifecycle, but it could be abstracted away, and OpenXR is very flexible. At the moment I've exposed the lifecycle to the application, but if aspects like session management and the runtime going to sleep were hidden, the application could just run a main render loop of [poll events, advance frame, render, present].
I effectively have a createRenderGraphForHmd function in there somewhere, and it's mostly the same as createRenderGraphForView. The differences all come down to the swapchain/framebuffer setup, as OpenXR takes a different approach to the 'present' part of the render loop.
The input and interaction side of things is very different, however, so at a minimum an application would need to know that it's rendering to XR hardware, rather than responding to the user clicking on UI buttons and the like.
The plan there so far is to make it possible to bind vsg nodes to tracked spaces (controllers, headset, body trackers, etc.), and hopefully to forward input/output actions to the event system. Fortunately the action system in OpenXR is very flexible, so beyond knowing that the input button is '/user/hand/right/button_1' or similar, I think that could be abstracted fairly well, and a default set of XR action -> vsg event mappings could be provided (e.g. Button 1 pressed, with an associated MatrixTransform carrying the pose information).
Overall I think much better integration is very possible, but working it out would take quite a lot of effort; my best guess would be weeks rather than days if working on it full time.
Some fairly extensive changes would be needed in vsg, but I couldn't say what they are yet, beyond the observation that things currently assume the presence of a Surface and rendering to a desktop swapchain.
Assuming that rework isn't desired for vsg 1.0, it's probably best to consider vsgvr a VR system which uses vsg, and I'll try to steer things towards better integration as I go.
---
In short, yes; it's not quite as good as the Windows experience, but there are options.
SteamVR - Supports the original HTC Vive and Valve Index; I think a few of the other HTC headsets work too, but I'm not sure.
There are a few people doing VR gaming on Linux, so this is probably the safest bet; the Index itself is a high-end headset, and Valve are somewhat responsive to Linux-specific bugs.
The overall experience is roughly plug-and-play, assuming you have a Steam account and space to set up the base stations.
It looks like Monado is the open-source equivalent, but I haven't looked into it yet. A quick glance at their feature list shows a wider variety of potential hardware, but support looks fairly early-stage so far.
It's probably a fair assumption that anything from Oculus (Facebook/Meta) or Windows Mixed Reality won't work on Linux. I think Monado are working on the WMR / HoloLens headsets though; those seem somewhat popular for industrial use cases.