OpenXR interop (how to forward native handles?)


Mick Pearson

May 26, 2021, 1:36:13 PM
to angleproject
I'm exploring using ANGLE but am a little unsure about one requirement I have for it: OpenXR. I'm new to OpenXR itself, so I don't know offhand whether ANGLE is fit for it or not. I'd appreciate advice about this. One thing I do know is that OpenXR needs access to native handles for the "device" and the "swap chain" (2D textures or buffers).

Poking around ANGLE, I'm unsure if it was designed with this in mind. There are some D3D "extensions" that look like they could do the trick, but I don't see anything generic, and I'd prefer opaque pointers, etc., so as not to need to include headers for, or link against, the underlying API.

It did occur to me that the other backends (besides D3D) might be able to query their "current" context to get these things. It doesn't look like ANGLE itself has a public API for this, although it seems to have something called Features under include/platform. I wonder if the right idea is just to write code against ANGLE's src folder? What do other projects do?

Geoff Lang

May 31, 2021, 11:58:20 AM
to ho...@swordofmoonlight.net, angleproject
ANGLE doesn't have any OpenXR integration. We prefer to provide extensions exposing the low-level primitives, so unless someone has written a wrapper, you will have to integrate at that level.

If there is any platform/API you're unsure of how to integrate, we can probably point you to the right extension.
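For example, on ANGLE's D3D11 backend the native device can be pulled out through the EGL_EXT_device_query and EGL_ANGLE_device_d3d extensions, which is the kind of "low-level primitive" extension meant above. This is a minimal sketch (error handling trimmed, and it assumes both extensions are advertised by the display):

```cpp
// Query the ID3D11Device* that ANGLE is rendering with, so it can be
// handed to another API such as OpenXR. Assumes EGL_EXT_device_query
// and EGL_ANGLE_device_d3d are supported.
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <d3d11.h>

ID3D11Device *GetAngleD3D11Device(EGLDisplay display)
{
    auto eglQueryDisplayAttribEXT = (PFNEGLQUERYDISPLAYATTRIBEXTPROC)
        eglGetProcAddress("eglQueryDisplayAttribEXT");
    auto eglQueryDeviceAttribEXT = (PFNEGLQUERYDEVICEATTRIBEXTPROC)
        eglGetProcAddress("eglQueryDeviceAttribEXT");
    if (!eglQueryDisplayAttribEXT || !eglQueryDeviceAttribEXT)
        return nullptr;

    // EGL_EXT_device_query: fetch the EGLDeviceEXT behind the display.
    EGLAttrib device = 0;
    if (!eglQueryDisplayAttribEXT(display, EGL_DEVICE_EXT, &device))
        return nullptr;

    // EGL_ANGLE_device_d3d: the device exposes the native ID3D11Device*.
    EGLAttrib d3dDevice = 0;
    if (!eglQueryDeviceAttribEXT((EGLDeviceEXT)device,
                                 EGL_D3D11_DEVICE_ANGLE, &d3dDevice))
        return nullptr;

    return (ID3D11Device *)d3dDevice;
}
```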


Mick Pearson

May 31, 2021, 9:03:17 PM
to angleproject
If you're curious or don't know, there's an example of OpenGL+OpenXR here (https://github.com/ReliaSolve/OpenXR-OpenGL-Example/blob/main/main.cpp) that I was just looking at. From what I know so far, OpenXR begins with an XrSessionCreateInfo structure that is tailored to the underlying graphics API; for Direct3D it looks like it takes a device pointer. In the OpenGL case it looks like it takes the arguments for creating its own context, bypassing WGL/EGL, etc., to my surprise. (In that case ANGLE would have to piggyback onto that context.) Direct3D seems simpler in this regard.

The second step I know of is xrEnumerateSwapchainImages. This pulls back native buffers that OpenXR creates by itself, so ANGLE would be unlikely to know about these handles and would have to patch them into its internal representation. It's possible it could function ignorant of this (I'm speculating). The next step, per frame, is xrAcquireSwapchainImage (taking an XrSwapchainImageAcquireInfo), which yields a swap chain index into those buffers; the index is really used to look up an XrSwapchainImageOpenGLKHR or XrSwapchainImageD3D11KHR. These have handles that may be identical to the ones from xrEnumerateSwapchainImages (I don't know) that apps draw themselves into.
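For the Direct3D flavor of the two steps just described, the flow looks roughly like this (a sketch only, with error handling omitted; the device pointer is whatever ID3D11Device the app renders with):

```cpp
// Sketch: hand an ID3D11Device to OpenXR via the session create-info
// next chain, then enumerate the runtime-created swapchain textures.
#include <d3d11.h>
#define XR_USE_GRAPHICS_API_D3D11
#include <openxr/openxr.h>
#include <openxr/openxr_platform.h>
#include <vector>

XrSession CreateSessionD3D11(XrInstance instance, XrSystemId systemId,
                             ID3D11Device *device)
{
    // The graphics binding rides on the XrSessionCreateInfo next chain.
    XrGraphicsBindingD3D11KHR binding{XR_TYPE_GRAPHICS_BINDING_D3D11_KHR};
    binding.device = device;

    XrSessionCreateInfo info{XR_TYPE_SESSION_CREATE_INFO};
    info.next = &binding;
    info.systemId = systemId;

    XrSession session = XR_NULL_HANDLE;
    xrCreateSession(instance, &info, &session);
    return session;
}

std::vector<ID3D11Texture2D *> EnumerateSwapchainTextures(XrSwapchain swapchain)
{
    // Two-call idiom: first get the count, then fill the typed array.
    uint32_t count = 0;
    xrEnumerateSwapchainImages(swapchain, 0, &count, nullptr);

    std::vector<XrSwapchainImageD3D11KHR> images(
        count, {XR_TYPE_SWAPCHAIN_IMAGE_D3D11_KHR});
    xrEnumerateSwapchainImages(swapchain, count, &count,
                               (XrSwapchainImageBaseHeader *)images.data());

    // These are the runtime-owned native textures the app draws into.
    std::vector<ID3D11Texture2D *> textures;
    for (auto &img : images)
        textures.push_back(img.texture);
    return textures;
}
```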

In the OpenGL example this handle is passed to glFramebufferTexture2D, and a depth buffer is allocated independently by the app, which also goes through glFramebufferTexture2D. In the Direct3D case the interface pointer is passed to CreateRenderTargetView.
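The per-frame OpenGL path in that example boils down to the following acquire/attach/release loop (a sketch; it assumes `images` holds the previously enumerated XrSwapchainImageOpenGLKHR array, and `fbo`/`depthTex` are the app-created framebuffer and depth texture):

```cpp
// Acquire an index into the swapchain, wait until the runtime is done
// with that image, attach it to the app's FBO, draw, then release.
uint32_t index = 0;
XrSwapchainImageAcquireInfo acquireInfo{XR_TYPE_SWAPCHAIN_IMAGE_ACQUIRE_INFO};
xrAcquireSwapchainImage(swapchain, &acquireInfo, &index);

XrSwapchainImageWaitInfo waitInfo{XR_TYPE_SWAPCHAIN_IMAGE_WAIT_INFO};
waitInfo.timeout = XR_INFINITE_DURATION;
xrWaitSwapchainImage(swapchain, &waitInfo);

glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, images[index].image, 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, depthTex, 0);  // app-allocated depth

// ... render the view here ...

XrSwapchainImageReleaseInfo releaseInfo{XR_TYPE_SWAPCHAIN_IMAGE_RELEASE_INFO};
xrReleaseSwapchainImage(swapchain, &releaseInfo);
```

An ANGLE integration would need these GL calls to go through ANGLE's context, which is exactly where the native-handle mismatch described above comes in.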

[Personally I'm a little skeptical of OpenXR, though this could be premature on my part, being inexperienced with it. I feel strong-armed into using it. I have a feeling its specification is too controlling and too expensive (compromising) on performance. I worry some screen effects won't work, and that vendors may feel they're in a position to dictate screen effects to artistic apps, and that I will at some point find myself trying to use its "headless" extension ("headless" is something of a misnomer for headsets) to bypass its rendering interface. But that won't be possible unless enough information is available for a given device to draw to it correctly, assuming the picture can be sent to the device like a regular display, and hoping that in headless mode "reprojection" doesn't cut out. In other words, I have to program an OpenXR path to start working with high-end sets. There's an open source Monado project for Linux that might have to figure this out too. But I think, apart from rendering, OpenXR will be fine.]