vglserver_config setting missing after reboot

Roland S

Feb 15, 2024, 9:26:15 AM
to VirtualGL User Discussion/Support
Hi,

New server.
RedHat 8.9, VirtualGL 3.1, NVIDIA A2 with driver 545.23.08
Running XFCE desktop in ThinLinc VNC manager.

Running

    init 3
    /opt/VirtualGL/bin/vglserver_config +egl +s +f
    init 5

then

    vglrun /opt/VirtualGL/bin/glxspheres64

works fine, about 600 FPS.
After a reboot, vglrun doesn't work any more.

So the vglserver_config setting does not survive a reboot.
What's missing?

- Roland

DRC

Feb 15, 2024, 9:36:00 AM
to virtual...@googlegroups.com

'vglserver_config +egl' configures the system so that it can only use VirtualGL's EGL back end.  VirtualGL tries to use the GLX back end with a 3D X server by default.  That probably worked the first time because you were logged into the machine locally, and Display :0 was available, but it became unavailable when you rebooted.  If you only need to use VirtualGL's EGL back end, then pass '-d egl' or '-d {path_to_DRI_device}' to vglrun or set the VGL_DISPLAY environment variable to 'egl' or the path of a DRI device.  If you need to use both the GLX and EGL back ends, then run 'vglserver_config +glx' to configure the system.
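For example (the DRI device path below is only illustrative; the actual device varies by system):

    vglrun -d egl /opt/VirtualGL/bin/glxspheres64
    # or, equivalently, via the environment:
    VGL_DISPLAY=egl vglrun /opt/VirtualGL/bin/glxspheres64
    # or point at a specific DRI device (path is system-dependent):
    VGL_DISPLAY=/dev/dri/card0 vglrun /opt/VirtualGL/bin/glxspheres64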


Roland S

Feb 16, 2024, 2:35:23 AM
to VirtualGL User Discussion/Support
Hi,

First, thanks for your help :-)
Second, I'm just a Unix admin who installs this for the users; I don't know that much about VirtualGL.
This is a multi-user system, so I need to set this globally for all users on the system (hundreds of them).

I think this has changed since VGL 2.x, because it works just fine there.
In 2.x I configure it for GLX and get about 600 FPS with glxspheres.
On this system with 3.1.1 I get 50 FPS with GLX and 600 FPS with EGL (running glxspheres).
I have no idea what causes the difference.
So the only reason I set it to EGL was the better FPS; that might be wrong... or wrong usage of VGL?

In 2.x I just run

    /opt/VirtualGL/bin/vglserver_config +glx +s +f

and then it works for all users automagically.

On this system, if I configure it for GLX, it only works if I manually set VGL_DISPLAY=$DISPLAY.
By default it tries to use display :0.0, but since this is a VNC system my DISPLAY might be :11.0, and setting VGL_DISPLAY works around that.
With GLX I get 50 FPS.

If I configure it for EGL, it works fine without setting VGL_DISPLAY, and I get 600 FPS.
But rebooting changes it back to GLX?

I would like a setting where I don't need to tell hundreds of users to change their setup and set VGL_DISPLAY.
If that is the only solution, we'll just have to...
Telling them all to start using 'vglrun -d egl' is also a hassle, as there are hundreds of scripts that would need to be changed.
I could write a wrapper around vglrun that adds '-d egl', but that is not a good solution...
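(If it comes to that, I suppose I could push a system-wide default instead, e.g. a profile.d snippet like the sketch below -- though that only covers login shells, and the file name here is just hypothetical:)

    # /etc/profile.d/vgl.sh -- hypothetical system-wide default, login shells only
    export VGL_DISPLAY=egl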

Any inputs appreciated :-)

Cheers,
Roland

DRC

Feb 16, 2024, 9:42:38 AM
to virtual...@googlegroups.com

Let me back up, define some terms, and give you my "elevator spiel", which may help you understand how VirtualGL is supposed to work.

Traditionally, Un*x OpenGL applications used the GLX API to connect OpenGL rendering with the X server.  The GLX API allows Un*x OpenGL applications to choose an X server visual with specific OpenGL rendering attributes, create an OpenGL context with that visual, create an OpenGL pixel buffer (Pbuffer) for off-screen rendering (in GPU memory), activate an OpenGL context in the current thread with a specific drawable (X window, X pixmap, or Pbuffer), swap the front and back render buffers, etc.  The more modern EGL API does all of that, and it can be used with multiple types of display systems (X11, Wayland, off-screen rendering directly to a DRI device, etc.)  Thus, many Un*x OpenGL applications are moving from the GLX API to the EGL API.

VirtualGL's basic job is to redirect OpenGL rendering away from the X server specified in DISPLAY, because we assume that that X server is either:

(a) remote, in which case sending OpenGL commands and data over a remote X connection has numerous problems, including lack of performance and lack of compatibility with OpenGL 1.5 and later

or

(b) using software (unaccelerated) OpenGL, which is the case with Xvnc and other "X proxies."  (An X proxy is a virtual X server that maintains a framebuffer in main memory and compresses/transports images from that framebuffer to remote clients.)

VirtualGL intercepts either GLX or EGL/X11 function calls from Un*x OpenGL applications using, respectively, its GLX and EGL/X11 "front ends" (AKA "interposers.")  VirtualGL emulates those function calls in such a way that the calling application thinks that it is rendering into an OpenGL window, when in fact it is rendering into a Pbuffer in GPU memory that VirtualGL manages for each OpenGL window.  Because OpenGL rendering is redirected into GPU-resident off-screen Pbuffers, multiple users can simultaneously share a single GPU (whereas on-screen rendering is inherently a single-user affair, since there is only one screen.)  Because Xvnc and other X proxies can't use GPU memory for their framebuffers, all 3D rendering that occurs in those X proxies is unaccelerated by default.  VirtualGL works around that limitation by redirecting OpenGL rendering away from the X proxy and into a Pbuffer in GPU memory, then reading back the OpenGL-rendered pixels and compositing them into the application's X window in the X proxy.  That effectively creates a multi-user multi-session GPU-accelerated remote desktop environment.
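(Mechanically, vglrun just sets up the environment so that the interposer libraries are preloaded before the application starts.  The line below is a simplified sketch of the idea, not the actual script, and 'my_opengl_app' is only a placeholder:)

    # Simplified sketch of what 'vglrun my_opengl_app' amounts to; the real script
    # also handles 32-/64-bit library paths, '-d', and other options.
    LD_PRELOAD=libvglfaker.so:libdlfaker.so ./my_opengl_app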

In order to create and manage GPU-resident Pbuffers, VirtualGL needs to access a GPU.  VirtualGL's "GLX back end" accesses the GPU through an X server attached to the GPU (the "3D X server") using the GLX API, whereas VirtualGL's "EGL back end" accesses the GPU directly through a DRI device.  The GLX front end can use either the GLX or the EGL back end.  The EGL/X11 front end can only use the EGL back end.

VGL_DISPLAY should never point to the VNC server.  VGL_DISPLAY specifies where 3D rendering should be redirected to, so it should either point to a 3D X server (to use the GLX back end) or a DRI device (to use the EGL back end.)  That's why you are only getting 50 fps.  By pointing VGL_DISPLAY to the VNC server's display, you are redirecting OpenGL rendering from the VNC server back to the VNC server, which does nothing but introduce overhead without actually enabling GPU acceleration.
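In other words (the display numbers below are just examples):

    # Inside the VNC session, where DISPLAY might be something like :11.0:
    VGL_DISPLAY=:0.0 vglrun /opt/VirtualGL/bin/glxspheres64   # GLX back end: redirect to the 3D X server
    VGL_DISPLAY=egl vglrun /opt/VirtualGL/bin/glxspheres64    # EGL back end: redirect to a DRI device
    # Not this -- it redirects OpenGL back into the VNC server and only adds overhead:
    # VGL_DISPLAY=$DISPLAY vglrun /opt/VirtualGL/bin/glxspheres64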

    vglserver_config +glx +s +f

should work the same way with VGL 3.1.x as it did with VGL 2.x.  The default value of VGL_DISPLAY (:0.0) should still be correct for that configuration.  However, be aware that modern versions of GDM use a different X server for the greeter (the login screen) and the session.  If GDM is sitting at the login prompt, the X server it uses will be listening on Display :0.  When you log in, the X server it uses will be listening on Display :1.  Thus, you can't log in locally while people are using VirtualGL on the system.  You have to leave the display manager sitting at the login prompt.  (That has always been the case, actually.  Even in the pre-Wayland days when the display manager always used the same X server for the greeter and the session, it still usually reset the X server when users logged out, which caused any applications running with VirtualGL to abort.)
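If you want to sanity-check which display the 3D X server is listening on, something along these lines should work (assuming the glxinfo utility bundled with VirtualGL is installed and your account is allowed to access the 3D X server):

    /opt/VirtualGL/bin/glxinfo -display :0 -c
    # If someone is logged in locally, the session's X server may be on :1 instead,
    # per the greeter/session distinction described above.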

The changes that vglserver_config makes to the system are persistent and should survive a reboot.

If any of that doesn't work as I described it, then please let me know specifically what isn't working.

Note that, when the system is configured to use the GLX back end, you usually won't have to specify '-d egl', but you will have to specify '-d egl' when using applications that support only the EGL/X11 API, since the EGL/X11 front end can only use the EGL back end.  (Note to self-- maybe VirtualGL should automatically enable the EGL back end when using the EGL/X11 front end, since that is the only valid combination.)
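For example ('some_eglx11_app' is just a placeholder for an application that uses only the EGL/X11 API):

    vglrun -d egl ./some_eglx11_app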

I eventually hope to enable the EGL back end by default, thus deprecating the use of 3D X servers, but right now there are still some compatibility issues with VirtualGL's GLX-to-EGL emulation.
