[osg-users] Specifying the GL context version to request


Sandro Mani

Aug 21, 2017, 7:42:20 AM
to osg-...@lists.openscenegraph.org
Hi

To get OpenSceneGraph working with GL3.2+ on GLX, in pull request 302
[1] I've proposed adding support for specifying the GL version, profile
and context flags via the Traits fields

std::string glContextVersion;
unsigned int glContextFlags;
unsigned int glContextProfileMask;

So far these were only honored by GraphicsWindowWin32.

This work has now been merged [2].

There is, however, still one open issue, namely how to properly and
reliably get the context you ask for. There are the methods
setGLContext{Version,Flags,ProfileMask} in DisplaySettings [3], which
seem the appropriate way to do so, but the display settings (and hence
these settings) are only honored by the Traits constructor if a
DisplaySettings instance is explicitly specified [4]. However, as far as
I can see, this is seldom done (just grep for "new
osg::GraphicsContext::Traits" throughout the codebase and you will see
that the traits are often constructed without a DisplaySettings instance).

My suggestion would be to always fall back to the global DisplaySettings
singleton [5] if none is passed to the Traits constructor, as proposed
here [6].

This way, writing once, e.g.

osg::DisplaySettings::instance()->setGLContextVersion("4.0");
osg::DisplaySettings::instance()->setGLContextProfileMask(0x1);

at the beginning of your program will ensure you get the context you
actually asked for.

Any comments on this proposal?

Thanks
Sandro


[1] https://github.com/openscenegraph/OpenSceneGraph/pull/302

[2]
https://github.com/openscenegraph/OpenSceneGraph/commit/8926f0e9c253a1781e6cdbde96ec467324bb105c

[3]
https://github.com/openscenegraph/OpenSceneGraph/blob/master/include/osg/DisplaySettings#L279

[4]
https://github.com/openscenegraph/OpenSceneGraph/blob/master/src/osg/GraphicsContext.cpp#L249

[5]
https://github.com/openscenegraph/OpenSceneGraph/blob/master/include/osg/DisplaySettings#L38

[6]
https://github.com/openscenegraph/OpenSceneGraph/pull/302#issuecomment-323189245

_______________________________________________
osg-users mailing list
osg-...@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org

Robert Osfield

Aug 21, 2017, 8:16:18 AM
to OpenSceneGraph Users
Hi Sandro,

I'm torn on whether to fall back to using DisplaySettings::instance() when a DisplaySettings isn't explicitly specified.  Defaulting to DisplaySettings::instance() has the advantage of letting you force a particular graphics context version and other parameters from one place.  Conversely, it could be a real issue for applications that assume the values will just be the straight defaults, without anything else being picked up implicitly - we could inadvertently break those applications.  Such changes in behaviour could be quite difficult to trace, as the application would compile and link as before but now behave differently.

Given this, we'll need to be really careful about making this the default.  I don't yet have any clear direction on it.  Will need to think some more.

Robert.

Sandro Mani

Aug 21, 2017, 8:21:59 AM
to OpenSceneGraph Users
Hi Robert

A less invasive alternative could be to introduce a new singleton, called OpenGLSettings or similar, which contains just the three settings glContextVersion, glContextFlags and glContextProfileMask.

Best
Sandro

Robert Osfield

Aug 21, 2017, 8:59:57 AM
to OpenSceneGraph Users
Hi Sandro,

On 21 August 2017 at 13:21, Sandro Mani <manis...@gmail.com> wrote:
A less invasive alternative could be to introduce a new singleton called OpenGLSettings or similar which contains just the three settings glContextVersion, glContextFlags, glContextProfileMask.

I don't see how this is different: if you are setting a default that users might not realize is now being set differently, it's just the same issue expressed in a slightly different way.

--

Currently, when the osgViewer::Viewer is constructed and run() is called without the viewer explicitly setting up the graphics context, it falls back to using setUpViewAcrossAllScreens(), and within this implementation, when it sets up the Traits, it passes the DisplaySettings assigned to the Viewer, or DisplaySettings::instance() if none is assigned.  This fallback is functionally the same as what you are after, and it already works.

For places where the windows are being explicitly set up, they will be creating a Traits themselves, and at this point I'm happy for them to need to explicitly specify the settings or pass in the DisplaySettings they want to use as defaults.

--

The only area that I think we might need to tighten up is the defaults when the user explicitly builds the OSG against GL3 without any compatibility with GL1/GL2; here it would make sense for GraphicsContext::Traits::glContextVersion to default to 3.0 rather than the 1.0 it defaults to right now.  One could push this into DisplaySettings's defaults and then force GraphicsContext::Traits to fall back to this by default, but personally I'm not happy with this in case it brings in other defaults that aren't intended.


Robert
Sandro Mani

Aug 21, 2017, 9:36:09 AM
to OpenSceneGraph Users



On 21.08.2017 14:59, Robert Osfield wrote:
Hi Sandro,

On 21 August 2017 at 13:21, Sandro Mani <manis...@gmail.com> wrote:
A less invasive alternative could be to introduce a new singleton called OpenGLSettings or similar which contains just the three settings glContextVersion, glContextFlags, glContextProfileMask.

I don't see how this is different: if you are setting a default that users might not realize is now being set differently, it's just the same issue expressed in a slightly different way.
Mh, yeah, my quick reaction was just that, as opposed to the other fields in the DisplaySettings, the desired OpenGL context version is likely to be constant throughout the lifetime of the application.


--

Currently, when the osgViewer::Viewer is constructed and run() is called without the viewer explicitly setting up the graphics context, it falls back to using setUpViewAcrossAllScreens(), and within this implementation, when it sets up the Traits, it passes the DisplaySettings assigned to the Viewer, or DisplaySettings::instance() if none is assigned.  This fallback is functionally the same as what you are after, and it already works.

For places where the windows are being explicitly set up, they will be creating a Traits themselves, and at this point I'm happy for them to need to explicitly specify the settings or pass in the DisplaySettings they want to use as defaults.
Admittedly, I'm not very familiar with OpenSceneGraph; my main motivation for digging into this is just to get osgEarth working again under Linux, as it has been broken for the past couple of years, ever since it started requiring GLSL 330+.

That said, I found it confusing, and it took me some time to figure out how to actually set glContextVersion etc. The DisplaySettings had no effect, and only after digging into the code did I realize that this was because osgEarth creates its Traits without specifying a DisplaySettings instance [1]. After becoming more familiar with the internal workings, I suppose this can be classified as a bug in osgEarth, but I feel that a user new to OSG could expect the settings specified in the global DisplaySettings singleton to actually have an effect on the Traits if they are constructed without a specific DisplaySettings instance passed in.


Just my thoughts.

Sandro

[1] https://github.com/gwaldron/osgearth/blob/master/src/osgEarth/Capabilities.cpp

Robert Osfield

Aug 21, 2017, 9:47:26 AM
to OpenSceneGraph Users
Hi Sandro,

What hardware and drivers are you using?

Are the osgEarth team aware of the issues you've had?

Robert.

Sandro Mani

Aug 21, 2017, 10:22:00 AM
to OpenSceneGraph Users

Hi Robert


On 21.08.2017 15:47, Robert Osfield wrote:
Hi Sandro,

What hardware and drivers are you using?
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2)
OpenGL core profile version string: 4.5 (Core Profile) Mesa 17.2.0-rc4
OpenGL core profile shading language version string: 4.50

Are the osgEarth team aware of the issues you've had?

I once asked before investigating [1] but no-one reacted, I suppose they are mostly using Windows. Now I have a better understanding of the underlying issues (and indeed of what was missing in OSG itself), but before reporting issues to the osgEarth team I'd like to understand what the correct approach should be (if indeed they are doing something wrong).

Sandro

[1] http://forum.osgearth.org/osgEarth-2-8-on-mesa-intel-td7590671.html

Robert Osfield

Aug 21, 2017, 11:08:56 AM
to OpenSceneGraph Users
Hi Sandro,

On 21 August 2017 at 15:21, Sandro Mani <manis...@gmail.com> wrote:
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2)
OpenGL core profile version string: 4.5 (Core Profile) Mesa 17.2.0-rc4
OpenGL core profile shading language version string: 4.50

Is there a reason why the driver isn't just directly supporting the GL features that osgEarth is using?  Is the Intel/Mesa driver limiting features in some way?

Have you tried on an NVIdia or AMD system?

FYI, I'm using NVidia under Kubuntu 16.04 as my main build system and routinely mix the latest GL features with just creating a normal graphics context.


Are the osgEarth team aware of the issues you've had?

I once asked before investigating [1] but no-one reacted, I suppose they are mostly using Windows. Now I have a better understanding of the underlying issues (and indeed of what was missing in OSG itself), but before reporting issues to the osgEarth team I'd like to understand what the correct approach should be (if indeed they are doing something wrong).

My guess is that they probably haven't used Linux+Intel with the drivers that you are using, so haven't come across the issue.  Real-time graphics under Linux has tended to mostly be done with NVidia, as Intel and AMD have had sub-standard GL drivers for a long time.

I suspect it's not really a case of the osgEarth team doing something wrong, but the Linux+Intel drivers adding a new constraint for keeping things working sweetly.

I'm starting to wonder if you aren't hitting the same issues that Apple OSG users have had with Apple's decision to only support OpenGL 3+ features on a graphics context without compatibility.  While it seems an uncontroversial decision at first glance, it's ended up being a real pain for OpenGL users needing to maintain long-lived applications that happen to rely upon both newer features when available and old fixed-function pipeline features.  The Apple approach is fine for clean-room applications written recently, such as new games, but crap for the many companies that develop software with a useful life of over a decade.

Robert.
 

Sandro Mani

Aug 21, 2017, 11:27:19 AM
to OpenSceneGraph Users

Hi Robert

On 21.08.2017 17:08, Robert Osfield wrote:
Hi Sandro,

On 21 August 2017 at 15:21, Sandro Mani <manis...@gmail.com> wrote:
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2)
OpenGL core profile version string: 4.5 (Core Profile) Mesa 17.2.0-rc4
OpenGL core profile shading language version string: 4.50

Is there a reason why the driver isn't just directly supporting the GL features that osgEarth is using?  Is the Intel/Mesa driver limiting features in some way?
Someone please correct me if I'm using the wrong terminology, but as far as I understand, Mesa (possibly restricted to the Intel Mesa drivers, but possibly also others such as ATI and nouveau) only exposes GL3.2+ functionality through the corresponding core profile. The compatibility profile is only implemented up to GL3.0 on my system:

    Vendor: Intel Open Source Technology Center (0x8086)
    Device: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2)  (0x191b)
    Version: 17.2.0
    Accelerated: yes
    Video memory: 3072MB
    Unified memory: yes
    Preferred profile: core (0x1)
    Max core profile version: 4.5
    Max compat profile version: 3.0
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 3.2




Have you tried on an NVIdia or AMD system?
No, but if I'm not mistaken they have a full(?) compatibility profile.


FYI, I'm using NVidia under Kubunty 16.04 as my main build system and routinely mix latest GL features with just creating a normal graphics context, 


Are the osgEarth team aware of the issues you've had?

I once asked before investigating [1] but no-one reacted, I suppose they are mostly using Windows. Now I have a better understanding of the underlying issues (and indeed of what was missing in OSG itself), but before reporting issues to the osgEarth team I'd like to understand what the correct approach should be (if indeed they are doing something wrong).

My guess is that they probably haven't used Linix+Intel with the drivers that you are using so haven't come across the issue.  Real-time graphics under Linux has tended to mostly done with NVidia as Intel and AMD have had sub-standard GL drivers for looooong time.

I suspect it's not really a case of the osgEarth team doing something wrong, but the Linux+Intel drivers adding a new constraint for keeping things working sweetly.
I don't think it's adding a constraint; rather, the Mesa developers focus first on adding support for the extensions required to expose the newest possible version in the core profile, and then work on the compatibility profile as time allows.


I'm starting to wonder if you aren't hitting the same issues that Apple OSG users have had with Apple's decision to only support OpenGL 3+ features on a graphics context without compatibility.  While it seems an uncontroversial decision at first glance, it's ended up being a real pain for OpenGL users needing to maintain long-lived applications that happen to rely upon both newer features when available and old fixed-function pipeline features.  The Apple approach is fine for clean-room applications written recently, such as new games, but crap for the many companies that develop software with a useful life of over a decade.
It is likely that the issues are of a similar nature, yes. But since, with the changes from the PR plus the proposed DisplaySettings change, I'm able to run osgEarth fine on this system, I'd say the situation is not that bad here; it's just a matter of clarifying how the profile version should be set.

Sandro

Robert Osfield

Aug 21, 2017, 11:43:41 AM
to OpenSceneGraph Users
Hi Sandro,

On 21 August 2017 at 16:27, Sandro Mani <manis...@gmail.com> wrote:
Someone please correct me if I'm using the wrong terminology, but as far as I understand, Mesa (possibly restricted to the Intel Mesa drivers, but possibly also others such as ATI and nouveau) only exposes GL3.2+ functionality through the corresponding core profile. The compatibility profile is only implemented up to GL3.0 on my system:

    Vendor: Intel Open Source Technology Center (0x8086)
    Device: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2)  (0x191b)
    Version: 17.2.0
    Accelerated: yes
    Video memory: 3072MB
    Unified memory: yes
    Preferred profile: core (0x1)
    Max core profile version: 4.5
    Max compat profile version: 3.0
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 3.2

What GL version is osgEarth now depending upon?
 
Have you tried on an NVIdia or AMD system?
No, but if I'm not mistaken they have a full(?) compatibility profile.

NVidia has been pretty reliable at maintaining compatibility; I can't comment on AMD as I haven't used AMD for a number of years.

I don't have much time for chasing up all the various permutations of hardware, drivers and OSes personally, but given the breadth of the OSG community we can generally cover most things, even if it isn't right away.  When someone starts using a less common hardware/OS combination then we have to rely upon them to give us feedback - exactly as you are doing right now :-)
 

FYI, I'm using NVidia under Kubunty 16.04 as my main build system and routinely mix latest GL features with just creating a normal graphics context, 


Are the osgEarth team aware of the issues you've had?

I once asked before investigating [1] but no-one reacted, I suppose they are mostly using Windows. Now I have a better understanding of the underlying issues (and indeed of what was missing in OSG itself), but before reporting issues to the osgEarth team I'd like to understand what the correct approach should be (if indeed they are doing something wrong).

My guess is that they probably haven't used Linux+Intel with the drivers that you are using, so haven't come across the issue.  Real-time graphics under Linux has tended to mostly be done with NVidia, as Intel and AMD have had sub-standard GL drivers for a long time.

I suspect it's not really a case of the osgEarth team doing something wrong, but the Linux+Intel drivers adding a new constraint for keeping things working sweetly.
I don't think it's adding a constraint; rather, the Mesa developers focus first on adding support for the extensions required to expose the newest possible version in the core profile, and then work on the compatibility profile as time allows.

From an end user's perspective, lack of a compatibility profile is a constraint; if it weren't, you wouldn't have had any problems.  From what you describe it may well be a temporary constraint.

 

I'm starting to wonder if you aren't hitting the same issues that Apple OSG users have had with Apple's decision to only support OpenGL 3+ features on a graphics context without compatibility.  While it seems an uncontroversial decision at first glance, it's ended up being a real pain for OpenGL users needing to maintain long-lived applications that happen to rely upon both newer features when available and old fixed-function pipeline features.  The Apple approach is fine for clean-room applications written recently, such as new games, but crap for the many companies that develop software with a useful life of over a decade.
It is likely that the issues are of a similar nature, yes. But since, with the changes from the PR plus the proposed DisplaySettings change, I'm able to run osgEarth fine on this system, I'd say the situation is not that bad here; it's just a matter of clarifying how the profile version should be set.

Are you building the OSG with defaults?  Or are you building the OSG for just GL3?

What viewer are you using with osgEarth?  osgEarth itself is typically used as a NodeKit, with the end user application creating its own viewer and, with it, the graphics contexts.  This means that even if osgEarth's own example programs changed the way they create graphics contexts, you'd still end up with issues.

Have you tried running osgviewer with the osgEarth files?  osgviewer by default uses the setUpViewAcrossAllScreens() configuration, which will pass DisplaySettings::instance() when creating the graphics context.  osgviewer also has the command line parameters (run osgviewer --help):

  --gl-flags <mask> Set the hint of which GL flags to use when creating
                    graphics contexts.
  --gl-profile-mask <mask>
                    Set the hint of which GL context profile mask to use when
                    creating graphics contexts.
  --gl-version <major.minor>
                    Set the hint of which GL version to use when creating
                    graphics contexts.

You can also set the env vars (osgviewer --help-env):

  OSG_GL_CONTEXT_FLAGS <uint>                                                  
                    Set the hint for the GL context flags to use when creating
                    contexts.
  OSG_GL_CONTEXT_PROFILE_MASK <uint>                                           
                    Set the hint for the GL context profile mask to use when
                    creating contexts.
  OSG_GL_CONTEXT_VERSION <major.minor>                                         
                    Set the hint for the GL version to create contexts for.

Could you test these to see if they work with OSG master, without any of your extra local modifications?  If they do work, could you post the settings you used?

Feedback on this might help guide what the osgEarth example applications might be able to do better.

Robert.

Sandro Mani

Aug 21, 2017, 12:22:04 PM
to OpenSceneGraph Users

Hi Robert


On 21.08.2017 17:43, Robert Osfield wrote:
What GL version is osgEarth now depending upon?
A quick grep through the codebase shows shaders depending on GLSL up to 430.


From an end users perspectives lack of compatibility profile is a constraint, if it weren't you wouldn't have had any problems.  From what you describe it may well be a temporary constraint.
I can't speak for the Mesa developers, though searching a bit one can read various statements here and there, including "A long time ago, a decision was made for Mesa not to increase the advertised compatibility profile version." (https://bugs.freedesktop.org/show_bug.cgi?id=96449#c5), so it does not sound too temporary.


Are you building the OSG with defaults?  Or are you building the OSG for just GL3?
%cmake -DBUILD_OSG_EXAMPLES=ON -DBUILD_DOCUMENTATION=ON -DOSG_GL1_AVAILABLE=ON -DOSG_GL2_AVAILABLE=ON -DOPENGL_PROFILE=GLCORE -DOPENGL_HEADER1="#include <GL/gl.h>"


What viewer are using with osgEarth?  osgEarth itself is typically used as NodeKit with end user application creating their own viewer and with it graphics contexts.  This means even if osgEarth's own example programs changed the way they create graphics context you'd end up with issues.
Running with osgearth_viewer, I get two calls to osgViewer::GraphicsWindowX11::init. The first one is where, I assume, it tries to determine the capabilities of the hardware/driver in use, and this constructs the Traits without a DisplaySettings instance:

#0  osgViewer::GraphicsWindowX11::init (this=this@entry=0x1004ad330)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/src/osgViewer/GraphicsWindowX11.cpp:915
#1  0x00007f3e23ae4b50 in osgViewer::GraphicsWindowX11::GraphicsWindowX11 (traits=0x1003760d0, this=<optimized out>)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/include/osgViewer/api/X11/GraphicsWindowX11:56
#2  X11WindowingSystemInterface::createGraphicsContext (this=<optimized out>, traits=0x1003760d0)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/src/osgViewer/GraphicsWindowX11.cpp:2269
#3  0x00007f3e241e81b7 in osg::GraphicsContext::createGraphicsContext (traits=traits@entry=0x1003760d0)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/src/osg/GraphicsContext.cpp:128
#4  0x00007f3e2350c256 in MyGraphicsContext::MyGraphicsContext (this=<synthetic pointer>)
    at /usr/src/debug/osgearth-2.9-0.2.git43d4ba5.fc27.x86_64/src/osgEarth/Capabilities.cpp:76
#5  osgEarth::Capabilities::Capabilities (this=0x100390f30)
    at /usr/src/debug/osgearth-2.9-0.2.git43d4ba5.fc27.x86_64/src/osgEarth/Capabilities.cpp:162
#6  0x00007f3e236897a5 in osgEarth::Registry::initCapabilities (this=0x1002a7af0)
    at /usr/src/debug/osgearth-2.9-0.2.git43d4ba5.fc27.x86_64/src/osgEarth/Registry.cpp:484
#7  0x00007f3e23689829 in osgEarth::Registry::getCapabilities (this=0x1002a7af0)
    at /usr/src/debug/osgearth-2.9-0.2.git43d4ba5.fc27.x86_64/src/osgEarth/Registry.cpp:468
#8  0x00007f3dfa209486 in osgEarth::Registry::capabilities ()
    at /usr/src/debug/osgearth-2.9-0.2.git43d4ba5.fc27.x86_64/src/osgEarth/Registry:149
#9  osgEarth::Drivers::MPTerrainEngine::MPTerrainEngineNode::MPTerrainEngineNode (this=0x100a1ac60)
    at /usr/src/debug/osgearth-2.9-0.2.git43d4ba5.fc27.x86_64/src/osgEarthDrivers/engine_mp/MPTerrainEngineNode.cpp:202
[...]

Then, a second time here:

#0  osgViewer::GraphicsWindowX11::init (this=this@entry=0x100676840)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/src/osgViewer/GraphicsWindowX11.cpp:915
#1  0x00007f3e23ae4b50 in osgViewer::GraphicsWindowX11::GraphicsWindowX11 (traits=0x10052c100, this=<optimized out>)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/include/osgViewer/api/X11/GraphicsWindowX11:56
#2  X11WindowingSystemInterface::createGraphicsContext (this=<optimized out>, traits=0x10052c100)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/src/osgViewer/GraphicsWindowX11.cpp:2269
#3  0x00007f3e241e81b7 in osg::GraphicsContext::createGraphicsContext (traits=traits@entry=0x10052c100)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/src/osg/GraphicsContext.cpp:128
#4  0x00007f3e23a7ab62 in osgViewer::SingleWindow::configure (this=0x100a2a370, view=...)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/src/osgViewer/config/SingleWindow.cpp:72
#5  0x00007f3e23a7a80a in osgViewer::SingleScreen::configure (this=<optimized out>, view=...)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/src/osgViewer/config/SingleScreen.cpp:29
#6  0x00007f3e23a75207 in osgViewer::AcrossAllScreens::configure (this=<optimized out>, view=...)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/src/osgViewer/config/AcrossAllScreens.cpp:48
#7  0x00007f3e23ab79c2 in osgViewer::View::apply (this=0x7fffffffdb00, config=0x1002a3770)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/src/osgViewer/View.cpp:456
#8  0x00007f3e23ab7b00 in osgViewer::View::setUpViewAcrossAllScreens (this=<optimized out>)
    at /usr/src/debug/OpenSceneGraph-3.5.7-0.2.gitbfbaeca.fc27.x86_64/src/osgViewer/View.cpp:463
[...]

using AcrossAllScreens as you anticipated, and hence the default DisplaySettings. Since by default a "1.0" context is created, the result is:

FRAGMENT glCompileShader "main(fragment)" FAILED
FRAGMENT Shader "main(fragment)" infolog:
0:1(10): error: GLSL 3.30 is not supported. Supported versions are: 1.10, 1.20, 1.30, 1.00 ES, 3.00 ES, 3.10 ES, and 3.20 ES

etc.

If I add


osg::DisplaySettings::instance()->setGLContextVersion( "4.0" );
osg::DisplaySettings::instance()->setGLContextProfileMask( 0x1 );

at the beginning of a tweaked osgearth_viewer.cpp@main, things work (though osgEarth::Capabilities::Capabilities may have incorrectly determined some things, since a "1.0" context was created there). The same effect can indeed be achieved by using the environment variables you mention.


If I use osgearth_qt_simple, the GL context created by Qt is used, which I can set up to use the desired version and profile mask, at which point things will also work with that viewer.

Best
Sandro

Robert Osfield

Aug 21, 2017, 12:34:21 PM
to OpenSceneGraph Users
Hi Sandro,

On 21 August 2017 at 17:21, Sandro Mani <manis...@gmail.com> wrote:
Are you building the OSG with defaults?  Or are you building the OSG for just GL3?
%cmake -DBUILD_OSG_EXAMPLES=ON -DBUILD_DOCUMENTATION=ON -DOSG_GL1_AVAILABLE=ON -DOSG_GL2_AVAILABLE=ON -DOPENGL_PROFILE=GLCORE -DOPENGL_HEADER1="#include <GL/gl.h>"

That's a bit of an ugly mix.  Things should work with just:

cmake .
make -j 4   # 4 is number of cores available

When you use OPENGL_PROFILE it'll set the OSG_*_AVAILABLE options for you, but in general build with defaults unless you specifically need a particular feature set.
Try using the OSG_GL_CONTEXT_VERSION and OSG_GL_CONTEXT_PROFILE_MASK env vars instead of the above DisplaySettings::instance() code, i.e.

export OSG_GL_CONTEXT_VERSION=4.0
export OSG_GL_CONTEXT_PROFILE_MASK=1
 
This will still require the merged changes on OSG master, but as long as setUpViewAcrossAllScreens() is used, DisplaySettings::instance() should automatically initialize with the correct values.

Robert.

Sandro Mani

Aug 21, 2017, 12:41:15 PM
to OpenSceneGraph Users



On 21.08.2017 18:34, Robert Osfield wrote:
Hi Sandro,

On 21 August 2017 at 17:21, Sandro Mani <manis...@gmail.com> wrote:
Are you building the OSG with defaults?  Or are you building the OSG for just GL3?
%cmake -DBUILD_OSG_EXAMPLES=ON -DBUILD_DOCUMENTATION=ON -DOSG_GL1_AVAILABLE=ON -DOSG_GL2_AVAILABLE=ON -DOPENGL_PROFILE=GLCORE -DOPENGL_HEADER1="#include <GL/gl.h>"

That's a bit of an ugly mix.
Agreed, I didn't put much thought into it; I just checked that I hit OSG_GL3_FEATURES when doing the work on the context.

Things should work with just:

cmake .
make -j 4   # 4 is number of cores available

When you using the OPENGL_PROFILE it'll set the OSG_*_AVAILABLE options for you, but in general build with defaults unless you specifically need a particular feature set. 
Well, in this case I needed it though, right? I mean, OPENGL_PROFILE defaults to "GL2".
Yes, as mentioned, this also works. But the open issue for me still remains the first context created by osgEarth::Capabilities::Capabilities (see the first of the stack traces I posted above). In my view, either this is a bug in osgEarth, in that it creates the Traits without honouring the desired GL version, or OSG should otherwise ensure the Traits contain the overridden GL version.

Sandro

Robert Osfield

Aug 21, 2017, 1:06:16 PM
to OpenSceneGraph Users
HI Sandro,

On 21 August 2017 at 17:41, Sandro Mani <manis...@gmail.com> wrote:
Agreed, I didn't put much thought into it; I just checked that I hit OSG_GL3_FEATURES when doing the work on the context.

The OSG is able to detect and use features at runtime; you don't actually need to explicitly set OSG_GL3_FEATURES.  It's only when you start using a core profile that you need to be selective about the OSG_GL3_FEATURES, but these settings will be set appropriately if you choose OPENGL_PROFILE to be GL3 or GLCORE.
 
Things should work with just:

cmake .
make -j 4   # 4 is number of cores available

When you using the OPENGL_PROFILE it'll set the OSG_*_AVAILABLE options for you, but in general build with defaults unless you specifically need a particular feature set. 
Well, in this case I needed it though, right? I mean, OPENGL_PROFILE defaults to "GL2".

You should only need to set OPENGL_PROFILE to something other than the default if you specifically want a non-compatibility graphics context.  It sounds like the Intel driver might require OPENGL_PROFILE if you want to enable the latest GL features, something you won't need under NVidia and possibly AMD. Try using the OSG_GL_CONTEXT_VERSION and OSG_GL_CONTEXT_PROFILE_MASK env vars instead of the above DisplaySettings::instance() code, i.e.

export OSG_GL_CONTEXT_VERSION=4.0
export OSG_GL_CONTEXT_PROFILE_MASK=1
Yes, as mentioned, this also works. But the open issue for me still remains the first context created by osgEarth::Capabilities::Capabilities (see the first of the stack traces I posted above). In my view, either this is a bug in osgEarth, in that it creates the Traits without honouring the desired GL version, or OSG should otherwise ensure the Traits contain the overridden GL version.

I don't think it's appropriate to classify as a bug something that is retrospectively using the software with drivers that add their own constraints.  The OSG hasn't ever worked the way you want it to work - I've only just integrated the changes to the OSG to support passing the GL versions for GLX, so only now can we start talking about passing on suggestions of how to utilize it to the osgEarth team.

I see it as a constraint on osgEarth working with Intel drivers and recent GL versions and associated drivers.  What we are talking about is working around these constraints in the driver to provide full osgEarth functionality across a wider range of hardware.  The first step has been to add the GL version functionality to OSG's GLX support.

Whether we need to push any changes to osgEarth is something I'm not clear on; if the above exports now work with OSG master and osgEarth then I think we are most of the way to getting what it is reasonable to expect.

Robert.


Sandro Mani

unread,
Aug 21, 2017, 1:16:20 PM8/21/17
to OpenSceneGraph Users

Hi Robert


On 21.08.2017 19:06, Robert Osfield wrote:
You should only need to set OPENGL_PROFILE to something other than the default if you specifically want a non-compatibility graphics context.  It sounds like the Intel driver might require OPENGL_PROFILE if you want to enable the latest GL features, something you won't need under NVIDIA and possibly AMD. Try using the OSG_GL_CONTEXT_VERSION and OSG_GL_CONTEXT_PROFILE_MASK env vars instead of the above DisplaySettings::instance() code, i.e.

export OSG_GL_CONTEXT_VERSION=4.0
export OSG_GL_CONTEXT_PROFILE_MASK=1
Ok I'll fire off a build without passing any cmake options and check what happens when I just set the environment variables.

Yes, as mentioned this also works. But the open issue for me still remains the first context created by osgEarth::Capabilities::Capabilities (see the first of the stack traces I posted above). In my view either this is a bug in osgEarth, in that it creates the traits without honouring the desired GL version, or OSG should otherwise ensure the traits contain the overridden GL version.

I don't think it's appropriate to classify as a bug something that is retrospectively using the software with drivers that add their own constraints.  The OSG hasn't ever worked the way you want it to work - I've only just integrated the changes to the OSG to support passing the GL versions for GLX, so only now can we start talking about passing on suggestions of how to utilize it to the osgEarth team.
Sure. However I'm curious: how is the logic supposed to work under Windows, where support for specifying the GL version was already implemented? Wouldn't you hit the same issues on that platform too?


I see it as a constraint on osgEarth working with Intel drivers and recent GL versions and associated drivers.  What we are talking about is working around these constraints in the driver to provide full osgEarth functionality across a wider range of hardware.  The first step has been to add the GL version functionality to OSG's GLX support.

Whether we need to push any changes to osgEarth is something I'm not clear on; if the above exports now work with OSG master and osgEarth then I think we are most of the way to getting what it is reasonable to expect.
Not that I want to be annoying or repetitive, but surely it's not ideal that osgEarth::Capabilities::Capabilities performs its checks with a 1.0 context, even though you specify OSG_GL_CONTEXT_VERSION=4.0? Or am I missing something here?

Best
Sandro

Robert Osfield

unread,
Aug 21, 2017, 1:46:40 PM8/21/17
to OpenSceneGraph Users
Hi Sandro,

On 21 August 2017 at 18:16, Sandro Mani <manis...@gmail.com> wrote:
Sure. However I'm curious, how is the logic supposed to work under Windows, where support for specifying the GL version was already implemented? Wouldn't you hit the same issues on that platform also?

So far it's been a niche feature used by a small number of developers on their specific application/platform combinations.  Functionality in the OSG community gets developed on a needs basis, so if the functionality isn't there yet then it's because few people have required it.  There are so many things one can do in real-time graphics that one has to spend one's time on the most important issues one hits in one's own applications.

I see it as a constraint on osgEarth working with Intel drivers and recent GL versions and associated drivers.  What we are talking about is working around these constraints in the driver to provide full osgEarth functionality across a wider range of hardware.  The first step has been to add the GL version functionality to OSG's GLX support.

Whether we need to push any changes to osgEarth is something I'm not clear on; if the above exports now work with OSG master and osgEarth then I think we are most of the way to getting what it is reasonable to expect.
Not that I want to be annoying or repetitive, but surely it's not ideal that osgEarth::Capabilities::Capabilities performs its checks with a 1.0 context, even though you specify OSG_GL_CONTEXT_VERSION=4.0? Or am I missing something here?

Having the OSG to worry about is enough for my little brain; I can't comment on the specifics of 3rd party software that builds upon it.  osgEarth has evolved a lot over the years, and the Capabilities functionality is something I'm not familiar with at all.  All I can say in general is that one would typically check for functionality supported by the driver once a graphics context has been created; if one did create a dummy context just for testing functionality then you'd want to make sure it's created with the same settings as the final viewer windows.
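[Editor's note: a minimal sketch of what such a capability-probing context might look like, assuming OSG master with the merged GLX changes and a driver that supports pbuffers; the helper name is hypothetical.]

```cpp
#include <osg/DisplaySettings>
#include <osg/GraphicsContext>

// Hypothetical helper: create a small off-screen context for capability
// probing that honours the same GL version/profile settings as the final
// viewer windows will use.
osg::ref_ptr<osg::GraphicsContext> createProbeContext()
{
    osg::DisplaySettings* ds = osg::DisplaySettings::instance().get();

    // Passing the DisplaySettings explicitly makes the Traits constructor
    // pick up glContextVersion/glContextFlags/glContextProfileMask.
    osg::ref_ptr<osg::GraphicsContext::Traits> traits =
        new osg::GraphicsContext::Traits(ds);

    traits->x = 0;
    traits->y = 0;
    traits->width = 1;
    traits->height = 1;
    traits->pbuffer = true;  // off-screen, just for querying capabilities

    return osg::GraphicsContext::createGraphicsContext(traits.get());
}
```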

If osgEarth doesn't do this then it's something to take up with the osgEarth team.

Robert.



Sandro Mani

unread,
Aug 21, 2017, 1:50:46 PM8/21/17
to OpenSceneGraph Users

Hi Robert


Having the OSG to worry about is enough for my little brain; I can't comment on the specifics of 3rd party software that builds upon it.  osgEarth has evolved a lot over the years, and the Capabilities functionality is something I'm not familiar with at all.  All I can say in general is that one would typically check for functionality supported by the driver once a graphics context has been created; if one did create a dummy context just for testing functionality then you'd want to make sure it's created with the same settings as the final viewer windows.

If osgEarth doesn't do this then it's something to take up with the osgEarth team.
That's a good enough answer for me; ultimately I just want to understand what needs to be fixed where. From your answer I gather that OSG should be okay now, and I'll take up the rest with osgEarth upstream.

Best,
Sandro

Robert Osfield

unread,
Aug 22, 2017, 3:17:58 AM8/22/17
to OpenSceneGraph Users
Hi Sandro,

On 21 August 2017 at 18:50, Sandro Mani <manis...@gmail.com> wrote:
That's a good enough answer for me; ultimately I just want to understand what needs to be fixed where. From your answer I gather that OSG should be okay now, and I'll take up the rest with osgEarth upstream.

The change to use DisplaySettings::instance() by default in Traits is something I'm not comfortable with, as it changes the behaviour, which may catch some OSG users out, and if they did find problems they would be hard to diagnose.
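[Editor's note: under the current behaviour described here, applications that want the global DisplaySettings honoured can opt in explicitly when constructing their Traits; a sketch, assuming the setters from the thread's opening message:]

```cpp
#include <osg/DisplaySettings>
#include <osg/GraphicsContext>

// Request a 4.0 core profile context globally...
osg::DisplaySettings::instance()->setGLContextVersion("4.0");
osg::DisplaySettings::instance()->setGLContextProfileMask(0x1);

// ...and pass the DisplaySettings instance explicitly, so the Traits
// constructor copies glContextVersion/glContextProfileMask instead of
// falling back to the defaults.
osg::ref_ptr<osg::GraphicsContext::Traits> traits =
    new osg::GraphicsContext::Traits(osg::DisplaySettings::instance().get());
```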

For osgEarth it does sound like the Capabilities mechanism needs tightening up, to make sure that the same context, or at least the same type of context, as is used for the viewer is also used when checking the driver capabilities.

From your description it looks like the Capabilities mechanism would work fine on drivers that don't have multiple behaviours - link to a standard GL context and then you can query all the features/extensions just fine (this is how GL has traditionally worked over the years) - but fails to handle newer GL drivers that allow specific generations of functionality to be split, i.e. you get a GL3.x context with no backwards compatibility or a GL1.x context with no forwards compatibility.
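[Editor's note: for readers unfamiliar with the split described here, this is roughly what a versioned context request looks like at the GLX level via the GLX_ARB_create_context extension; `dpy` and `fbconfig` are assumed to have been obtained earlier via XOpenDisplay/glXChooseFBConfig, and the function pointer must be fetched with glXGetProcAddress.]

```cpp
// Fragment, not a complete program: request a 4.0 core profile context.
// A profile mask of GLX_CONTEXT_CORE_PROFILE_BIT_ARB (0x1) corresponds to
// OSG_GL_CONTEXT_PROFILE_MASK=1 above; without these attributes the driver
// may hand back a legacy compatibility (or even 1.x) context.
int attribs[] = {
    GLX_CONTEXT_MAJOR_VERSION_ARB, 4,
    GLX_CONTEXT_MINOR_VERSION_ARB, 0,
    GLX_CONTEXT_PROFILE_MASK_ARB,  GLX_CONTEXT_CORE_PROFILE_BIT_ARB,
    None
};
GLXContext ctx =
    glXCreateContextAttribsARB(dpy, fbconfig, /*share*/ 0,
                               /*direct*/ True, attribs);
```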

When expressed this way, it shows that the GL core profile approach that the ARB took really fundamentally changes OpenGL, breaking one of its greatest assets for a programmer developing long-lived software.  NVidia for sure have maintained the old way and remain easy to manage, but Apple and Intel/MESA have taken the opportunity to use non-compatibility to break the old way of doing things, breaking approaches like osgEarth::Capabilities.

Robert.

