Hi Sandro,
On 21 August 2017 at 13:21, Sandro Mani <manis...@gmail.com> wrote:
A less invasive alternative could be to introduce a new singleton called OpenGLSettings or similar which contains just the three settings glContextVersion, glContextFlags, glContextProfileMask.
I don't see how this is different: if you are setting a default that users might not realize is now being set, it's just the same issue expressed in a slightly different way.
Currently, when the osgViewer::Viewer is constructed and run() is called without the viewer explicitly setting up the graphics context, it falls back to using setUpViewAcrossAllScreens(), and within this implementation, when it sets up the Traits, it passes the DisplaySettings assigned to the Viewer, or DisplaySettings::instance() is used if none is assigned. This fallback is functionally the same as what you are after, and this already works.
For places where the windows are being explicitly set up then they will be creating a Traits themselves and at this point I'm happy for them to need to explicitly specify the settings or pass in the DisplaySettings they want to use as defaults.
Hi Robert
Hi Sandro,
What hardware and drivers are you using?
Are the osgEarth team aware of the issues you've had?
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2)
OpenGL core profile version string: 4.5 (Core Profile) Mesa 17.2.0-rc4
OpenGL core profile shading language version string: 4.50
I once asked before investigating [1] but no-one reacted, I suppose they are mostly using Windows. Now I have a better understanding of the underlying issues (and indeed of what was missing in OSG itself), but before reporting issues to the osgEarth team I'd like to understand what the correct approach should be (if indeed they are doing something wrong).
Hi Robert
Hi Sandro,
On 21 August 2017 at 15:21, Sandro Mani <manis...@gmail.com> wrote:
Is there a reason why the driver isn't just directly supporting the GL features that osgEarth is using? Is the Intel/Mesa driver limiting features in some way?
Have you tried on an NVidia or AMD system?
FYI, I'm using NVidia under Kubuntu 16.04 as my main build system and routinely mix the latest GL features with just creating a normal graphics context.
My guess is that they probably haven't used Linux+Intel with the drivers that you are using, so haven't come across the issue. Real-time graphics under Linux has tended to mostly be done with NVidia, as Intel and AMD have had sub-standard GL drivers for a long time.
I suspect it's not really a case of the osgEarth team doing something wrong, but the Linux+Intel drivers adding a new constraint for keeping things working sweetly.
I'm starting to wonder if you aren't hitting upon the same issues that Apple OSG users have had with Apple's decision to only support OpenGL 3+ features on a graphics context without compatibility. While it seems an uncontroversial decision at first glance, it's ended up being a real pain for OpenGL users needing to maintain long-lived applications that happen to rely upon both newer features when available as well as old fixed-function pipeline features. The Apple approach is fine for clean-room applications written recently, such as new games, but crap for the many companies that develop software that has a useful life of over a decade.
Someone please correct me if I'm using the wrong terminology, but as far as I understand, Mesa (possibly restricted to the Intel Mesa drivers, but possibly also others such as ATI and nouveau) only exposes GL 3.2+ functionality through the corresponding core profile. The compatibility profile is only implemented up to GL 3.0 on my system:
Vendor: Intel Open Source Technology Center (0x8086)
Device: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2) (0x191b)
Version: 17.2.0
Accelerated: yes
Video memory: 3072MB
Unified memory: yes
Preferred profile: core (0x1)
Max core profile version: 4.5
Max compat profile version: 3.0
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.2
Have you tried on an NVidia or AMD system?
No, but if I'm not mistaken they have a full(?) compatibility profile.
I don't think it's adding a constraint; rather, the Mesa developers focus first on adding support for the extensions required to expose the newest possible version in the core profile, and then work on the compatibility profile as time allows.
It is likely that the issues are of a similar nature, yes. But since, with the changes from the PR plus the proposed DisplaySettings change, I'm able to run osgEarth fine on this system, I'd say the situation is not that bad here; it's just a matter of clarifying how the profile version should be set.
Hi Robert
What GL version is osgEarth now depending upon?
From an end user's perspective, lack of a compatibility profile is a constraint; if it weren't, you wouldn't have had any problems. From what you describe it may well be a temporary constraint.
Are you building the OSG with defaults? Or are you building the OSG for just GL3?
What viewer are you using with osgEarth? osgEarth itself is typically used as a NodeKit, with the end-user application creating its own viewer and, with it, the graphics contexts. This means that even if osgEarth's own example programs changed the way they create graphics contexts, you'd still end up with issues.
Are you building the OSG with defaults? Or are you building the OSG for just GL3?
%cmake -DBUILD_OSG_EXAMPLES=ON -DBUILD_DOCUMENTATION=ON -DOSG_GL1_AVAILABLE=ON -DOSG_GL2_AVAILABLE=ON -DOPENGL_PROFILE=GLCORE -DOPENGL_HEADER1="#include <GL/gl.h>"
Hi Sandro,
On 21 August 2017 at 17:21, Sandro Mani <manis...@gmail.com> wrote:
That's a bit of an ugly mix.
Things should work with just:
cmake .
make -j 4 # 4 is number of cores available
When you use OPENGL_PROFILE it'll set the OSG_*_AVAILABLE options for you, but in general build with defaults unless you specifically need a particular feature set.
Agreed, I didn't put much thought into it, I just checked that I hit OSG_GL3_FEATURES when doing the work on the context.
Well, in this case I needed it though, right? I mean, OPENGL_PROFILE defaults to "GL2".
Hi Robert
You should only need to set OPENGL_PROFILE to something other than the default if you specifically want just a non-compatible graphics context. It sounds like the Intel driver might require OPENGL_PROFILE if you want to enable the latest GL features, something you won't need under NVidia and possibly AMD. Try using the OSG_GL_CONTEXT_VERSION and OSG_GL_CONTEXT_PROFILE_MASK env vars instead of the above DisplaySettings::instance() code, i.e.
export OSG_GL_CONTEXT_VERSION=4.0
export OSG_GL_CONTEXT_PROFILE_MASK=1
Yes, as mentioned, this also works. But the open issue for me still remains the first context created by osgEarth::Capabilities::Capabilities (see the first of the stack traces I posted above). In my view, either this is a bug in osgEarth, in that it creates the Traits without honouring the desired GL version, or OSG should otherwise ensure the Traits contain the overridden GL version.
I don't think it's appropriate to classify as a bug something that is retrospectively using the software with drivers that add their own constraints. The OSG hasn't ever worked the way you want it to work - I've only just integrated the changes to the OSG to support passing the GL versions for GLX, so only now can we start talking about passing on suggestions of how to utilize it to the osgEarth team.
I see it as a constraint on osgEarth working with Intel drivers and recent GL versions and associated drivers. What we are talking about is working around these constraints in the driver to provide the full osgEarth functionality across a wider range of hardware. The first step has been to add the GL version functionality to OSG's GLX support.
Whether we need to push any changes to osgEarth is something I'm not clear on; if the above exports now work with OSG master and osgEarth, then I think we are most of the way to getting what is reasonable to expect.
Sure. However I'm curious, how is the logic supposed to work under Windows, where support for specifying the GL version was already implemented? Wouldn't you hit the same issues on that platform also?
Not that I want to be annoying or repetitive, but surely it's not ideal that osgEarth::Capabilities::Capabilities performs its checks with a 1.0 context, even though you specify OSG_GL_CONTEXT_VERSION=4.0? Or am I missing something here?
Hi Robert
Having the OSG to worry about is enough for my little brain; I can't comment on the specifics of 3rd-party software that builds upon it. osgEarth has evolved a lot over the years, and the Capabilities functionality is something I'm not familiar with at all. All I can say in general is that one would typically check for functionality supported by the driver once a graphics context has been created; if one did create a dummy context just for testing functionality, then you'd want to make sure it's created with the same settings as the final viewer windows.
If osgEarth doesn't do this then it's something to take up with the osgEarth team.
That's a good enough answer for me, ultimately I just want to understand what needs to be fixed where. From your answer I gather that OSG should be okay now and that I'll take up the rest with osgEarth upstream.