
GLSL 3.30 is not supported (Ubuntu 12.04, Intel HD Graphics 3000 and NVIDIA Graphics with Optimus)


codingpo...@gmail.com
Jan 23, 2013, 9:05:05 PM
The system:
Dell Latitude E6520

Video Card
Intel® HD Graphics 3000
NVIDIA® NVS™ 4200M (DDR3 512MB) Discrete Graphics with Optimus

Ubuntu 12.04

I installed Bumblebee.

I installed PyOpenGL and am following the tutorial (http://pyopengl.sourceforge.net/context/tutorials/shader_1.xhtml)

Result on Python says:
- - - - - - - - - - -
RuntimeError: ('Shader compile failure (0): 0:1(10): error: GLSL 3.30 is not supported. Supported versions are: 1.00 ES, 1.10, 1.20, and 1.30\n\n', ['#version 330\n void main() {\n gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n }'], GL_VERTEX_SHADER)
- - - - - - - - - - -

I know NVIDIA graphics with Optimus is not supported on Ubuntu.
But I think Intel Graphics should support the latest version of OpenGL.

(1) What should I do? Can I update something (drivers, for example) to make the Intel graphics support GLSL 3.30?

(2) If I cannot, how can I use a lower version of OpenGL in PyOpenGL?
http://pyopengl.sourceforge.net/ says:
PyOpenGL 3.0.2 includes support for:
OpenGL v1.1 through 4.3

So there should be an option to target a lower OpenGL version, but I failed to find a way to do it.

Help!! Thanks in advance!

Linda Li
Jan 23, 2013, 10:43:30 PM

(1) >> So there should be an option to target a lower OpenGL version, but I failed to find a way to do it.

I found the option in the code: you specify the version in the shader source itself.
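
For example, in the tutorial code the version is just the first line of the shader string, so targeting GLSL 1.30 should look like this (my sketch from reading the tutorial; not fully tested yet):
- - - - - - - - - - -
from OpenGL.GL import GL_VERTEX_SHADER
from OpenGL.GL import shaders

# Same vertex shader as the tutorial, but declared as GLSL 1.30.
# gl_Vertex and gl_ModelViewProjectionMatrix still exist in 1.30.
# compileShader needs a current GL context; the tutorial's
# OpenGLContext setup provides one before OnInit runs.
VERTEX_SHADER = shaders.compileShader("""#version 130
void main() {
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}""", GL_VERTEX_SHADER)
- - - - - - - - - - -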

(2) From http://en.wikipedia.org/wiki/GLSL#Versions, the corresponding GLSL versions are:

GLSL version    OpenGL version
1.30.10         3.0
1.40.08         3.1
1.50.11         3.2
3.30.6          3.3

So it seems the Intel HD Graphics 3000 supports OpenGL 3.0.

I went to the Intel official website and the Linux driver pages, but failed to find the answer.
(Drivers for Linux*
http://www.intel.com/support/graphics/sb/CS-010512.htm

Linux Graphics
https://01.org/linuxgraphics/search/node/HD%20Graphics%203000)

But now I am fairly sure I will use OpenGL 3.0 and follow only tutorials for OpenGL 3.0. Can anybody recommend good ones?



Lars Pensjö
Jan 24, 2013, 9:15:20 AM
I am using Bumblebee with Ubuntu on a Dell with NVIDIA, and it works.

I don't remember the details now, but you have to start the application with "optirun". Or, you simply do "optirun bash". That will make your application use the NVIDIA card instead of HD3000.

Regarding the HD3000: it doesn't support OpenGL 3.3, as you say. But you can use most functionality from OpenGL 3.3 anyway. I don't know how Python GL works; you may need to check the extensions. I got most of it working, except uniform buffers.
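
If you want to see what a context actually gives you, something like this ought to work from Python (just a sketch; I don't use PyOpenGL myself, so treat the details as my assumptions):
- - - - - - - - - - -
from OpenGL.GL import glGetString, GL_RENDERER, GL_VERSION, \
    GL_SHADING_LANGUAGE_VERSION, GL_EXTENSIONS
from OpenGL.GLUT import glutInit, glutCreateWindow

# glGetString returns nothing until a context exists, so create a
# throwaway GLUT window first.
glutInit()
glutCreateWindow("version check")

print glGetString(GL_RENDERER)
print glGetString(GL_VERSION)
print glGetString(GL_SHADING_LANGUAGE_VERSION)
print glGetString(GL_EXTENSIONS)
- - - - - - - - - - -
Run it once plainly and once under "optirun" and compare what each card reports.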

Linda Li
Jan 24, 2013, 4:36:45 PM
On Thursday, January 24, 2013 8:15:20 AM UTC-6, Lars Pensjö wrote:
> I am using Bumblebee with Ubuntu on a Dell with NVIDIA, and it works.

What language do you use?
I am trying it in Python, using PyOpenGL. But the tutorial is not that friendly to newbies. I am thinking of switching to another one.


>
> I don't remember the details now, but you have to start the application with "optirun". Or, you simply do "optirun bash". That will make your application use the NVIDIA card instead of HD3000.
>

I tried the following commands:
optirun bash
python
execfile("Tutoiral1_OpenGLContextPython.py")

# the code is from http://pyopengl.sourceforge.net/context/tutorials/shader_1.xhtml

RuntimeError: ('Shader compile failure (0): 0(3) : error C7533: global variable gl_ModelViewProjectionMatrix is deprecated after version 120\n0(3) : error C7533: global variable gl_Vertex is deprecated after version 120\n', ['#version 330\n void main() {\n gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n }'], GL_VERTEX_SHADER)

If I change 330 to 130, it works.
But whenever I close the OpenGL window, it prints:
[VGL] ERROR: in getglxdrawable--
[VGL] 177: Window has been deleted by window manager


>
> Regarding the HD3000: it doesn't support OpenGL 3.3, as you say. But you can use most functionality from OpenGL 3.3 anyway. I don't know how Python GL works; you may need to check the extensions. I got most of it working, except uniform buffers.
>

If the HD3000 supports OpenGL 3.0, why should I use optirun at all, if I am fine with OpenGL 3.0? However, from the experiment above, if I do not run "optirun bash", the Python code simply does not work. Why?

Linda Li
Jan 24, 2013, 4:57:41 PM
On Thursday, January 24, 2013 3:36:45 PM UTC-6, Linda Li wrote:
> On Thursday, January 24, 2013 8:15:20 AM UTC-6, Lars Pensjö wrote:
> If the HD3000 supports OpenGL 3.0, why should I use optirun at all, if I am fine with OpenGL 3.0? However, from the experiment above, if I do not run "optirun bash", the Python code simply does not work. Why?

Sorry, I think I got a bit tangled up and my head was spinning a little.
Actually, I tested it: if I do not run "optirun bash" and just change version 330 to 130, it works.

So what is the advantage of running "optirun bash"?
And does the NVIDIA® NVS™ 4200M (DDR3 512MB) Discrete Graphics with Optimus support OpenGL 3.3?

I used the command:
glxinfo | more
and found:
OpenGL version string: 3.0 Mesa 8.0.4
OpenGL shading language version string: 1.30



Fabien R
Jan 25, 2013, 3:17:01 AM
On 24/01/2013 22:57, Linda Li wrote:
> And does the NVIDIA® NVS™ 4200M (DDR3 512MB) Discrete Graphics with Optimus support OpenGL 3.3?
>
> I used the command:
> glxinfo | more
> and found:
> OpenGL version string: 3.0 Mesa 8.0.4
> OpenGL shading language version string: 1.30
Your Mesa driver does not support OpenGL 3.3.
See http://www.mesa3d.org/faq.html

-
Fabien

Linda Li
Jan 25, 2013, 12:36:42 PM
Thanks. I am just confused:
So it seems we depend on two things: the hardware and the software.

Actually, I also installed PyOpenGL 3.0.2, which includes support for:
OpenGL v1.1 through 4.3

What is its relationship with Mesa?

And does the NVIDIA NVS™ 4200M Discrete Graphics with Optimus support OpenGL 3.3?



On Friday, January 25, 2013 2:17:01 AM UTC-6, Fabien R wrote:
> On 24/01/2013 22:57, Linda Li wrote:
>
> > And does the NVIDIA® NVS™ 4200M (DDR3 512MB) Discrete Graphics with Optimus support OpenGL 3.3?

Linda Li
Jan 25, 2013, 1:40:07 PM
According to the Wikipedia article on OpenGL: "The OpenGL specification describes an abstract API for drawing 2D and 3D graphics."
It seems different OSes have different libraries that implement the API?

And Ubuntu only has the Mesa 3D graphics library to implement OpenGL, and the latest version only supports OpenGL 3.1.

So to run OpenGL apps, there are two factors to consider:
the hardware and its driver + the OS's OpenGL libraries

For Ubuntu, I can only use OpenGL up to version 3.1.

Lars Pensjö
Jan 31, 2013, 4:48:44 AM
I am no expert on this; someone please correct me if I got it wrong.

As far as I know, the Mesa driver is an open-source driver. However, there are also proprietary OpenGL drivers available for Ubuntu. The NVIDIA and AMD drivers support OpenGL 4.2, and there may already be pre-releases available for 4.3.

Linda Li
Jan 31, 2013, 3:53:24 PM
Thanks. So this is exactly my question from post 4 of this thread:
--------------------
I tried the following commands:
optirun bash
python
execfile("Tutoiral1_OpenGLContextPython.py")

# the code is from http://pyopengl.sourceforge.net/context/tutorials/shader_1.xhtml

RuntimeError: ('Shader compile failure (0): 0(3) : error C7533: global variable gl_ModelViewProjectionMatrix is deprecated after version 120\n0(3) : error C7533: global variable gl_Vertex is deprecated after version 120\n', ['#version 330\n void main() {\n gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n }'], GL_VERTEX_SHADER)

If I change 330 to 130, it works.
--------------------
Whether I use "optirun bash" or not, the situation is the same: it seems it supports 130, not 330.

So my question is:
(a) If the NVIDIA® NVS™ 4200M (DDR3 512MB) Discrete Graphics with Optimus supports OpenGL 3.3, then when I use "optirun bash" it should work.
But it does not. Why?

(b) Or maybe the NVIDIA® NVS™ 4200M (DDR3 512MB) Discrete Graphics with Optimus does not support OpenGL 3.3.
But from my Google searches, that does not seem to be the case.

Nobody
Jan 31, 2013, 4:47:10 PM
On Thu, 31 Jan 2013 12:53:24 -0800, Linda Li wrote:

> error C7533: global variable gl_ModelViewProjectionMatrix is deprecated
> after version 120.

> If I change 330 to 130, it works.
> --------------------
> Whether I use "optirun bash" or not, the situation is the same: it seems
> it supports 130, not 330.

No, the situation is that your code is GLSL 1.2, not GLSL 3.3. If the
driver didn't support GLSL 3.3, you'd get a different error message
(something along the lines of "version 330 not supported").

Newer versions of GLSL are not necessarily backward compatible with older
versions, so if you declare your code as being GLSL 3.3 but it uses
features which are no longer present, you will get an error such as the
one above.

The compatibility uniforms (e.g. gl_ModelViewProjectionMatrix) should
still be supported in 3.3 (and even 4.3), provided that you are using a
compatibility profile context. If you're using a core profile context,
you'll have to avoid using deprecated features.
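
For example, a core profile version of your vertex shader has to pass the position and matrix in explicitly, along these lines (I don't use PyOpenGL, and the attribute/uniform names here are my own invention):
- - - - - - - - - - -
VERTEX_SHADER = shaders.compileShader("""#version 330 core
// Explicit inputs replace the removed built-ins:
in vec4 position;        // instead of gl_Vertex
uniform mat4 mvpMatrix;  // instead of gl_ModelViewProjectionMatrix
void main() {
    gl_Position = mvpMatrix * position;
}""", GL_VERTEX_SHADER)
- - - - - - - - - - -
The application then has to supply "position" as a vertex attribute and upload "mvpMatrix" itself (e.g. with glUniformMatrix4fv).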

Linda Li
Feb 1, 2013, 12:03:28 PM
The code is for GLSL 3.3.
As mentioned in my first post, I followed the code in the tutorial (http://pyopengl.sourceforge.net/context/tutorials/shader_1.xhtml)
- - - - - -
VERTEX_SHADER = shaders.compileShader("""#version 330
void main() {
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}""", GL_VERTEX_SHADER)
- - - - - -

I meant, when I change "330" to "130", it works.

Actually, I thought I had found the answer:
running GLSL depends on the hardware, the hardware driver, and the OS's installed OpenGL libraries.

So in one of my posts I made a note of what I learned through online searching:
is it related to Ubuntu using Mesa OpenGL, which currently only supports OpenGL 3.1?

Nobody
Feb 2, 2013, 9:46:48 PM
On Fri, 01 Feb 2013 09:03:28 -0800, Linda Li wrote:

> The code is for GLSL 3.3.

> - - - - - -
> VERTEX_SHADER = shaders.compileShader("""#version 330
> void main() {
> gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
> }""", GL_VERTEX_SHADER)
> - - - - - -

Actually, it's for "3.3 compatibility profile". It won't work in "3.3 core
profile" because both gl_Vertex and gl_ModelViewProjectionMatrix are
only available in the compatibility profile.

> I meant, when I change "330" to "130", it works.

I know. gl_Vertex and gl_ModelViewProjectionMatrix exist (and are not
deprecated) in 1.3, so it's safe to use them with "#version 130".

One factor which I overlooked in my previous reply is that a #version
statement also selects a profile. If a profile isn't specified explicitly,
it defaults to "core" (at least, that's what the standard specifies).

You might try using:

#version 330 compatibility

to select the compatibility profile instead.
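
In your tutorial code that's just a change to the first line of the shader string:
- - - - - - - - - - -
VERTEX_SHADER = shaders.compileShader("""#version 330 compatibility
void main() {
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}""", GL_VERTEX_SHADER)
- - - - - - - - - - -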

Linda Li
Feb 4, 2013, 1:51:43 PM
Thank you.

It seems "compatibility" does not work.
When I changed the .py file as you instructed, the error information is as below:
RuntimeError: ('Shader compile failure (0): 0:1(15): preprocessor error: syntax error, unexpected IDENTIFIER, expecting NEWLINE\n', ['#version 330 compatibility\n void main() {\n gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n }'], GL_VERTEX_SHADER)


And actually I have a question:
Do you think that, since the supported Mesa OpenGL version in Ubuntu is 3.1, all OpenGL programs on Ubuntu can only use up to 3.1, no matter how powerful the graphics card is?

Nobody
Feb 4, 2013, 4:09:57 PM
On Mon, 04 Feb 2013 10:51:43 -0800, Linda Li wrote:

> It seems "compatibility" does not work. When I changed the .py file as you
> instructed, the error information is as below: RuntimeError: ('Shader
> compile failure (0): 0:1(15): preprocessor error: syntax error, unexpected
> IDENTIFIER, expecting NEWLINE\n', ['#version 330 compatibility\n
> void main() {\n gl_Position = gl_ModelViewProjectionMatrix *
> gl_Vertex;\n }'], GL_VERTEX_SHADER)

That appears to be a bug in the GLSL preprocessor. It works fine with an
ATI card on Windows (the only suitable test system I have available), and
the standard is fairly clear on this (the #version directive supports a
profile in all versions from 1.5 onwards).

> And actually I have a question:
> Do you think that, since the supported Mesa OpenGL version in Ubuntu is
> 3.1, all OpenGL programs on Ubuntu can only use up to 3.1, no matter how
> powerful the graphics card is?

The version of libGL.so shouldn't affect GLSL, as it should just pass the
source code directly to the driver. Using functions which don't exist in
3.1 would be a problem, as those functions don't exist in the libGL.so
library.
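
IIRC, PyOpenGL wraps each entry point so that you can truth-test it once a
context exists, which should reveal whether the implementation actually
exposes a given function (treat this as an assumption; check the PyOpenGL
docs):
- - - - - - - - - - -
from OpenGL.GL import glVertexAttribDivisor  # a GL 3.3 entry point

# After a context has been created:
if not glVertexAttribDivisor:
    print "glVertexAttribDivisor is not exposed by this implementation"
- - - - - - - - - - -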

However, if the back-end driver is based upon Mesa (as opposed to a
proprietary driver), that would explain it; the Mesa 9.0 source code
doesn't recognise the profile part of a #version directive:

http://cgit.freedesktop.org/mesa/mesa/tree/src/glsl/glsl_parser.yy?h=9.0#n254

This appears to be fixed in the trunk version (9.1):

http://cgit.freedesktop.org/mesa/mesa/tree/src/glsl/glsl_parser.yy#n263

But IIRC, nVidia provides their own libGL.so (other drivers just provide
the kernel DRM module and the X11 DRI module), so Mesa shouldn't be
involved when using the proprietary nVidia driver.

Linda Li
Feb 4, 2013, 7:03:36 PM

(1)
I apologize. When I tried your suggestion (changing "#version 330" to "#version 330 compatibility"), I did not include the command "optirun bash".

Now I include it, and it works (you are right)!

Thanks a lot!

(2)
So from your explanation, my understanding is:
for GLSL, if we use a proprietary graphics card driver, we do not need to install any OpenGL libraries, since "it should just pass the source code directly to the driver"?

But now I am confused about the difference between graphics card drivers and installed OpenGL libraries.

If we have proprietary graphics card drivers, do we need to install Mesa on Linux?
If so (I think we do), why?


Thanks in advance.

Nobody
Feb 5, 2013, 7:45:39 PM
On Mon, 04 Feb 2013 16:03:36 -0800, Linda Li wrote:

> (2)
> So from your explanation, my understanding is: for GLSL, if we use a
> proprietary graphics card driver, we do not need to install any OpenGL
> libraries, since "it should just pass the source code directly to the
> driver"?
>
> But now I am confused about the difference between graphics card drivers
> and installed OpenGL libraries.
>
> If we have proprietary graphics card drivers, do we need to install Mesa
> on Linux? If so (I think we do), why?

You need libGL.so, as that's what the application talks to, i.e. that's
where OpenGL functions like glUseProgram() are defined.

But libGL is basically just a conduit. It either encodes the commands as
GLX protocol and passes them to the X server, which then passes them to
the driver (indirect rendering), or it passes the commands directly to the
video driver (direct rendering).

Mesa provides both libGL and various open-source video drivers. Mesa's
libGL works with any of Mesa's drivers, as well as for indirect rendering.

Proprietary drivers often include a separate libGL, which may or may not
work with indirect rendering, and almost certainly won't work if you have
multiple video cards from different vendors and want to use them
simultaneously from the same process. Actually, just being able to switch
from one to the other without having to uninstall and reinstall drivers
isn't always straightforward.

On the plus side, proprietary versions of libGL tend to be ABI-compatible
with the Mesa version and each other, so you don't need separate versions
of each OpenGL-based program.
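
If you want to see which libGL a given binary actually loads, ldd will show it, e.g.:

ldd $(which glxgears) | grep libGL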

Linda Li
Feb 6, 2013, 12:22:07 PM
This is very informative and clears up my confusion.
Thanks a lot!