Hey there,
Has anyone tried interfacing with an OpenGL context that was initialized using a core profile? I'm targeting the 4.3 core profile, and I've tried 3.3 as well.
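In case it's relevant, this is roughly how the context gets created on Windows via WGL_ARB_create_context (dummy-context and pixel-format boilerplate omitted; wglCreateContextAttribsARB is resolved through wglGetProcAddress as usual, and hdc is the window's device context):

const int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 4,
    WGL_CONTEXT_MINOR_VERSION_ARB, 3,
    WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
    0  // attribute list terminator
};
HGLRC rc = wglCreateContextAttribsARB(hdc, /*shareContext=*/nullptr, attribs);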
I have only tested this on NVIDIA hardware, and I'm seeing GrGLCreateNativeInterface fall apart when it executes the GrGLHasExtensionFromString("GL_NV_*", extString) calls (Skia checks for several NVIDIA-specific GL extensions).
My guess is that since my card does support those extensions, Skia attempts to use them, but the extension-string query it relies on isn't recognized under a core profile, which causes the application to throw an error at run time.
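If that's the cause, the core-profile-safe way to test for an extension is to enumerate extensions one at a time with glGetStringi (GL 3.0+) instead of scanning the single string returned by glGetString(GL_EXTENSIONS), which core profiles no longer accept. A minimal sketch, assuming a loader such as GLEW or glad has already resolved the 3.0+ entry points:

#include <cstring>

// Core-profile-safe extension query: walks the indexed extension list
// rather than the deprecated glGetString(GL_EXTENSIONS) string.
static bool hasExtension(const char* name) {
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i) {
        const char* ext =
            reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
        if (ext && 0 == strcmp(ext, name)) {
            return true;
        }
    }
    return false;
}

e.g. hasExtension("GL_NV_path_rendering") for one of the GL_NV_* checks.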
For the time being my application runs fine without requesting a core profile, but it's something I'd like to enforce if possible. I have temporarily commented out all references to NVIDIA extension initialization in my local Skia checkout, which seems to have done the trick, except now I've come across some other GL run-time errors:
---- glGetError 0x500(Invalid Enum) at
d:\dev\lib\skia\src\gpu\gl\grglcontextinfo.cpp(56) :
GetString(0x1F02)
---- glGetError 0x500(Invalid Enum) at
d:\dev\lib\skia\src\gpu\gl\grglutil.cpp(204) :
GetString(0x8B8C)
---- glGetError 0x500(Invalid Enum) at
d:\dev\lib\skia\src\gpu\gl\grglcontextinfo.cpp(73) :
GetString(0x1F03)
---- glGetError 0x500(Invalid Enum) at
d:\dev\lib\skia\src\gpu\gl\grgpugl.cpp(411) :
Disable(0x0B10)
---- glGetError 0x500(Invalid Enum) at
d:\dev\lib\skia\src\gpu\gl\grgpugl.cpp(414) :
Disable(0x0B42)
---- glGetError 0x500(Invalid Enum) at
d:\dev\lib\skia\src\gpu\gl\grgpugl.cpp(419) :
Disable(0x0BF1)
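Decoding the hex values, most of these look like things the core profile removed: 0x1F03 is GL_EXTENSIONS (the deprecated glGetString query again), and the three Disable calls are GL_POINT_SMOOTH (0x0B10), GL_POLYGON_STIPPLE (0x0B42), and GL_LOGIC_OP (0x0BF1), all legacy fixed-function state. 0x1F02 (GL_VERSION) and 0x8B8C (GL_SHADING_LANGUAGE_VERSION) should still be valid in core, so I'm guessing those two are earlier errors surfacing late. If that reading is right, something along these lines might be needed around the Disable calls flagged in grgpugl.cpp (isCoreProfile is just a placeholder for however the app records which kind of context it created):

// Sketch only: skip fixed-function state that core profiles removed.
// isCoreProfile: hypothetical flag for the context type the app created.
if (!isCoreProfile) {
    glDisable(GL_POINT_SMOOTH);     // 0x0B10, gone in core
    glDisable(GL_POLYGON_STIPPLE);  // 0x0B42, gone in core
    glDisable(GL_LOGIC_OP);         // 0x0BF1, gone in core
}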
Granted, it's late and I haven't had a chance to dig into these in detail. I'll see what I can do in the next couple of days, time permitting.