Performance with Mesh under Windows 11 extremely slow


Tom Pollok

Oct 19, 2021, 8:20:25 AM
to OpenSceneGraph Users
Dear Robert,

I tried OSG on my new high-performance laptop (4K display, NVIDIA RTX 3080 with 16 GB VRAM, Intel i9-11900, 32 GB RAM) running Windows 11 (release version with the latest updates). I also updated to the latest NVIDIA driver, 496.13, today.

Unfortunately, my OSG performance drops massively when I enable rendering of my textured mesh. I have attached two screenshots showing the drawing performance.

Do you have any idea where this issue could come from?

Best regards,
Tom
No_Mesh.png
with_Mesh.png

Robert Osfield

Oct 19, 2021, 9:27:26 AM
to OpenSceneGraph Users
No, I have no idea, and nothing to go on.

What your "textured mesh" is in your case is likely key, but since you say nothing about what the "textured mesh" entails, I can't speculate about what might be amiss.

Tom Pollok

Oct 19, 2021, 12:17:49 PM
to OpenSceneGraph Users
It's just a large surface geometry with 440K vertices and 870K faces. I removed the texture and I still have the same issue.

I also opened this mesh in MeshLab (release MeshLab-2021.07 from the cnr-isti-vclab/meshlab GitHub repository) and in Microsoft's 3D Viewer, and both rendered it at a high framerate.

Here is the mesh if anybody wants to try it out.

https://drive.google.com/file/d/1yCF0FvnggnNJTMODaKGtbsUsjgL8nX_A/view?usp=sharing

Are there other ways in which I could contribute debug information?

Brad Colbert

Oct 19, 2021, 6:19:41 PM
to osg-...@googlegroups.com
Tom,

Are you sure your laptop isn't a "dual GPU" version and that you aren't running on the non-NVIDIA, non-accelerated chip?

--
You received this message because you are subscribed to the Google Groups "OpenSceneGraph Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to osg-users+...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/osg-users/b24adace-3659-4983-8007-a1fb4a250d5fn%40googlegroups.com.



Tom Pollok

Oct 19, 2021, 7:26:11 PM
to osg-...@googlegroups.com
Hello Brad,

Thanks for your reply. Indeed, the laptop has two graphics cards. Is there a way to see at runtime which card OpenSceneGraph is using? And is there a way to hint to OpenSceneGraph which one to prefer, or would I have to set environment variables before the program starts?

GRAPHICS
  • Discrete: NVIDIA® GeForce RTX™3080 (16GB DDR6 VRAM)
  • Integrated: Intel® UHD Graphics

On 20. Oct 2021, at 00:19, Brad Colbert <bcol...@rscusa.com> wrote:



Tom Pollok

Oct 19, 2021, 7:43:19 PM
to OpenSceneGraph Users
I was able to see in Task Manager that the Intel graphics card was being used, so I disabled the Intel graphics card.

However now my application crashes on startup.

Error: OpenGL version test failed, requires valid graphics context.

As asked above, if anybody knows how to hint that the discrete graphics card should be preferred, I'd be very thankful.

Werner Modenbach

Oct 20, 2021, 6:10:35 AM
to 'Tom Pollok' via OpenSceneGraph Users
Hi Tom,

In the NVIDIA system settings you can permanently assign graphics cards to executables: 3D Settings -> Program Settings.
You should also run your notebook with the power supply connected; otherwise it goes into power-saving mode.
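For completeness: a Windows application can also request the discrete GPU in code by exporting two well-known globals from the executable (they must live in the .exe itself, not a DLL). A minimal sketch; the symbol names are the ones documented by NVIDIA (Optimus) and AMD (PowerXpress):

```cpp
// Hybrid-graphics driver hints, read once at process start.
#ifdef _WIN32
#  define GPU_EXPORT extern "C" __declspec(dllexport)
#else
#  define GPU_EXPORT extern "C"  // no effect off Windows; kept so this compiles anywhere
#endif

// Nonzero asks the NVIDIA Optimus driver to run this process on the discrete GPU.
GPU_EXPORT unsigned long NvOptimusEnablement = 0x00000001;
// Equivalent hint for AMD switchable graphics (PowerXpress).
GPU_EXPORT int AmdPowerXpressRequestHighPerformance = 1;
```

Note that per-application settings in the driver control panel can still override these hints.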

BTW: Did you check your OSG culling by providing geometries with reasonable bounds?

- Werner -

Tom Pollok

Oct 20, 2021, 2:31:28 PM
to OpenSceneGraph Users
Dear Werner,

Thank you for your help. I set the app settings to the NVIDIA 3080 and also set the global setting to use the NVIDIA card as the primary one. But when I run my application, I see activity on both cards while the OSG viewer runs at only 4 fps, so it seems nothing has changed. What I found weird is that the application still crashes, for the same reason as written previously, when I disable the Intel card in the Device Manager.

I didn't check the bounds, as I just use the provided OSG plugin to read the OBJ file. I was expecting at least 30 fps; this geometry doesn't seem like a lot of work for the GPU compared to modern AAA games, which run without issues on high settings. I wish I could simply port to VSG, but my software depends on osgEarth.

Best,
Tom

Werner Modenbach

Oct 21, 2021, 4:25:18 AM
to 'Tom Pollok' via OpenSceneGraph Users
Hi Tom,

there are a lot of options to narrow down the problem:
- Try using a debug context in your viewer and watch the messages on std::cout.
- Use the on-screen stats provided by OSG (press the 's' key in the viewer).
- Start the viewer from NVIDIA Nsight; it has excellent analysis tools.
- Run the OSG optimizer on your scene (a visitor applied at runtime, after loading the scene).
- Print the tree structure of your scene using the visitor OSG provides.

This is just for the beginning ... :-)

- Werner -

Tom Pollok

Oct 21, 2021, 5:49:12 AM
to osg-...@googlegroups.com
Thank you very much Werner. I will check that out. Your help is very much appreciated.

Greetings,
Tom

On 21. Oct 2021, at 10:25, Werner Modenbach <Werner.M...@modenbach-ac.de> wrote:
