I have a program that renders with ANGLE and runs on macOS, and I'd like it to use the integrated GPU instead of the discrete one. I've already tried setting NSSupportsAutomaticGraphicsSwitching to true, but that makes my view disappear, and the app somehow still uses the discrete GPU.
Is there some way to force it onto the integrated GPU when one is available? The discrete GPU seems like overkill for my application.
Specifically, I'm looking for an API or code snippet to use, and/or values that need to be set in the Info.plist file. All I'm using is a class derived from NSView that keeps redrawing itself in UpdateLayer.
My application is written in C# with Xamarin.Mac and an experimental (but functional) AngleSharp wrapper, but code snippets in any relevant language would help. Thanks!
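For reference, this is what the Info.plist entry mentioned above looks like. Note that this key only tells macOS the app can tolerate automatic graphics switching; it does not by itself pin the app to the integrated GPU:

    <key>NSSupportsAutomaticGraphicsSwitching</key>
    <true/>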

Can anyone help me with this?
Here is the log output I'm getting:

    Context (0x7fead682ac00) set to GPU with ID: (4294970113).
    Switch detected flag: 0x00000001, display: 69734406.
    Switch detected flag: 0x00000001, display: 2077749241.
    Switch detected flag: 0x0000211C, display: 69734406.
    Switch detected flag: 0x00002220, display: 2077749241.
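Since NSSupportsAutomaticGraphicsSwitching only permits switching rather than choosing a GPU, the more direct route with ANGLE is its EGL_ANGLE_power_preference extension, which lets you request the low-power GPU when the EGL display is created. A minimal C++ sketch, assuming your ANGLE build and headers expose that extension (the helper name createLowPowerDisplay is just for illustration):

    // Sketch: ask ANGLE for the low-power (integrated) GPU at display creation.
    // Assumes <EGL/eglext.h> comes from ANGLE and defines the enums from the
    // EGL_ANGLE_power_preference and EGL_ANGLE_platform_angle extensions.
    #include <EGL/egl.h>
    #include <EGL/eglext.h>

    EGLDisplay createLowPowerDisplay() {
        // eglGetPlatformDisplayEXT is a client extension entry point
        // (EGL_EXT_platform_base), so it has to be loaded dynamically.
        auto getPlatformDisplay = (PFNEGLGETPLATFORMDISPLAYEXTPROC)
            eglGetProcAddress("eglGetPlatformDisplayEXT");
        const EGLint attribs[] = {
            EGL_POWER_PREFERENCE_ANGLE, EGL_LOW_POWER_ANGLE,
            EGL_NONE,
        };
        return getPlatformDisplay(EGL_PLATFORM_ANGLE_ANGLE,
                                  (void*)EGL_DEFAULT_DISPLAY, attribs);
    }

The same extension also defines entry points for reacting to GPU switches (for example eglHandleGPUSwitchANGLE), which may be relevant to the switch messages above; check the extension spec in the ANGLE source tree for the exact semantics.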
Do you think you could put together a minimal single-file sample for an ANGLE project on macOS, just the code needed to draw a triangle, that definitely uses the integrated GPU, so I can compare it and see what I've done wrong?
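Not a sample from this thread, but here is a sketch of what such a minimal test could look like: untested C++ that renders offscreen to a pbuffer (so none of the NSView/CALayer plumbing is involved) and requests the low-power GPU the same way as the sketch above. On ANGLE the GL_RENDERER string usually names the underlying device, so printing it should reveal which GPU was actually picked:

    // Untested sketch: minimal single-file offscreen ANGLE triangle in C++.
    // Assumes the ANGLE build exposes EGL_ANGLE_platform_angle and
    // EGL_ANGLE_power_preference.
    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <GLES2/gl2.h>
    #include <cstdio>

    int main() {
        // Ask for a display that prefers the low-power (integrated) GPU.
        auto getPlatformDisplay = (PFNEGLGETPLATFORMDISPLAYEXTPROC)
            eglGetProcAddress("eglGetPlatformDisplayEXT");
        const EGLint dispAttribs[] = {
            EGL_POWER_PREFERENCE_ANGLE, EGL_LOW_POWER_ANGLE,
            EGL_NONE,
        };
        EGLDisplay dpy = getPlatformDisplay(
            EGL_PLATFORM_ANGLE_ANGLE, (void*)EGL_DEFAULT_DISPLAY, dispAttribs);
        eglInitialize(dpy, nullptr, nullptr);

        // Pick an RGB config that supports pbuffers and ES2 contexts.
        const EGLint cfgAttribs[] = {
            EGL_SURFACE_TYPE, EGL_PBUFFER_BIT,
            EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
            EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
            EGL_NONE,
        };
        EGLConfig cfg;
        EGLint numCfg = 0;
        eglChooseConfig(dpy, cfgAttribs, &cfg, 1, &numCfg);

        // Offscreen 256x256 surface; no window needed.
        const EGLint pbAttribs[] = { EGL_WIDTH, 256, EGL_HEIGHT, 256, EGL_NONE };
        EGLSurface surf = eglCreatePbufferSurface(dpy, cfg, pbAttribs);
        const EGLint ctxAttribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
        EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctxAttribs);
        eglMakeCurrent(dpy, surf, surf, ctx);

        // This string should name the GPU that was actually selected.
        printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));

        // Trivial shaders for a solid-color triangle.
        const char* vsSrc =
            "attribute vec2 aPos;\n"
            "void main() { gl_Position = vec4(aPos, 0.0, 1.0); }\n";
        const char* fsSrc =
            "precision mediump float;\n"
            "void main() { gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0); }\n";
        GLuint vs = glCreateShader(GL_VERTEX_SHADER);
        glShaderSource(vs, 1, &vsSrc, nullptr);
        glCompileShader(vs);
        GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(fs, 1, &fsSrc, nullptr);
        glCompileShader(fs);
        GLuint prog = glCreateProgram();
        glAttachShader(prog, vs);
        glAttachShader(prog, fs);
        glBindAttribLocation(prog, 0, "aPos");
        glLinkProgram(prog);
        glUseProgram(prog);

        // One triangle in clip space, drawn from client memory.
        const GLfloat verts[] = { -0.5f, -0.5f, 0.5f, -0.5f, 0.0f, 0.5f };
        glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, verts);
        glEnableVertexAttribArray(0);

        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        glDrawArrays(GL_TRIANGLES, 0, 3);

        // Read back the center pixel to confirm the triangle rendered.
        unsigned char px[4] = {};
        glReadPixels(128, 128, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, px);
        printf("center pixel: %d %d %d %d\n", px[0], px[1], px[2], px[3]);

        eglMakeCurrent(dpy, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
        eglDestroySurface(dpy, surf);
        eglDestroyContext(dpy, ctx);
        eglTerminate(dpy);
        return 0;
    }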