GPGPU pipeline working on iOS 9, not on OS X 10.11


Emmanuel D'ANGELO

Jan 23, 2016, 10:31:42 AM
to perfoptimi...@lists.apple.com
Hi everyone,

I’m exploring the GPGPU possibilities of Metal, and I have a small test app that creates a compute pipeline and all the required objects, encoders, etc. The pipeline launches a shader that writes some test data (a gradient based on the current pixel position) to a texture, then fetches that data back. The texture is created as 8-bit RGBA and accessed in the shader as floating-point, write-only (using MTLTextureUsageShaderWrite in the texture descriptor).
Everything appears to work: no error is reported at any point, and when I inspected a reflection object, the arguments were correctly bound to the shader argument tables, etc.
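For reference, the setup I describe above looks roughly like the following. This is only a minimal sketch, not my actual code: the kernel name "writeGradient" and the 256×256 texture size are placeholders.

```swift
import Metal

// Sketch of the compute pipeline described above.
// Assumes a Metal library containing a kernel named "writeGradient"
// that fills the texture with a position-based gradient.
let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "writeGradient")!)

// 8-bit RGBA texture, write-only from the shader.
let desc = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .rgba8Unorm, width: 256, height: 256, mipmapped: false)
desc.usage = .shaderWrite
let texture = device.makeTexture(descriptor: desc)!

// Encode and dispatch the compute pass.
let queue = device.makeCommandQueue()!
let cmd = queue.makeCommandBuffer()!
let enc = cmd.makeComputeCommandEncoder()!
enc.setComputePipelineState(pipeline)
enc.setTexture(texture, index: 0)
let threadsPerGroup = MTLSize(width: 8, height: 8, depth: 1)
let groups = MTLSize(width: 256 / 8, height: 256 / 8, depth: 1)
enc.dispatchThreadgroups(groups, threadsPerThreadgroup: threadsPerGroup)
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()

// Fetch the result back on the CPU.
var pixels = [UInt8](repeating: 0, count: 256 * 256 * 4)
texture.getBytes(&pixels, bytesPerRow: 256 * 4,
                 from: MTLRegionMake2D(0, 0, 256, 256), mipmapLevel: 0)
```

On iOS this readback gives me the expected gradient; on OS X it is all zeros.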

However, while this test app works fine on iOS, on OS X I consistently get an all-zero output, and I have no clue what’s going on or how to investigate further.
Am I missing something OS X specific? Is there any way I could debug or investigate the issue?

On the hardware side, I’m using a MacBook Pro (Early 2013), and I see the same issue on both the integrated Intel HD 4000 and the NVIDIA GT 650M. I’m running OS X 10.11.3 now, but I already had the same issue with 10.11.2.

Any help would be greatly appreciated!

Best regards,

Emmanuel
_______________________________________________
Do not post admin requests to the list. They will be ignored.
PerfOptimization-dev mailing list (PerfOptimi...@lists.apple.com)
