Hi,
In a fragment shader I read from two textures to create a result pixel.
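Roughly, the shader does something like this (a simplified sketch; the sampler/varying names and the way the two samples are combined are placeholders, not my actual code):

```glsl
// Simplified sketch; names are illustrative.
precision mediump float;
uniform sampler2D u_texA;
uniform sampler2D u_texB;
varying vec2 v_uv;

void main() {
  vec4 a = texture2D(u_texA, v_uv);
  vec4 b = texture2D(u_texB, v_uv);
  gl_FragColor = a + b; // combine both samples into the result pixel
}
```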
On rare clients the result pixel is wrong, as if one of the textures were garbled or missing. When a client has this symptom, it happens consistently on that client.
1. It is related to Chrome: it doesn't happen in IE on the same client
2. When I use a single texture it works OK. I would still prefer to use two textures to represent the image, though.
3. We have many laptops of the same model; all work OK except for a single one that shows the symptom
4. I compared the output of chrome://gpu and chrome://flags between a good and a bad client (same hardware) - the results were exactly the same
5. One client started to show this symptom after connecting to a different monitor (it was working fine before)
Any idea why this could happen? Could a monitor affect Chrome WebGL in this way? And how can I detect it beforehand so I can fall back to using a single texture?
The only thing I could think of is to render a test picture using two textures vs. a single texture and compare the results (something like the sketch below), but maybe there is a simpler way to detect or fix the issue.
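For reference, here is roughly what I had in mind for the self-test (a minimal sketch assuming WebGL1; all names are made up, and error checking is omitted for brevity). It draws one pixel through a shader that samples two known 1x1 textures, then reads it back and checks the expected combined color:

```js
// Hypothetical self-test: if red + green doesn't read back as yellow,
// two-texture sampling is broken on this client.
function twoTextureSamplingWorks() {
  const canvas = document.createElement('canvas');
  canvas.width = canvas.height = 1;
  const gl = canvas.getContext('webgl');
  if (!gl) return false;

  const vs = 'attribute vec2 p; void main() { gl_Position = vec4(p, 0.0, 1.0); }';
  const fs =
    'precision mediump float;' +
    'uniform sampler2D tA; uniform sampler2D tB;' +
    'void main() { gl_FragColor = texture2D(tA, vec2(0.5)) + texture2D(tB, vec2(0.5)); }';

  function compile(type, src) {
    const s = gl.createShader(type);
    gl.shaderSource(s, src);
    gl.compileShader(s);
    return s;
  }
  const prog = gl.createProgram();
  gl.attachShader(prog, compile(gl.VERTEX_SHADER, vs));
  gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fs));
  gl.linkProgram(prog);
  gl.useProgram(prog);

  // One triangle large enough to cover the 1x1 viewport.
  const buf = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buf);
  gl.bufferData(gl.ARRAY_BUFFER,
                new Float32Array([-1, -1, 3, -1, -1, 3]), gl.STATIC_DRAW);
  const loc = gl.getAttribLocation(prog, 'p');
  gl.enableVertexAttribArray(loc);
  gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

  // Two 1x1 textures with known colors: red on unit 0, green on unit 1.
  function makeTexture(unit, rgba) {
    const t = gl.createTexture();
    gl.activeTexture(gl.TEXTURE0 + unit);
    gl.bindTexture(gl.TEXTURE_2D, t);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA,
                  gl.UNSIGNED_BYTE, new Uint8Array(rgba));
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  }
  makeTexture(0, [255, 0, 0, 255]);
  makeTexture(1, [0, 255, 0, 255]);
  gl.uniform1i(gl.getUniformLocation(prog, 'tA'), 0);
  gl.uniform1i(gl.getUniformLocation(prog, 'tB'), 1);

  gl.drawArrays(gl.TRIANGLES, 0, 3);

  // Expect yellow only if both samplers delivered their texture.
  const px = new Uint8Array(4);
  gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, px);
  return px[0] === 255 && px[1] === 255 && px[2] === 0;
}
```

The app could run this once at startup and fall back to the single-texture path when it returns false.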
Thanks,
Raanan