Glsl Water Shader

Carletta Azahar

Jul 21, 2024, 4:35:25 PM7/21/24
to ghermompsimpre

Over the weekend, I put together a very basic water simulation with GLSL (the shader language in OpenGL). I'm pretty happy with the results so far. It's amazing how much you can get out of such a small amount of code.


Step one in generating the water you see above is building a mesh for it. I started with a heightmap, set the "sea level" elevation, and kept everything below it. So the vertices of the mesh actually store the water's depth at every point. I don't use this information right now, but it could probably be used to make waves smaller or slower near the coast for more realism.

The mesh gets drawn with a vertex shader, which essentially applies a sum of sine waves to the surface. The height at each point is a function of XY position and time. The number of waves and the amplitude, wavelength, speed, and direction of each wave are configurable parameters.
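A minimal sketch of such a vertex shader might look like the following. All names (the `u_` uniforms, `NUM_WAVES`, the attribute) are illustrative, not the author's actual code, and it assumes a Z-up world:

```glsl
// Sum-of-sines vertex shader sketch. Each wave i contributes
// amplitude * sin(frequency * dot(direction, xy) + time * speed).
const int NUM_WAVES = 4;
uniform float u_time;
uniform float u_amplitude[NUM_WAVES];
uniform float u_frequency[NUM_WAVES];   // 2*pi / wavelength
uniform float u_speed[NUM_WAVES];
uniform vec2  u_direction[NUM_WAVES];   // normalized XY direction
uniform mat4  u_mvp;
attribute vec3 a_position;              // z could store water depth (unused here)

float waveHeight(vec2 p) {
    float h = 0.0;
    for (int i = 0; i < NUM_WAVES; ++i) {
        float phase = dot(u_direction[i], p) * u_frequency[i]
                    + u_time * u_speed[i];
        h += u_amplitude[i] * sin(phase);
    }
    return h;
}

void main() {
    vec3 p = a_position;
    p.z = waveHeight(p.xy);   // assuming Z-up; swap components for Y-up
    gl_Position = u_mvp * vec4(p, 1.0);
}
```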

The beautiful thing about modelling wave height as a closed-form function like this is that you can generate normal vectors with some simple calculus. Below, I have two functions which take the partial derivatives with respect to X and Y, and another which combines them to produce a normal.
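The calculus is straightforward: each term A*sin(f*dot(d,p) + s*t) differentiates in x to A*f*d.x*cos(f*dot(d,p) + s*t), and likewise in y. A hedged sketch of those functions (the names and uniforms are assumptions, matching the sum-of-sines setup described above):

```glsl
// Analytic normals for a sum-of-sines heightfield, Z-up.
const int NUM_WAVES = 4;
uniform float u_time;
uniform float u_amplitude[NUM_WAVES];
uniform float u_frequency[NUM_WAVES];
uniform float u_speed[NUM_WAVES];
uniform vec2  u_direction[NUM_WAVES];

float dH_dx(vec2 p) {
    float d = 0.0;
    for (int i = 0; i < NUM_WAVES; ++i) {
        float phase = dot(u_direction[i], p) * u_frequency[i] + u_time * u_speed[i];
        d += u_amplitude[i] * u_frequency[i] * u_direction[i].x * cos(phase);
    }
    return d;
}

float dH_dy(vec2 p) {
    float d = 0.0;
    for (int i = 0; i < NUM_WAVES; ++i) {
        float phase = dot(u_direction[i], p) * u_frequency[i] + u_time * u_speed[i];
        d += u_amplitude[i] * u_frequency[i] * u_direction[i].y * cos(phase);
    }
    return d;
}

vec3 surfaceNormal(vec2 p) {
    // Tangents (1,0,dH/dx) and (0,1,dH/dy) cross to (-dH/dx, -dH/dy, 1).
    return normalize(vec3(-dH_dx(p), -dH_dy(p), 1.0));
}
```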

The normal vector is left in world space so it can be used for cubemap texture coordinates in the fragment shader. I just reused my skybox textures as a cubemap. That's how I got the reflection you see. I set the alpha value for the water to 0.5 so it's fairly transparent.
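The reflection lookup itself is a couple of lines in the fragment shader. A sketch under assumed names (the varyings and uniforms are illustrative, not the author's code):

```glsl
// Reflect the view direction about the world-space normal and
// sample the skybox cubemap; alpha 0.5 makes the water transparent.
uniform samplerCube u_skybox;
uniform vec3 u_cameraPos;    // world-space camera position
varying vec3 v_worldPos;
varying vec3 v_normal;       // world-space normal from the vertex shader

void main() {
    vec3 viewDir = normalize(v_worldPos - u_cameraPos);
    vec3 r = reflect(viewDir, normalize(v_normal));
    vec4 sky = textureCube(u_skybox, r);
    gl_FragColor = vec4(sky.rgb, 0.5);
}
```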

I have a game that runs on mobile devices (OpenGL ES 2.0) for which I would like to create some sea water using shaders. The plane that will carry the sea water texture has only 4 vertices, and it needs to stay that way for performance reasons. So I think my only option is to simulate the water in the fragment shader. At a conceptual level, how should I do it? I should mention I'm a beginner with the programmable pipeline (I've barely learned the language) and have no idea where to start.

What you can do is adapt that idea (Gerstner waves) and compute a normal for each rendered fragment. Assign a water texture to the quad (without too much baked-in lighting, since you're going to compute colours in the shader anyway). That texture comes with a texcoord set: for each of your 4 vertices you get a uv pair of texture coordinates in the range [0,1]. All in all, you have a surface you can shade in the fragment shader: the [0,1]x[0,1] square described by your uv set. Each interpolated uv pair arriving in the fragment shader can be used to recover a normal vector, just as you would in a vertex shader. This only holds up if the water surface is relatively far from the camera: since you can't actually displace vertices, the illusion comes from light reflecting in different directions as the surface is shaded according to a time variable. It should work, and it should be easier and more convincing than distorting or scrolling a texture through texcoord manipulation as older game effects do.

Implementation details

Consider your water surface to be given as a quad. When you prepare it for shader drawing, you must supply a texture to it (unless you want to colour it solidly). For a bound texture to work, you must associate with each vertex an attribute containing its texture coordinates. This is the so-called uv set and, if there's no tiling involved on your quad, these could be given as A(0,0), B(1,0), C(1,1), D(0,1), where A, B, C, D are the vertices of your quad.

In the vertex shader, compute the position as you normally would and pass the uv attribute through to the fragment shader via a varying variable. Another, simpler option is to pass the object-frame vertex coordinates (the positions prior to multiplication by the world-view-projection matrix). The varying qualifier makes sure the fragment gets an interpolated position on the quad. Let's call this variable xy and pass it to the frag shader as said.
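For the uv variant, the vertex shader is tiny. A GLSL ES 2.0 sketch with assumed attribute/uniform names:

```glsl
// Minimal GLSL ES 2.0 vertex shader for the 4-vertex water quad:
// transform the position as usual, hand the uv to the fragment shader.
attribute vec3 a_position;
attribute vec2 a_texcoord;
uniform mat4 u_mvp;
varying vec2 v_uv;

void main() {
    v_uv = a_texcoord;                        // interpolated per fragment
    gl_Position = u_mvp * vec4(a_position, 1.0);
}
```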

Pass the world transformation matrix as an input uniform, because you'll have to transform the computed normals to align with your light's position. Alternatively, pass the light already transformed into the object's own frame of reference and use the normal directly (which should be faster and more elegant).

xy is a 2D vector, and we need a 3D point on a water surface. Gerstner waves associate a height with each inner point of a flat rectangle. If you then take the union of all those xy points displaced by that height function, you get the wave surface. We can't displace positions in the frag shader, but we can compute normals and use them for other purposes.

The normal you are looking for can be computed from the xy pair by the same rationale as before: take the partial derivatives of the height function with respect to x and y, and combine them into normalize(-dH/dx, -dH/dy, 1). For the waves to be animated, make sure to pass a time instant as an input uniform to the fragment shader; it is required as an input to the H(x,y,t) height function. That should be all. The rest is just plain lighting computation in a frag shader.
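The whole approach above could be sketched as a single fragment shader. This is a hedged illustration, not the answerer's code: the wave constants, texture name, and light uniform are all assumptions, and only one sine term is used for brevity:

```glsl
// Evaluate the analytic normal of H(x,y,t) at the interpolated uv
// and use it for simple diffuse lighting on an undeformed quad.
precision mediump float;
uniform float u_time;
uniform vec3  u_lightDir;       // light direction in the quad's own frame
uniform sampler2D u_waterTex;
varying vec2 v_uv;

void main() {
    // One wave for brevity: H = A*sin(f*dot(d, uv) + s*t)
    const float A = 0.05;
    const float f = 40.0;
    const float s = 3.0;
    const vec2  d = vec2(0.8, 0.6);
    float phase = f * dot(d, v_uv) + s * u_time;
    vec2 grad = A * f * d * cos(phase);       // (dH/dx, dH/dy)
    vec3 n = normalize(vec3(-grad, 1.0));
    float diffuse = max(dot(n, normalize(u_lightDir)), 0.0);
    vec3 base = texture2D(u_waterTex, v_uv).rgb;
    gl_FragColor = vec4(base * (0.3 + 0.7 * diffuse), 1.0);
}
```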

I'm trying to create an OpenGL application with water waves and refraction. I need to either cast rays from the sun and then the camera and figure out where they intersect, or start from the ocean floor and figure out in which direction(s), if any, I have to go to hit the sun or the camera. I'm kind of stuck; can anyone give me an entry point into OpenGL ray casting, or a crash course in the relevant geometry? I don't want the ocean floor to be at a constant depth, and I don't want the water waves to be simple sinusoidal waves.

First things first: the effect you're trying to achieve can be implemented using OpenGL, but it is not a feature of OpenGL. OpenGL by itself is just a sophisticated triangles-to-screen drawing API. You have some input data, and you write a program that performs relatively simple rasterizing drawing operations on that data through the OpenGL API. Shaders give you some leeway here; you can even implement a raytracer in the fragment shader.

In your case that means you must implement some algorithm that generates the picture you intend. For water it must be some kind of raytracer, or a fake refraction method, to get the effect of looking into the water. The caustics require either a full-featured photon mapper, or you can get away with a fake effect based on the 2nd derivative of the water surface.

This demo uses true raytracing (the water surface, the sphere and the pool are raytraced), the caustics are a "fake caustics" effect, based on projecting the 2nd derivative of the water surface heightmap.
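A hedged sketch of that "fake caustics" idea with a single sine wave (all constants are illustrative, not taken from the demo): the Laplacian (2nd derivative) of the heightfield measures curvature, i.e. where the surface focuses or spreads light onto the floor, so remapping it gives a plausible brightness modulation.

```glsl
// Fake caustics from the curvature of a single-wave heightfield.
uniform float u_time;
const float A = 0.002;          // amplitude (illustrative)
const float f = 30.0;           // spatial frequency
const float s = 2.0;            // speed
const vec2  d = vec2(0.8, 0.6); // wave direction

float caustic(vec2 p) {
    // H = A*sin(f*dot(d,p) + s*t)
    // => laplacian = -A*f*f*dot(d,d)*sin(f*dot(d,p) + s*t)
    float phase = f * dot(d, p) + s * u_time;
    float laplacian = -A * f * f * dot(d, d) * sin(phase);
    // Remap curvature to a brightness multiplier; 0.5 is a tuning constant.
    return clamp(1.0 + 0.5 * laplacian, 0.0, 2.0);
}
```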

Reflections are normally achieved by reflecting the camera in the plane of the mirror and rendering to a texture; you can apply distortion and then use the result to texture the water surface. This only works well for small waves.

Technically, everything you perceive is the result of light waves/photons bouncing off surfaces and propagating through media. For the "real deal" you'd have to trace the light directly from the Sun, with each ray following the path:

I also wanted this reflection texture to include the sky. I think this is something you rarely get to see in top-down games and while the water could just show a solid colour I thought being able to see reflected clouds would be far more interesting. (Though this is specific to outdoor scenes of course)

This might not be a big deal if the texture is downsampled to a small resolution. But if performance is an issue, you could also look into optimisations that avoid rendering parts of the reflection texture - maybe via the stencil buffer.

This sky shader/texture is placed on a quad, assigned to the Reflection Layer (so only rendered by that second camera). The quad should be big enough to cover that camera view and parented to it (which helps give a nice parallax effect when the camera moves!)

Since the texture ranges from 0 to 1 (roughly), we Subtract a value of 0.5 to shift the range. When we distort using this later, it offsets in both directions, keeping the overall image centered (rather than shifting towards the top corner). Can also Multiply to adjust the strength for better control of how the result will look.
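The Subtract/Multiply node pair translates to a couple of lines of shader code. A sketch with assumed names (the noise texture and strength uniform are illustrative):

```glsl
// Recentre the roughly 0..1 noise around zero, then scale it to a
// distortion strength, so offsets push in both directions equally.
uniform sampler2D u_noiseTex;
uniform float u_strength;    // e.g. 0.02

vec2 distortion(vec2 uv) {
    vec2 n = texture2D(u_noiseTex, uv).rg;   // roughly 0..1
    return (n - 0.5) * u_strength;           // roughly -strength/2..strength/2
}
```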

You may want to handle the pixelation in the shader by using a Posterize node (equivalent to floor(UV * Steps) / Steps). Though if you already plan to render at lower resolution and use upscaling (i.e. on a 2D Pixel Perfect Camera or manually via another camera & Render Texture) then you can skip the pixelation.
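As noted, the Posterize node is just this one-liner in GLSL:

```glsl
// Snap the uv to a fixed grid so the distortion moves in whole
// "pixels" instead of smoothly - the Posterize node's behaviour.
vec2 posterize(vec2 uv, float steps) {
    return floor(uv * steps) / steps;
}
```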

Add a Texture2D property in the Blackboard where we can assign the Reflection Render Texture set up earlier. (Can set it as the Default under Node Settings and/or apply to material just like any other texture asset. If you use a custom feature to generate the reflection texture instead, would likely uncheck Exposed and use cmd.SetGlobalTexture)

Put the noise into the In port on a Step node, then try adjusting the Edge value input until there is only a few white parts appearing in the preview. Since there are two sets of noise scrolling past each other (and combined additively), they occasionally line up and produce values higher than the Edge.
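In plain GLSL the same sparkle effect might look like this. The scroll directions and texture name are assumptions for illustration:

```glsl
// Two noise samples scrolling in different directions, combined
// additively, then thresholded with step().
uniform sampler2D u_noiseTex;
uniform float u_time;

float sparkle(vec2 uv, float edge) {
    float a = texture2D(u_noiseTex, uv + vec2( 0.05, 0.02) * u_time).r;
    float b = texture2D(u_noiseTex, uv + vec2(-0.03, 0.04) * u_time).r;
    // step() returns 1.0 only where the sum exceeds the edge, which
    // happens occasionally as the two scrolling layers line up.
    return step(edge, a + b);
}
```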

I initially had the water shader (using the above sections) on a quad mesh, but later swapped to applying it to a Tilemap (separate from the rest of the scene, that is - since they need different materials). This is so I could add a rippling effect to the edges of the water.

After importing, drag the texture into a Tile Palette; you can then select tiles and paint on the Tilemap (with the brush tool). You may also be able to automatically paint the correct tiles with a Rule Tile asset, though it looks like a pain to set up.

The shader samples this by using a Texture2D property with the _MainTex reference, and by applying a scrolling Sine wave we create some nice repeating ripples moving out from the edges.
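One possible reading of that in raw GLSL, assuming _MainTex holds a mask that is brighter near the shore (the wave constants are made up for illustration):

```glsl
// Sample the tilemap mask and modulate it with a sine wave whose
// phase depends on the mask value, so rings scroll out from the edges.
uniform sampler2D _MainTex;  // edge mask painted on the tilemap
uniform float u_time;

float ripple(vec2 uv) {
    float edge = texture2D(_MainTex, uv).r;        // 0 open water, 1 at shore
    float wave = sin(edge * 20.0 - u_time * 4.0);  // scrolling rings
    return step(0.9, wave) * edge;                 // thin ripples near edges
}
```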
