# teaching machines

## CS 488: Lab 9 – Water

Welcome to lab, which is a place where you and your peers complete exercises designed to help you learn the ideas discussed in the preceding lectures. It is intended to be a time where you encounter holes in your understanding and talk out loud with your peers and instructor.

Your instructor will bounce around between breakout rooms to check in with you. However, you are encouraged to request assistance if you find your progress blocked.

Designate one of your group to be the host. This individual will be responsible for setting up a Live Share session in Visual Studio Code and submitting your work. No team member should dominate or be expected to carry the group. All members should be writing code and contributing ideas.

### Setup

1. Open Visual Studio Code.
2. Click File / Open Folder, create a new folder, and open it.
3. With the Live Share extension installed, select View / Command Palette, and choose Live Share: Start Collaborative Session.

Non-hosts, join the session.

• Render a scene with a 4-vertex quadrilateral for a ground plane. Set the quadrilateral’s model space coordinates so that it is centered around the origin. Assume its model space and world space coordinates are the same. Color it with just its 2D texture coordinates for the time being. Its corners should be black, red, green, and yellow. Eventually the quadrilateral will be the rippling surface of a lake.
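One way to get that coloring is a fragment shader along these lines; this is just a sketch, and `ftexcoords` is an assumed name for the interpolated texture coordinates:

```glsl
in vec2 ftexcoords;
out vec4 fragmentColor;

void main() {
  // Red comes from s, green from t: (0, 0) is black, (1, 0) is red,
  // (0, 1) is green, and (1, 1) is yellow.
  fragmentColor = vec4(ftexcoords, 0.0, 1.0);
}
```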
• Use a Camera that allows the user to advance and strafe through the scene and look around with the mouse. The quadrilateral should be transformed by the camera’s matrix.
• Surround the scene with a skybox—a box centered around the camera that’s textured with a cubemap. I know this cubemap works. You can use your own box from a previous lab or this one:
```javascript
const positions = [
  -1, -1,  1,
   1, -1,  1,
  -1,  1,  1,
   1,  1,  1,
  -1, -1, -1,
   1, -1, -1,
  -1,  1, -1,
   1,  1, -1,
];

const faces = [
  0, 1, 2,
  1, 3, 2,
  7, 6, 5,
  6, 4, 5,
  1, 5, 3,
  5, 7, 3,
  4, 0, 6,
  0, 2, 6,
  2, 3, 6,
  3, 7, 6,
  4, 5, 0,
  5, 1, 0,
];
```
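The skybox’s fragment shader can then sample the cubemap using the interpolated box-corner position as a direction. A sketch, assuming the vertex shader passes the position along as `fposition` and the cubemap uniform is named `skybox`:

```glsl
uniform samplerCube skybox;

in vec3 fposition;   // interpolated box-corner position, used as a direction
out vec4 fragmentColor;

void main() {
  // Cubemap lookups take a direction, which needn't be normalized.
  fragmentColor = texture(skybox, fposition);
}
```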

• Upload this 3D noise volume to a 3D texture. Its resolution is 32x32x32. Since the data contains only scalar intensities, use a single-channel texture format like this:
```javascript
gl.texImage3D(gl.TEXTURE_3D, 0, gl.R8, width, height, depth, 0, gl.RED, gl.UNSIGNED_BYTE, voxels);
```
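The upload might be wrapped up like the following sketch, which also sets sensible sampling parameters. It assumes `gl` is your WebGL2 rendering context and `voxels` is the `Uint8Array` of 32 × 32 × 32 intensities you fetched; the function name is hypothetical:

```javascript
// Hypothetical helper: upload a 32x32x32 single-channel noise volume.
function uploadNoiseVolume(gl, voxels) {
  const size = 32;
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_3D, texture);
  // Single-byte texels are tightly packed, not padded to 4-byte rows.
  gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);
  gl.texImage3D(gl.TEXTURE_3D, 0, gl.R8, size, size, size, 0, gl.RED, gl.UNSIGNED_BYTE, voxels);
  // REPEAT on all axes lets the third texture coordinate wrap seamlessly.
  gl.texParameteri(gl.TEXTURE_3D, gl.TEXTURE_WRAP_S, gl.REPEAT);
  gl.texParameteri(gl.TEXTURE_3D, gl.TEXTURE_WRAP_T, gl.REPEAT);
  gl.texParameteri(gl.TEXTURE_3D, gl.TEXTURE_WRAP_R, gl.REPEAT);
  // Linear filtering smooths between texels and between slices.
  gl.texParameteri(gl.TEXTURE_3D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_3D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
  return texture;
}
```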

• In lecture, we implemented environment mapping in eye space. In this lab, our normals are going to be computed in world space in the fragment shader, so it will be easier to do the environment mapping in world space too. Tweak the quadrilateral’s vertex shader in the following ways:
• Receive the camera’s world space position as a uniform named eyeWorld. Set the value of this uniform in your render method.
• Calculate an out vector named eyeVector that is the vector from the camera’s world space position to the vertex position. It doesn’t need to be normalized. It will be used in the fragment shader to index into the cubemap texture, and cubemap textures don’t need normalized coordinates.
• Add a uniform for the cubemap texture. Set its value in your render method.
• Add a uniform for the 3D noise texture. Set its value in your render method.
• Add a uniform for the elapsed time. Set its value in your render method with code like this:
```javascript
const rippleSpeed = 0.0001;
waterProgram.setUniform1f('time', performance.now() * rippleSpeed % 2);
```

The time value will span from 0 to 2 and will act as our third texture coordinate. Assemble the full 3D texture coordinates like this:
```glsl
vec3 texcoords = vec3(ftexcoords, time);
```

Adapt ftexcoords to whatever name you used for the quadrilateral’s 2D texture coordinates.
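Putting those vertex shader tweaks together, a minimal sketch might look like this. Names such as `clipFromWorld`, `position`, and `texcoords` are assumptions from a typical setup; adapt them to your own code:

```glsl
uniform mat4 clipFromWorld;  // camera's combined view-projection matrix
uniform vec3 eyeWorld;       // camera position in world space

in vec3 position;            // model space == world space for this quad
in vec2 texcoords;

out vec2 ftexcoords;
out vec3 eyeVector;

void main() {
  gl_Position = clipFromWorld * vec4(position, 1.0);
  ftexcoords = texcoords;
  // From the eye to the vertex; left unnormalized for the cubemap lookup.
  eyeVector = position - eyeWorld;
}
```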
• Use requestAnimationFrame to schedule calls to the render method indefinitely. As render runs again and again, time will climb from 0 toward 2 and wrap back to 0, sweeping us through the planes of the 3D texture.
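The loop above can be sketched as follows; `waterProgram` and the draw calls come from your own code, and the wrap function just makes explicit why time stays in [0, 2):

```javascript
const rippleSpeed = 0.0001;

// Scale milliseconds down and wrap with % so the value stays in [0, 2).
function wrapTime(milliseconds) {
  return milliseconds * rippleSpeed % 2;
}

// Sketch of the loop: update the time uniform, draw, and schedule
// the next frame. waterProgram and the draw calls are your own.
function render() {
  waterProgram.setUniform1f('time', wrapTime(performance.now()));
  // ... bind buffers and draw the quadrilateral and skybox ...
  requestAnimationFrame(render);
}

// Kick off the loop once after setup:
// requestAnimationFrame(render);
```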
• Compute the normal at each fragment by finding vectors that point right and up along the “surface” of the noise texture. To make the vectors tangent to the surface, we need to look at the differences between the neighboring texel values and this fragment’s texel value, like this:
```glsl
float here = texture(noiseVolume, texcoords).r;
float right = textureOffset(noiseVolume, texcoords, ivec3(1, 0, 0)).r;
float above = textureOffset(noiseVolume, texcoords, ivec3(0, 1, 0)).r;
```

With the texel values in hand, we can construct the tangent vectors using what’s called forward differencing:
```glsl
vec3 tangentX = vec3(1.0, right - here, 0.0);
vec3 tangentZ = vec3(0.0, above - here, 1.0);
```

Forward differencing is a cheap approximation of the derivative of the surface. You may see some resemblance to $f(x+h)-f(x)$ from the definition of a derivative. Cross the tangents to get the normal:
```glsl
vec3 normal = normalize(cross(tangentX, tangentZ));
```

This normal is naturally in world space since we haven’t transformed it in any way.
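To see the arithmetic outside the shader, here is the same computation as a plain function; the texel values passed in are made-up numbers, not samples from the real texture:

```javascript
// Forward-differencing normal: same math as the shader, in plain JavaScript.
function surfaceNormal(here, right, above) {
  const tangentX = [1.0, right - here, 0.0];
  const tangentZ = [0.0, above - here, 1.0];
  // cross(tangentX, tangentZ)
  const n = [
    tangentX[1] * tangentZ[2] - tangentX[2] * tangentZ[1],
    tangentX[2] * tangentZ[0] - tangentX[0] * tangentZ[2],
    tangentX[0] * tangentZ[1] - tangentX[1] * tangentZ[0],
  ];
  const length = Math.hypot(n[0], n[1], n[2]);
  return n.map(component => component / length);
}
```

For a perfectly flat patch (here === right === above), this cross order yields (0, -1, 0). If the normal comes out facing away from your surface’s up direction, swap the operands of the cross product, since cross(a, b) = -cross(b, a).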
• Reflect the eye vector about the normal, and use the reflected vector to look up the fragment’s color from the skybox texture.
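In the fragment shader, the reflection lookup might look like this sketch; `skybox` is an assumed name for the cubemap uniform:

```glsl
// GLSL's reflect expects a unit-length normal; the incident vector's
// length doesn't matter since cubemap lookups only use direction.
vec3 reflection = reflect(eyeVector, normal);
fragmentColor = texture(skybox, reflection);
```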
• Submit your index.js on Crowdsource. Enter the eIDs for your team members. If you need to make changes after you’ve already submitted, just reload the page and resubmit. If you haven’t finished by the end of the scheduled lab time, you are free to continue working. However, the submission must be made before the end of the day to receive credit.