CS 488: Lecture 13 – Texture Mapping
Dear students:
The temptation in computer graphics is to add more realism. We want the light to behave like real light, water to move like real water, and bodies to bend like real bodies. To make 3D shapes look more real, we might consider adding more triangles to give more geometric detail. But more triangles take up more room on the GPU and slow down rendering, so we want to keep triangle counts as low as possible. Instead, we add the appearance of geometric detail by applying textures, which are 2D images pasted onto the 3D surface like wrapping paper. Today we explore how to add textures to our WebGL renderer.
To texture a surface, we must add the following steps to our renderer:
- Upload an image to the GPU.
- Associate vertices of the model with pixels in the texture.
- Look up a fragment’s texture color in the fragment shader.
Loading a Texture
One of the benefits of using JavaScript and WebGL for this course instead of C++ and OpenGL is that we get image parsing routines for free. The browser knows how to load JPEG, PNG, and many other common image formats. We feed the URL of the image we want to load to an instance of Image, which is the structure behind the <img> tag found in HTML. The loading happens in the background, but we can make it sequential using async and await:
async function loadTexture(url) {
  const image = new Image();
  image.src = url;
  await image.decode();
}
After the image is fully loaded, we create a texture object on the GPU and upload the pixels. The code to accomplish this is a hodgepodge of function calls that came out at different stages of OpenGL’s life. Backward compatibility was deemed more important than a clean, coherent API. In modern OpenGL, you can upload many textures to the GPU, but each one that you want to use at the same time needs to be bound to a different texture unit. Each texture has a number of properties, including its dimensionality, size, and pixel format.
This code uploads an image to an arbitrary texture unit and sets its pixel format:
async function loadTexture(url, textureUnit = gl.TEXTURE0) {
  const image = new Image();
  image.src = url;
  await image.decode();
  gl.activeTexture(textureUnit);
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  gl.generateMipmap(gl.TEXTURE_2D);
  return texture;
}
There are many parameters in the texImage2D call that are worth explaining. Let’s walk through them:
- The gl.TEXTURE_2D seems a little redundant given that the function is named texImage2D. But it’s not. This same function is used to upload textures that are part of what’s called a cube map. gl.TEXTURE_2D signifies that this is a plain 2D image.
- The 0 that we send for the second parameter is the mipmap level. Level 0 of a mipmapped texture is the full-resolution image. Level 1 has half the dimensions. Level 2 has a quarter of the dimensions. And so on. Uploading multiple resolutions of an image will give you better performance because fewer texels means more efficient caching. Mipmaps also give you some control over how textures look when they get really small. We could generate our own lower-resolution versions of the image, passing a different level for each call. However, the generateMipmap function will automatically downsample for us.
- The first gl.RGBA is the pixel format that you want the texture to have on the GPU. This particular value means that each pixel has red, green, blue, and alpha channels and that each channel is given a byte of storage.
- The second gl.RGBA is the number of channels in the pixel data that you are uploading. It’s quite reasonable for this to be the same as the preceding internal pixel format.
- The gl.UNSIGNED_BYTE is the size of each channel in the pixel data that you are uploading.
- The image is the source of the image data. One can send in a variety of different sources, including an Image as we have, or a canvas element, or a typed array, as the sketch after this list shows.
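For instance, the typed-array case might look something like the following sketch. The 2×2 pixel values and the name checkerboardPixels are hypothetical, not part of our renderer; note that this overload of texImage2D takes explicit width, height, and border arguments.
// A hedged sketch: upload a tiny 2x2 RGBA checkerboard from a typed array
// instead of an Image. The data and the variable name are made up.
const checkerboardPixels = new Uint8Array([
  0, 0, 0, 255,        255, 255, 255, 255,
  255, 255, 255, 255,   0, 0, 0, 255,
]);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 2, 2, 0, gl.RGBA, gl.UNSIGNED_BYTE, checkerboardPixels);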
The MDN documentation explains these parameters more thoroughly.
In our asynchronous initialization function, we load the texture with this code:
const checkerboard = await loadTexture('checkerboard.png', gl.TEXTURE0);
We only need to hang on to the return value if we plan on modifying or deleting the texture later on.
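For example, if we did keep the handle, we could eventually free the texture’s storage on the GPU:
// Later, once the texture is no longer needed, release its GPU storage.
gl.deleteTexture(checkerboard);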
Mapping Vertices to Texels
Pixels inside textures are called texels, or texture elements. Somehow we need to establish an association between locations on the surface and texels in the texture. Vertices are our only “hooks” for defining data on the surface, which means that we’ll establish the association via vertex attributes. But what is the data for these new attributes?
The image is a 2D array. It can be indexed via row and column indices. To associate vertex n with texel (50, 75), we could use code like this:
const texcoords = [
  // ...
  50, 75,
  // ...
];
attributes.addAttribute('texcoords', nvertices, 2, texcoords);
If we ever change the image’s resolution, that (50, 75) is probably going to need to change too. Instead of passing exact integer indices, we pass in proportions. Suppose the image is 100 pixels wide and 100 pixels tall. Then we’d have these texture coordinates:
const texcoords = [
  // ...
  0.5, 0.75,
  // ...
];
To slap a texture on an indexed quadrilateral, we use these texture coordinates:
const texcoords = [
  0, 0,
  1, 0,
  0, 1,
  1, 1,
];
What about more complex geometry? Uhh, it’s not easy. Generally texture coordinates are assigned to vertices in the 3D modeling program. Modelers call this process UV mapping, using U to refer to the texture’s horizontal axis and V the vertical.
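That said, for simple parametric shapes we can compute coordinates ourselves. Here’s a minimal sketch for a latitude-longitude sphere; the names latitudeCount and longitudeCount are hypothetical and not from our renderer.
// A hedged sketch: derive texture coordinates for a latitude-longitude sphere
// from its two sweep parameters. The loop bounds are made-up names.
const texcoords = [];
for (let ilatitude = 0; ilatitude <= latitudeCount; ilatitude += 1) {
  for (let ilongitude = 0; ilongitude <= longitudeCount; ilongitude += 1) {
    texcoords.push(ilongitude / longitudeCount, ilatitude / latitudeCount);
  }
}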
Looking Up Texture Values
As with some of our other vertex attributes, we receive the texture coordinates in the vertex shader and then pass them off to the fragment shader to get interpolated:
// ...
in vec2 texcoords;
out vec2 ftexcoords;

void main() {
  ftexcoords = texcoords;
}
Before we use the texture coordinates to fish out the color from the texture, we first visualize the coordinates themselves as a color in the fragment shader to make sure they look right:
// ...
in vec2 ftexcoords;

void main() {
  // ...
  fragmentColor = vec4(ftexcoords, 0.0, 1.0);
}
On a quadrilateral with the texture coordinates described in the previous section, we should see black, red, green, and yellow corners.
Only after we’ve confirmed valid coordinates do we use the coordinates to look up the texture color. The texture object is accessible in the fragment shader as a uniform variable of type sampler2D. We retrieve the color using the texture function:
// ...
uniform sampler2D checkerboard;
in vec2 ftexcoords;

void main() {
  // ...
  fragmentColor = texture(checkerboard, ftexcoords);
}
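On the JavaScript side, the sampler uniform needs to be told which texture unit to read from. Here’s a minimal sketch using raw WebGL calls; the variable program is hypothetical, and your shader abstraction may wrap these calls differently:
// Point the checkerboard sampler at texture unit 0, i.e., gl.TEXTURE0.
const checkerboardLocation = gl.getUniformLocation(program, 'checkerboard');
gl.useProgram(program);
gl.uniform1i(checkerboardLocation, 0);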
Out-of-Range Coordinates
Earlier we used proportions in [0, 1] for our texture coordinates. What happens if we use coordinates outside this range? Let’s try these:
const textureCoordinates = [
  0, 0,
  2, 0,
  0, 2,
  2, 2,
];
When we render with these coordinates, the texture repeats. Repeating is the default wrapping behavior, but we can set it explicitly with these calls:
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.REPEAT);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);
There may be occasions where repeating reveals a discontinuity between opposing edges in the texture. Some of our graphics editors have an option for making these textures seamless. We can also sometimes achieve seamlessness by mirroring the texture coordinates as they exceed the [0, 1] range:
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.MIRRORED_REPEAT);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.MIRRORED_REPEAT);
In other situations, we want to stop the texture coordinates from leaving the [0, 1] range. We clamp them so that any coordinate above 1 is forced back down to 1, and any coordinate below 0 is forced back up to 0:
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
When our coordinates exceed [0, 1], clamping produces a smear effect for which it’s hard to imagine a good use. However, clamping is very useful for removing interpolation artifacts that would appear at the edge of a texture with gl.REPEAT.
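Note that texParameteri configures whichever texture is currently bound to gl.TEXTURE_2D, so a natural home for these wrap calls is inside loadTexture while the new texture is still bound. A sketch, with gl.CLAMP_TO_EDGE chosen arbitrarily:
// Inside loadTexture, right after bindTexture. The wrap mode is per texture,
// so pick REPEAT, MIRRORED_REPEAT, or CLAMP_TO_EDGE as each texture needs.
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);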
Interpolation
The texture coordinates are essentially continuous as they are interpolated across the surface of a triangle. But the underlying image is not continuous. It’s made of discrete pixels. If we want to look up a texture color using texture coordinates, we need a scheme for turning the coordinates into row and column indices, which are integers. First we need to apply the coordinates, which are proportions, to the actual image resolution:
vec2 floatIndices = ftexcoords * vec2(textureWidth, textureHeight);
The applied texture coordinates will likely have fractional components. We need integer indices for our discrete texture. There are two options for going from those floats to ints. We could round, perhaps by adding 0.5 and truncating:
ivec2 intIndices = ivec2(int(floatIndices.x + 0.5), int(floatIndices.y + 0.5));
This scheme is called nearest neighbor interpolation. We don’t actually need to write any of this code to perform this interpolation. WebGL will automatically turn proportions into the nearest row and column indices if we set these texture parameters:
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
When we render with nearest neighbor interpolation, we see harsh lines when we view the textured surface.
Our other option is to use the fractions to perform a weighted average of the four surrounding texture colors. This scheme is called bilinear interpolation, and WebGL will perform it automatically if we set these texture parameters:
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
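Conceptually, bilinear interpolation blends the four texels surrounding the sample point by the fractional parts of the indices. Here’s a hedged GLSL sketch of the idea using the built-in texelFetch and textureSize functions; it ignores wrap modes and mipmaps, so it only illustrates what gl.LINEAR computes rather than replacing it:
// A hand-written approximation of what gl.LINEAR does. Edge wrapping and
// mipmapping are ignored; in practice the hardware does all of this for us.
vec4 bilinear(sampler2D image, vec2 coords) {
  vec2 size = vec2(textureSize(image, 0));
  vec2 position = coords * size - 0.5;   // shift so texel centers land on integers
  ivec2 base = ivec2(floor(position));
  vec2 fraction = fract(position);
  vec4 bottomLeft = texelFetch(image, base, 0);
  vec4 bottomRight = texelFetch(image, base + ivec2(1, 0), 0);
  vec4 topLeft = texelFetch(image, base + ivec2(0, 1), 0);
  vec4 topRight = texelFetch(image, base + ivec2(1, 1), 0);
  vec4 bottom = mix(bottomLeft, bottomRight, fraction.x);
  vec4 top = mix(topLeft, topRight, fraction.x);
  return mix(bottom, top, fraction.y);
}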
The minification filter is applied when the texture is zoomed out and we are squeezing more than one texel inside a single pixel. The magnification filter is applied when the texture is zoomed in and a single texel covers more than one pixel.
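Since loadTexture already calls generateMipmap, the minification filter can also blend between mipmap levels. A common choice, offered here as a suggestion rather than a requirement for our renderer, is trilinear minification with bilinear magnification:
// Trilinear minification: bilinear within each mipmap level, then a blend
// between the two nearest levels. Magnification has no mipmap levels to blend.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);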
TODO
Here’s your TODO list:
- Take your wellness day on Friday to do something life-giving. We won’t have lab, and I won’t be around for office hours.
See you next time.
P.S. It’s time for a haiku!
I know this place well
Just like the back of my hand
Where the map’s tattooed