Notes for April 23

 

Using bump map texture images

On Tuesday we showed what a bump map texture image looks like. Basically, an offset to the surface normal is encoded as a color, with x,y,z offset components between -1 and +1 encoded as r,g,b values between 0 and 1.

In order to use a bump map texture image we need to create a texture that the fragment shader can use, as well as a sampler index for accessing it. If we are already using TEXTURE0 for an RGB texture, then we need to use TEXTURE0 + 1 for the bump map. This means we will be using two sampler indices, so our uSampler array will be [0, 1].

To actually use the sampled bump map, we first need to convert it from r,g,b values between 0 and 1 back to x,y,z values between -1 and +1. Then we add it to our surface normal and renormalize (see the sketch after this list):

  1. Change the range from 0..1 to -1..+1
  2. normal = renormalize(normal + offset)
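
Here is that sketch: a minimal GLSL fragment shader showing both steps, along with the two samplers described above. The variable names (uSampler, vUV, vNormal) and the stand-in directional light are my assumptions, not necessarily the exact code from class:

    precision mediump float;

    uniform sampler2D uSampler[2];  // [0]: RGB texture on TEXTURE0, [1]: bump map on TEXTURE0 + 1
    varying vec2 vUV;               // texture coordinates from the vertex shader
    varying vec3 vNormal;           // interpolated surface normal

    void main() {
       vec3 color  = texture2D(uSampler[0], vUV).rgb;  // ordinary color texture
       vec3 bump   = texture2D(uSampler[1], vUV).rgb;  // bump map sample, each channel in 0..1

       vec3 offset = 2. * bump - 1.;                   // step 1: 0..1  ->  -1..+1
       vec3 normal = normalize(vNormal + offset);      // step 2: add to the normal and renormalize

       // Use the perturbed normal in whatever lighting you are doing;
       // a single directional light is shown here as a placeholder.
       float diffuse = max(0., dot(normal, normalize(vec3(1.))));
       gl_FragColor = vec4(sqrt(color * diffuse), 1.); // sqrt() = gamma correction for display
    }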

All of the code we created in class on Tuesday is in shader10.zip.

Creating a bump map texture image

We next showed how to create a bump map texture image, given an image whose brightness 0..1 represents height.

The key is to render the image onto a square, using a custom shader. Our custom shader samples the texture image at slight offsets in both u and v. From the differences between those offset samples it creates a 3D vector, which is then converted into a color.

In this case we do not apply gamma correction (that sqrt(color) we are now used to), because this image is not going to be displayed directly. Rather, we want the value 128 in each color component of the bump map texture image to represent a value about halfway along the linear range from 0 to 255 (that is, an offset of roughly zero).
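
Here is a GLSL sketch of such a shader, rendering the height image into a bump map. The names uHeightMap and uDelta are assumptions, and the exact 3D vector we built in class may differ; the important parts are the offset samples in u and v and the absence of the final sqrt():

    precision mediump float;

    uniform sampler2D uHeightMap;   // the height image (brightness 0..1 = height)
    uniform float uDelta;           // size of one small step, e.g. 1. / imageWidth
    varying vec2 vUV;

    void main() {
       // Sample the height at slight offsets in both u and v.
       float uLo = texture2D(uHeightMap, vUV - vec2(uDelta, 0.)).r;
       float uHi = texture2D(uHeightMap, vUV + vec2(uDelta, 0.)).r;
       float vLo = texture2D(uHeightMap, vUV - vec2(0., uDelta)).r;
       float vHi = texture2D(uHeightMap, vUV + vec2(0., uDelta)).r;

       // Build a 3D vector from the height differences. This is the usual
       // height-field formula, proportional to (-dh/du, -dh/dv, 1).
       vec3 n = normalize(vec3(uLo - uHi, vLo - vHi, 2. * uDelta));

       // Encode -1..+1 as a color in 0..1, with no gamma correction.
       gl_FragColor = vec4(.5 * n + .5, 1.);
    }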

You can then capture the resulting image via a screenshot of that portion of the screen (on a MacBook you can use CMD-SHIFT-4). Then you need to resize the image so that both its width and its height are powers of two. This is needed for the mipmapping to work properly when you sample the texture image.

Rendering a bicubic surface patch

If you have stored the 16 values for, say, the X coordinates of a bicubic surface patch's control points as a 4x4 matrix Px, then you should precompute a cubic coefficient matrix:

Cx = BZ * Px * BZT

Similarly, if you have 4x4 matrices Py and Pz storing the Y and Z coordinates of the bicubic surface patch, then you can precompute cubic coefficient matrices:

Cy = BZ * Py * BZT

Cz = BZ * Pz * BZT

As it happens, the transpose BZT of the Bezier matrix is the same as BZ itself, so in this case you wouldn't really need to use the transpose (but you would if this were a Hermite or B-spline matrix).
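
Written out in GLSL, that precomputation might look like the sketch below (the helper name bezierCoefs is made up; in practice you could just as well compute the three coefficient matrices once in JavaScript and pass them to the shader as uniforms):

    // The Bezier basis matrix. GLSL mat4 constructors are filled column by
    // column, but BZ is symmetric, so rows and columns come out the same.
    const mat4 BZ = mat4(-1.,  3., -3., 1.,
                          3., -6.,  3., 0.,
                         -3.,  3.,  0., 0.,
                          1.,  0.,  0., 0.);

    // Cubic coefficient matrix for one coordinate: C = BZ * P * BZT.
    // Since BZT == BZ for the Bezier matrix, we just multiply by BZ twice.
    mat4 bezierCoefs(mat4 P) {
       return BZ * P * BZ;
    }

You would then compute Cx = bezierCoefs(Px), and similarly Cy and Cz.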

Now for any value of (u,v) you can just do the following:

x = dot(U, Cx.transform(V))
y = dot(U, Cy.transform(V))
z = dot(U, Cz.transform(V))

where U = [ u^3, u^2, u, 1 ] and V = [ v^3, v^2, v, 1 ].

So now you know how to compute the 3D point in space corresponding to any given (u,v) within a Bezier bicubic surface patch.
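
As a GLSL-style sketch of that same computation (here the * operator is GLSL's matrix-times-vector product, playing the role of Cx.transform(V) above):

    // Evaluate the patch point at parameters (u, v), given the
    // precomputed cubic coefficient matrices Cx, Cy and Cz.
    vec3 bicubicPoint(float u, float v, mat4 Cx, mat4 Cy, mat4 Cz) {
       vec4 U = vec4(u*u*u, u*u, u, 1.);
       vec4 V = vec4(v*v*v, v*v, v, 1.);
       return vec3(dot(U, Cx * V),
                   dot(U, Cy * V),
                   dot(U, Cz * V));
    }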

If you loop over values of (u,v) between 0 and 1, you can use this method to render a bicubic surface patch as a triangle strip, just as we did for spheres and tori.
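
One way to wire this up (a sketch, not necessarily how the class code does it) is to reuse the kind of (u,v) triangle-strip mesh we built for spheres and tori, pass each vertex's (u,v) to the vertex shader as an attribute, and let the vertex shader place the vertex on the patch. All of the names below (aUV, uCx, uCy, uCz, uMatrix) are assumptions:

    attribute vec2 aUV;       // (u, v) in 0..1, from the triangle-strip mesh
    uniform mat4 uCx;         // precomputed cubic coefficient matrices
    uniform mat4 uCy;
    uniform mat4 uCz;
    uniform mat4 uMatrix;     // combined model/view/projection matrix

    void main() {
       vec4 U = vec4(aUV.x * aUV.x * aUV.x, aUV.x * aUV.x, aUV.x, 1.);
       vec4 V = vec4(aUV.y * aUV.y * aUV.y, aUV.y * aUV.y, aUV.y, 1.);
       vec3 p = vec3(dot(U, uCx * V), dot(U, uCy * V), dot(U, uCz * V));
       gl_Position = uMatrix * vec4(p, 1.);
    }

Evaluating on the CPU in JavaScript and filling a vertex array works just as well; the math is the same either way.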

Homework

Because we are introducing so much new material today, the homework that was going to be due this Thursday, April 25, will instead be due next Tuesday, April 30.