Notes for April 16-18

Texture mapping

On Tuesday we showed how to put all of the pieces together to do texture mapping. There are a number of different parts to this:

  • Load the image from a server
  • Create a texture object
  • Build a mipmap
  • Pass the texture to the GPU
  • Add u,v attribute data to each vertex in your geometry
  • Declare a uniform shader variable sampler2D uSampler
  • Declare a vertex attribute vec2 aUV
  • In the vertex shader: Use aUV to define vUV
  • In the fragment shader: Do the texture look-up.
The attached code, shader9.zip, which is the same as the code we developed in class, does all of this; a condensed sketch of the same steps appears below.
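Here is that sketch in JavaScript, assuming a WebGL 1 context gl and an already compiled and linked program. The uniform and attribute names (uSampler, aUV) match the list above; the function name loadTexture and the image file name are just illustrative.

    function loadTexture(gl, url) {
       const texture = gl.createTexture();              // create a texture object
       const image = new Image();
       image.onload = () => {
          gl.bindTexture(gl.TEXTURE_2D, texture);
          gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA,
                        gl.RGBA, gl.UNSIGNED_BYTE, image);  // pass the pixels to the GPU
          // build the mipmap (WebGL 1 needs power-of-two image dimensions)
          gl.generateMipmap(gl.TEXTURE_2D);
       };
       image.src = url;                                 // load the image from the server
       return texture;
    }

    const texture = loadTexture(gl, 'myTexture.png');   // file name is illustrative

    // At draw time: bind the texture to texture unit 0 and point uSampler at it.
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.uniform1i(gl.getUniformLocation(program, 'uSampler'), 0);

    // The u,v data travel alongside the positions in your vertex buffer, are read
    // in the vertex shader as   attribute vec2 aUV;   copied into   varying vec2 vUV;
    // and looked up in the fragment shader with   texture2D(uSampler, vUV).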

We briefly went over Lance Williams' brilliant mipmap algorithm. I will put more details about that in the course notes over the coming days, but you will not be responsible for implementing it for your homework. You can just use the mipmap support that is already built into WebGL.
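For reference, using that built-in support takes just two calls (this assumes the texture object from the sketch above):

    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.generateMipmap(gl.TEXTURE_2D);          // builds the entire pyramid of levels
    gl.texParameteri(gl.TEXTURE_2D,
                     gl.TEXTURE_MIN_FILTER,
                     gl.LINEAR_MIPMAP_LINEAR); // trilinear filtering between levels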

Homework, due Thursday, April 25, before class time

Use texture mapping in your own way. As we discussed in class on Thursday, you can create images in Photoshop or some other paint program, and then use those images as textures.

Following the example I put into the fragment shader, see whether you can animate the texture from within that shader.
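For instance, here is a minimal fragment shader that scrolls the texture over time, assuming a float uniform uTime that your JavaScript updates every frame (the uniform name is an assumption, not necessarily the one used in class):

    precision mediump float;
    uniform float     uTime;      // elapsed seconds, set from JavaScript each frame
    uniform sampler2D uSampler;
    varying vec2      vUV;

    void main() {
       vec2 uv = vUV + vec2(.1 * uTime, 0.);    // slide the look-up point sideways
       gl_FragColor = texture2D(uSampler, uv);  // set wrap mode to REPEAT so it tiles
    }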

Try using the texture for something other than varying the surface color. For example, see whether you can use it to vary the color or power of the specular reflection.
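One possible sketch, assuming a Phong-style fragment shader that already receives an interpolated normal vNormal; here the texture's red channel controls the specular exponent, and the light and eye directions are hard-wired just for illustration:

    precision mediump float;
    uniform sampler2D uSampler;
    varying vec2 vUV;
    varying vec3 vNormal;                              // interpolated surface normal

    void main() {
       vec3  N = normalize(vNormal);
       vec3  L = normalize(vec3(1., 1., 1.));          // fixed light direction
       vec3  E = vec3(0., 0., 1.);                     // eye direction in view space
       float k = texture2D(uSampler, vUV).r;           // 0..1 value from the texture
       float spec = pow(max(dot(reflect(-L, N), E), 0.), 1. + 30. * k);
       gl_FragColor = vec4(vec3(.1) + vec3(spec), 1.); // ambient plus white highlight
    }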

Try using it to vary the surface normal, in order to create various kinds of embossing effects.
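Here is one rough way to start, again only a sketch: treat the texture as a height field, estimate its gradient with two extra look-ups, and tilt the normal by that gradient. This ignores tangent space, so it is only an approximate emboss, and the texture size and bump strength below are assumptions:

    precision mediump float;
    uniform sampler2D uSampler;
    varying vec2 vUV;
    varying vec3 vNormal;

    void main() {
       float e  = 1. / 256.;                                 // one texel, assuming 256x256
       float h  = texture2D(uSampler, vUV).r;                // height at this fragment
       float hx = texture2D(uSampler, vUV + vec2(e, 0.)).r;  // height one texel right
       float hy = texture2D(uSampler, vUV + vec2(0., e)).r;  // height one texel up
       vec3  N  = normalize(normalize(vNormal) - 3. * vec3(hx - h, hy - h, 0.));
       float diffuse = max(dot(N, normalize(vec3(1., 1., 1.))), 0.);
       gl_FragColor = vec4(vec3(.1 + .9 * diffuse), 1.);     // simple diffuse shading
    }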

For extra credit:

  1. Try texture mapping onto an object consisting of a bicubic surface patch (one way to assign the texture coordinates is sketched after this list).

  2. Try texture mapping onto an object consisting of multiple bicubic surface patches that join together to form a smooth surface with a continuous surface normal.
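For the first extra-credit idea, the main trick is that the same (u, v) you use to evaluate the patch can be stored directly as that vertex's texture coordinate. A sketch in JavaScript, assuming a function evalPatch(u, v) (not shown here) that returns the [x, y, z] point of your bicubic patch:

    const N = 20;                                  // samples in each direction
    const vertices = [];
    for (let i = 0; i <= N; i++)
       for (let j = 0; j <= N; j++) {
          const u = i / N, v = j / N;
          const [x, y, z] = evalPatch(u, v);       // position on the bicubic patch
          vertices.push(x, y, z, u, v);            // x,y,z followed by aUV = (u,v)
       }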