More raytracing

In order to render scenes, we need to model colors and lighting. In a very simple lighting model, we can think of the world as a collection of objects (eg: spheres) and an infinitely far away sky.

Rendering the sky:

If you follow any ray out into the sky, you will find some color. Since the sky is infinitely far away, the ray origin V has no effect on this color. The only thing that matters is the ray direction W. A sky is defined as a mapping from ray direction W to color (r,g,b). For example, a sky which is black (rgb = (0,0,0)) directly below (ie: in the direction (0,-1,0)) and light blue (rgb = (½,½,1)) directly above could be modeled by:

t = ½ + ½ Wy
(r,g,b) = (½t, ½t, t)

The above will cause a transition from (0,0,0) to (½,½,1) as the ray direction varies from (0,-1,0) up to (0,1,0).
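
As a concrete sketch of this mapping in Java (storing directions and colors as double[3] arrays in (x,y,z) and (r,g,b) order; that array convention is an illustrative choice for these examples, not anything required):

// Maps a unit-length ray direction w to a sky color (r,g,b).
static double[] skyColor(double[] w) {
    double t = 0.5 + 0.5 * w[1];              // w[1] is Wy; t runs 0..1 from straight down to straight up
    return new double[] { 0.5 * t, 0.5 * t, t };
}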

As we discussed in class, you can model reflecting objects by creating a new ray every time your ray hits an object, and continuing to follow the path of each new ray until finally some ray escapes out into the sky. A reflecting object can have some color, which means that some of the light will be absorbed by that object rather than reflecting off of it. In the following sections, we describe how to ray trace scenes consisting of reflecting objects.

Computing rays backwards:

For computational reasons, we need to do all our ray calculations backwards, starting from the eye point and tracing a ray out into the scene. Of course this is backwards from the way physics actually works: in the real world, the light originates from the world around us, and ends up in our eyes. The reason we do things backwards is that we only want to look at those rays of light that really make a difference to us as we look at the final picture. By starting backwards from our eye point, and tracing backwards along the light path from the eye point through each pixel in turn, we can guarantee that our computational work will be done only for those rays of light that affect the final result.
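
For example, here is a minimal sketch of generating these backward rays, one per pixel. The pinhole camera at the origin looking down the -z axis with a 90 degree field of view is an illustrative choice, not part of these notes:

static double[] normalize(double[] a) {
    double len = Math.sqrt(a[0]*a[0] + a[1]*a[1] + a[2]*a[2]);
    return new double[] { a[0]/len, a[1]/len, a[2]/len };
}

// Ray origin V and direction W for pixel (i, j) of a width x height image.
static double[][] eyeRay(int i, int j, int width, int height) {
    double x = (2.0 * i - width)  / width;    // roughly -1..1, left to right
    double y = (height - 2.0 * j) / height;   // roughly -1..1, bottom to top (j grows downward)
    double[] v = { 0, 0, 0 };                 // the eye point
    double[] w = normalize(new double[] { x, y, -1 });
    return new double[][] { v, w };           // (origin, direction)
}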

Bouncing a ray off a surface:

Whenever your ray strikes an object, it will hit at some surface point S. In the case of a sphere, you can compute the location of S by starting with the distance-along-ray parameter t and plugging this into the ray equation to get S = V + tW.
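
In code this is just a componentwise multiply-and-add (same double[3] convention as the earlier sketches):

// S = V + t*W, the point a distance t along the ray.
static double[] pointAt(double[] v, double[] w, double t) {
    return new double[] { v[0] + t*w[0], v[1] + t*w[1], v[2] + t*w[2] };
}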

In order to figure out how this ray reflects off of the sphere, you will need the surface normal vector N. As we discussed in class, the surface normal vector is always the unit length (ie: normalized) vector in the direction of the gradient of the implicit function that defines the object surface.

In the case of a sphere, for example, if the sphere has center C and radius r, then its implicit function is given by:

(x - Cx)² + (y - Cy)² + (z - Cz)² - r²

The vector-valued gradient of this function is, up to a constant factor of 2 (which doesn't affect the direction), just:

(x - Cx, y - Cy, z - Cz)

If we plug in S we get the vector S-C. Since S is on the sphere surface, the distance from C to S is just r, so S-C is a vector of length r. This means that the unit length (ie: normalized) vector in this direction is simply N = (S-C)/r.
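
A sketch of this normal computation:

// N = (S - C)/r, the unit surface normal at point s on a sphere
// with center c and radius r.
static double[] sphereNormal(double[] s, double[] c, double r) {
    return new double[] { (s[0]-c[0])/r, (s[1]-c[1])/r, (s[2]-c[2])/r };
}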

Once we have the incoming ray direction W and the surface normal N, we can, as we discussed in class, get the outgoing reflected ray direction R by:

R = W - 2(W ∙ N)N

We can create a new ray that starts from slightly outside the surface and heads into this new reflected direction. The V and W of such a ray are given by something like:

( S + R/1000 , R )
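
A sketch combining the reflection formula with this offset trick:

// R = W - 2(W.N)N, then start the new ray at S + R/1000 so that it
// begins just outside the surface rather than exactly on it.
static double[][] reflectedRay(double[] s, double[] w, double[] n) {
    double d = 2.0 * (w[0]*n[0] + w[1]*n[1] + w[2]*n[2]);       // 2(W.N)
    double[] r = { w[0] - d*n[0], w[1] - d*n[1], w[2] - d*n[2] };
    double[] v = { s[0] + r[0]/1000, s[1] + r[1]/1000, s[2] + r[2]/1000 };
    return new double[][] { v, r };                              // (origin, direction)
}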

We can associate a color with every object, as some (r,g,b) value. We start at the eye point with no absorption (ie: (1,1,1)). Then, every time we make a ray bounce off of an object, we multiply this color vector by that surface's rgb-valued absorption vector. For example, if the first surface we encounter is pink (1,½,½) and the second surface is light blue (½,½,1), then the total light absorption after these two bounces will be their product (½,¼,½).
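
The running absorption is just a componentwise product:

// Multiply the running absorption a by a surface's (r,g,b) color.
static double[] absorb(double[] a, double[] rgb) {
    return new double[] { a[0]*rgb[0], a[1]*rgb[1], a[2]*rgb[2] };
}

Calling absorb(absorb(new double[]{1,1,1}, pink), lightBlue) with pink = (1,½,½) and lightBlue = (½,½,1) reproduces the (½,¼,½) above.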

Eventually, we will produce a reflected ray that does not hit any of our scene's objects. When this happens we compute the sky color by evaluating the sky color function at that last ray's direction W. Since this color represents the total amount of light that was available from the sky in this direction, we multiply this color vector by the final absorption vector. The result is the color that is displayed at that pixel.
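
Putting all of the pieces together, here is a self-contained sketch of the full bounce loop for one ray. The single hard-coded pink sphere, the MAX_BOUNCES safety cap, and the class name are illustrative choices, not part of these notes:

class TraceSketch {
    static final int MAX_BOUNCES = 10;
    static final double[] CENTER = { 0, 0, -5 };    // sphere center C
    static final double   RADIUS = 1;               // sphere radius r
    static final double[] PINK   = { 1, 0.5, 0.5 }; // sphere's absorption color

    static double dot(double[] a, double[] b) {
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    }

    // Sky as described above: black below, light blue above.
    static double[] skyColor(double[] w) {
        double t = 0.5 + 0.5 * w[1];
        return new double[] { 0.5*t, 0.5*t, t };
    }

    // Distance t along ray (v,w) to the sphere, or -1 if the ray misses.
    static double hitSphere(double[] v, double[] w) {
        double[] oc = { v[0]-CENTER[0], v[1]-CENTER[1], v[2]-CENTER[2] };
        double b = 2 * dot(oc, w);
        double c = dot(oc, oc) - RADIUS*RADIUS;
        double disc = b*b - 4*c;                    // w is unit length, so a = 1
        if (disc < 0) return -1;
        double t = (-b - Math.sqrt(disc)) / 2;      // nearer of the two roots
        return t > 0 ? t : -1;
    }

    // Follow a ray, bouncing until it escapes into the sky.
    static double[] trace(double[] v, double[] w) {
        double[] color = { 1, 1, 1 };               // no absorption yet
        for (int bounce = 0; bounce < MAX_BOUNCES; bounce++) {
            double t = hitSphere(v, w);
            if (t < 0) {                            // escaped: multiply by the sky color
                double[] sky = skyColor(w);
                return new double[] { color[0]*sky[0], color[1]*sky[1], color[2]*sky[2] };
            }
            double[] s = { v[0]+t*w[0], v[1]+t*w[1], v[2]+t*w[2] };     // S = V + tW
            double[] n = { (s[0]-CENTER[0])/RADIUS, (s[1]-CENTER[1])/RADIUS,
                           (s[2]-CENTER[2])/RADIUS };                   // N = (S-C)/r
            color = new double[] { color[0]*PINK[0], color[1]*PINK[1], color[2]*PINK[2] };
            double d = 2 * dot(w, n);                                   // R = W - 2(W.N)N
            w = new double[] { w[0]-d*n[0], w[1]-d*n[1], w[2]-d*n[2] };
            v = new double[] { s[0]+w[0]/1000, s[1]+w[1]/1000, s[2]+w[2]/1000 };
        }
        return new double[] { 0, 0, 0 };            // gave up after too many bounces
    }

    public static void main(String[] args) {
        // A ray from the eye straight at the pink sphere: the result is the
        // light-blue sky overhead filtered through one pink bounce, (½,¼,½).
        double[] rgb = trace(new double[]{0,0,0}, new double[]{0,0,-1});
        System.out.println(rgb[0] + " " + rgb[1] + " " + rgb[2]);
    }
}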

Converting RGB from 0..1 to 0..255:

We have been doing all these computations in a color space where 0.0 is black and 1.0 is white. But our Java display substrate expects color values from 0..255. As we discussed in class, we need to perform a final step of replacing (r,g,b) by ((int)(255*r),(int)(255*g),(int)(255*b)). This is the value that will get packed into the image pixel.
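
For example (assuming pixels are packed as 0xRRGGBB integers, as in Java's BufferedImage.TYPE_INT_RGB):

// Convert a color with components in 0..1 to a packed 0..255 pixel.
static int packPixel(double[] rgb) {
    int r = (int)(255 * rgb[0]);
    int g = (int)(255 * rgb[1]);
    int b = (int)(255 * rgb[2]);
    return (r << 16) | (g << 8) | b;
}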