Notes for September 16 class -- Introduction to Ray Tracing
WebGL Cheat Sheet
As I mentioned last week, there is a convenient
compact guide to WebGL.
The last page in particular is useful for writing shaders in OpenGL ES.
Note that OpenGL ES fragment shaders do not allow recursion.
Gamma correction
Displays are adjusted for human perception using a "gamma curve",
since people can perceive a very large range of brightness.
For example, the image to the right shows the input values 0...255 on the horizontal
axis and the resulting actual displayed brightness on the vertical axis.
Various displays differ, but this adjustment is generally approximately x² (the display roughly squares its input).
Since all of our calculations are really summing actual photons of light,
we need to do all our math in linear brightness, and then do a gamma correction
when we are all finished:
Roughly: c → sqrt(c)
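For example, a fragment shader can apply this correction as its very last step.
Here is a minimal sketch, assuming the linear-space result is available in a
uniform named uLinearColor (the name is just for illustration):

   precision mediump float;

   // Assumed for illustration: the linear-space color computed elsewhere.
   uniform vec3 uLinearColor;

   void main() {
      // Do all lighting math in linear brightness; gamma correct only at
      // the very end. Roughly: c -> sqrt(c).
      gl_FragColor = vec4(sqrt(uLinearColor), 1.);
   }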
Ray tracing: Forming a ray
At each pixel, a ray from the origin V = (0,0,0) shoots into the scene, hitting a virtual screen which is on the z = -f plane.
We refer to f as the "focal length" of this virtual camera.
The ray toward any pixel aims at: (x, y, -f), where -1 ≤ x ≤ +1 and -1 ≤ y ≤ +1.
So the ray direction W = normalize( vec3(x, y, -f) ).
In order to render an image of a scene at any pixel,
we need to follow the ray at that pixel and see what object (if any)
the ray encounters first.
In other words, we need the nearest object along the ray, at a point V + W t with t > 0.
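As a concrete sketch in GLSL, assuming the vertex shader passes along a varying
vPos whose x and y range over -1...+1 across the screen, and a uniform uFL holds
the focal length f (both names are just for illustration):

   precision mediump float;
   varying vec3 vPos;      // assumed: interpolated screen position, x and y in -1...+1
   uniform float uFL;      // assumed: the focal length f

   void main() {
      vec3 V = vec3(0., 0., 0.);                        // ray origin at the eye
      vec3 W = normalize(vec3(vPos.x, vPos.y, -uFL));   // unit ray direction toward this pixel
      gl_FragColor = vec4(.5 * W + .5, 1.);             // just visualize the direction for now
   }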
Ray tracing: Defining a sphere
We can describe a sphere in a GLSL vector of length 4:
vec4 sphere = vec4(x,y,z,r);
where (x,y,z) is the center of the sphere, and r is the sphere radius.
As we discussed in class, the components of a vec4 in GLSL can be accessed in one of
two ways:
v.x, v.y, v.z, v.w // when thought of as a 4D point
v.r, v.g, v.b, v.a // when thought of as a color + alpha
So to access the value of your sphere's radius in a fragment shader vec4,
you will want to refer to it as sphere.w (not sphere.r).
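For example, if a sphere is passed into the fragment shader as a uniform (the
uniform name here is just illustrative), its parts would be read like this:

   uniform vec4 uSphere;             // (x, y, z, r): center and radius

   // ... then, inside main():
   vec3  center = uSphere.xyz;       // center of the sphere
   float radius = uSphere.w;         // the radius (uSphere.r would be the x coordinate)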
Ray tracing: Finding where a ray hits a sphere
D = V - sph.xyz // vector from center of sphere to ray origin.
The point of doing this subtraction is that we are then effectively solving
the much easier problem of ray tracing to a sphere at the origin (0,0,0).
So instead of solving the more complex problem of tracing a ray V+Wt to a sphere
centered at sph.xyz,
we are instead solving the equivalent, but much simpler, problem of
tracing the ray D+Wt to a sphere centered at (0,0,0).
(D + W t)² = r²        // find a point along the ray at distance r (= sph.w) from the sphere center.
(D + W t)² - r² = 0    // we need to solve this for t.
Generally, if a and b are vectors, then a • b = ax * bx + ay * by + az * bz.
This "inner product" also equals: |a| * |b| * cos(θ)
Multiplying the terms out, we need to solve the following quadratic equation:
(W • W) t² + 2 (W • D) t + (D • D) - r² = 0
Since ray direction W is unit length, the first term in this equation (W • W) is just 1.
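Putting this together, a minimal GLSL sketch of the intersection test might look
like the function below, assuming V and W are the ray origin and unit direction
and sph = vec4(center, radius). It returns the smallest positive root t when
there is one, and a negative value otherwise:

   float raySphere(vec3 V, vec3 W, vec4 sph) {
      vec3  D = V - sph.xyz;               // re-center: the sphere is now at the origin
      float b = dot(W, D);                 // half of the linear coefficient 2 (W • D)
      float c = dot(D, D) - sph.w * sph.w;
      float d = b * b - c;                 // discriminant / 4, since (W • W) == 1
      if (d < 0.)
         return -1.;                       // no real roots: the ray misses the sphere
      float t = -b - sqrt(d);              // the nearer of the two roots
      if (t < 0.)
         t = -b + sqrt(d);                 // the nearer root is behind the ray origin
      return t;
   }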
Ray tracing: Finding the nearest intersection
Compute the ray's intersection with every sphere in the scene.
The visible sphere at this pixel, if any, is the one with smallest positive t.
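A sketch of that loop, assuming the spheres arrive in a uniform array uSpheres
and using the raySphere() function sketched above (the array name and its size
are just for illustration):

   const int NS = 4;                   // illustrative number of spheres
   uniform vec4 uSpheres[NS];

   // ... inside main(), after forming the ray V, W:
   float tMin   = 10000.;              // stand-in for "no hit yet"
   vec4  sphMin = vec4(0.);            // the nearest sphere found so far
   for (int i = 0 ; i < NS ; i++) {
      float t = raySphere(V, W, uSpheres[i]);
      if (t > 0. && t < tMin) {
         tMin   = t;                   // keep the smallest positive t
         sphMin = uSpheres[i];
      }
   }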
Ray tracing: Computing the surface point
Once we know the value of t, we can just plug it into the ray equation to get the location of the surface point on the sphere that is visible at this ray, as shown in the equation below and in the figure to the right:
S = V + W t
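In GLSL this is a single line, continuing from the tMin found in the loop
sketched above:

   vec3 S = V + tMin * W;              // the surface point visible along this ray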
Ray tracing: Computing the surface normal
We now need to compute the surface normal in order to compute how the sphere interacts with light to produce a final color at this pixel.
The "surface normal" is the unit length vector that is perpendicular to the surface of the sphere -- the "up" direction if you are standing on the surface.
For a sphere, we can get this by subtracting the center of the sphere from the surface point S, and then dividing by the radius of the sphere:
N = (S - sph.xyz) / sph.w
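Continuing the sketch, with sphMin holding the sphere that this ray hit:

   vec3 N = (S - sphMin.xyz) / sphMin.w;   // unit length, since |S - center| equals the radius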
Ray tracing: A simple shader
A simple light source at infinity can be defined as an rgb color, together with an xyz light direction:
vec3 lightColor; // rgb
vec3 lightDirection; // xyz
A simple diffuse surface material can be defined by an rgb "ambient" component,
which is independent of any particular light source, and an rgb "diffuse" component,
which is added for each light source:
vec3 ambient; // rgb
vec3 diffuse; // rgb
The figure to the right shows a diffuse sphere, where the color at
each pixel is computed as:
ambient + lightColor * diffuse * max(0, normal • lightDirection)
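A sketch of that shading step, assuming the light and the material arrive as
uniforms (the names are illustrative) and that uLightDirection is a unit vector
pointing toward the light:

   uniform vec3 uLightColor, uLightDirection;    // the light source
   uniform vec3 uAmbient, uDiffuse;              // the sphere's material

   // ... inside main(), after computing the surface normal N:
   vec3 color = uAmbient
              + uLightColor * uDiffuse * max(0., dot(N, uLightDirection));
   gl_FragColor = vec4(sqrt(color), 1.);         // gamma correct at the very end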
Ray tracing: Multiple light sources
If the scene contains more than one light source, then the pixel color
for points on the sphere can be computed as:
ambient + ∑ₙ ( lightColorₙ * diffuse * max(0, normal • lightDirectionₙ) )
where n iterates over the lights in the scene.
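With several lights, the same sketch just loops over uniform arrays of light
colors and unit light directions (again, the names and the count are illustrative):

   const int NL = 2;                             // illustrative number of lights
   uniform vec3 uLightColor[NL];
   uniform vec3 uLightDirection[NL];

   // ... inside main():
   vec3 color = uAmbient;
   for (int n = 0 ; n < NL ; n++)
      color += uLightColor[n] * uDiffuse * max(0., dot(N, uLightDirection[n]));
   gl_FragColor = vec4(sqrt(color), 1.);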
At the end of the class we saw a video:
The Academy Award winning short Ryan
by Chris Landreth.