Write a Java method that
implements a camera view transform.
You should be able to pass it an eye point E and
an aim point A (each as a Vector3D object),
and the method should
return (or else fill in the value of) a Matrix3D object
which transforms points so that:
- the eye point E is transformed to the origin, and
- the aim point A is transformed to lie along the negative z axis.
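For the sketches further down, the course's Vector3D and Matrix3D classes are stood in for by the following purely hypothetical minimal versions (public x, y, z fields and a row-major 4x4 double array named m); the real class interfaces may well differ.

// Hypothetical stand-ins for the course-provided classes; the real APIs may differ.
class Vector3D {
    public double x, y, z;
    public Vector3D(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
}

class Matrix3D {
    public double[][] m = new double[4][4];   // row-major 4x4 homogeneous matrix
}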
As we went over in class, the math for
this consists of two stages:
- Create a matrix whose 3x3 rotation part is orthonormal, which translates the origin (0,0,0) to E and rotates the z direction to align with the vector from A to E.
- Invert the above matrix.
You can do the first step above by calculating (where Y is the world's up direction, (0, 1, 0)):
Z' = normalize(E - A)
X' = normalize(Y × Z')
Y' = normalize(Z' × X')
in order to create the matrix:
X'x | Y'x | Z'x | Ex
X'y | Y'y | Z'y | Ey
X'z | Y'z | Z'z | Ez
 0  |  0  |  0  |  1
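As a rough sketch of this first stage, using the hypothetical Vector3D and Matrix3D stand-ins from above, plus small local helpers for subtract, cross product, and normalize:

class CameraToWorld {
    static Vector3D sub(Vector3D a, Vector3D b) { return new Vector3D(a.x - b.x, a.y - b.y, a.z - b.z); }

    static Vector3D cross(Vector3D a, Vector3D b) {
        return new Vector3D(a.y * b.z - a.z * b.y,
                            a.z * b.x - a.x * b.z,
                            a.x * b.y - a.y * b.x);
    }

    static Vector3D normalize(Vector3D a) {
        double len = Math.sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
        return new Vector3D(a.x / len, a.y / len, a.z / len);
    }

    // Builds the matrix shown above: it carries the origin to E and aligns
    // the z direction with Z' = normalize(E - A).
    static Matrix3D build(Vector3D E, Vector3D A) {
        Vector3D up = new Vector3D(0, 1, 0);      // world Y, taken as the up direction
        Vector3D Zp = normalize(sub(E, A));       // Z' = normalize(E - A)
        Vector3D Xp = normalize(cross(up, Zp));   // X' = normalize(Y x Z')
        Vector3D Yp = normalize(cross(Zp, Xp));   // Y' = normalize(Z' x X')

        Matrix3D M = new Matrix3D();
        M.m = new double[][] {
            { Xp.x, Yp.x, Zp.x, E.x },
            { Xp.y, Yp.y, Zp.y, E.y },
            { Xp.z, Yp.z, Zp.z, E.z },
            {    0,    0,    0,   1 }
        };
        return M;
    }
}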
Because the 3x3 part of the matrix shown above is orthonormal, its inverse is simply its transpose, and the translation column becomes the negated dot products of E with the new axes. The inverse is therefore:
X'x | X'y | X'z | -(E · X')
Y'x | Y'y | Y'z | -(E · Y')
Z'x | Z'y | Z'z | -(E · Z')
 0  |  0  |  0  |     1
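Putting the two stages together, here is a minimal sketch of the finished method, still under the hypothetical Vector3D and Matrix3D assumptions above. Rather than inverting numerically, it fills in the closed-form inverse directly:

class Camera {
    static Vector3D sub(Vector3D a, Vector3D b) { return new Vector3D(a.x - b.x, a.y - b.y, a.z - b.z); }

    static Vector3D cross(Vector3D a, Vector3D b) {
        return new Vector3D(a.y * b.z - a.z * b.y,
                            a.z * b.x - a.x * b.z,
                            a.x * b.y - a.y * b.x);
    }

    static Vector3D normalize(Vector3D a) {
        double len = Math.sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
        return new Vector3D(a.x / len, a.y / len, a.z / len);
    }

    static double dot(Vector3D a, Vector3D b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // Returns the view matrix: E maps to the origin and A ends up on the negative z axis.
    public static Matrix3D viewMatrix(Vector3D E, Vector3D A) {
        Vector3D up = new Vector3D(0, 1, 0);      // world Y, taken as the up direction
        Vector3D Zp = normalize(sub(E, A));       // Z' = normalize(E - A)
        Vector3D Xp = normalize(cross(up, Zp));   // X' = normalize(Y x Z')
        Vector3D Yp = normalize(cross(Zp, Xp));   // Y' = normalize(Z' x X')

        Matrix3D M = new Matrix3D();
        M.m = new double[][] {
            { Xp.x, Xp.y, Xp.z, -dot(E, Xp) },
            { Yp.x, Yp.y, Yp.z, -dot(E, Yp) },
            { Zp.x, Zp.y, Zp.z, -dot(E, Zp) },
            {    0,    0,    0,           1 }
        };
        return M;
    }
}

As a quick sanity check, applying this matrix to E (as a point, with homogeneous coordinate 1) should give (0, 0, 0), and applying it to A should give a point of the form (0, 0, -d), where d is the distance from E to A. Note that if E - A happens to be parallel to the up vector, the cross product degenerates and a different up direction would have to be chosen.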