### Homework 11.

Assigned: Mon Nov 22. Due: MON DEC 6, at midnight
1. Exercises on p.218 of the text:
• Exercise 28.1
• Exercise 28.2
• Repeat Exercise 28.2 for the case that A is not symmetric, replacing "tridiagonal" by "Hessenberg"

2. Text, p.223-224, Exercise 29.1 (a Matlab programming assignment to compute the eigenvalues of a real symmetric matrix). Collect everything except the plotting commands inside a single function, say qreig (the "driver" of part (c)), that calls tridiag once and qralg m-1 times. This function should have an input parameter that says whether or not shifts are to be used (this will have to be passed to qralg as well), and it should output, along with the eigenvalues, the quantities needed to draw the "sawtooth" plot. Follow good software engineering practices: include plenty of comments, and make your plots outside the call to qreig.
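As a rough illustration of the driver and deflation logic described above, here is a NumPy sketch (the assignment itself is in Matlab; the function name qr_eig_tridiag and its signature are my own, not the assignment's). It assumes its input is already symmetric tridiagonal, i.e. the tridiag step has been done, and for brevity it uses the simple corner shift rather than the Wilkinson shift:

```python
import numpy as np

def qr_eig_tridiag(T, use_shift=True, tol=1e-12, max_steps=10_000):
    """Sketch of a qreig-style driver loop (illustrative, not the
    assigned Matlab code).  Assumes T is already symmetric tridiagonal,
    i.e. the tridiag step has been done.  Deflates one eigenvalue at a
    time, recording |bottom subdiagonal| for the sawtooth plot."""
    T = np.array(T, dtype=float)
    eigs, sawtooth = [], []
    while T.shape[0] > 1:
        m = T.shape[0]
        for _ in range(max_steps):
            if abs(T[m - 1, m - 2]) < tol:
                break
            # corner (Rayleigh-quotient) shift for brevity; the
            # Wilkinson shift asked for in the text is more robust
            mu = T[m - 1, m - 1] if use_shift else 0.0
            Q, R = np.linalg.qr(T - mu * np.eye(m))
            T = R @ Q + mu * np.eye(m)       # unitary similarity
            sawtooth.append(abs(T[m - 1, m - 2]))
        eigs.append(T[m - 1, m - 1])         # converged: deflate
        T = T[: m - 1, : m - 1]
    eigs.append(T[0, 0])
    return np.array(eigs), sawtooth
```

The sawtooth list returned here is exactly the data for the plot: one |subdiagonal| value per QR step, across all deflation stages.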

Note, regarding the "sawtooth plot": you are being asked to plot the bottom subdiagonal element (in absolute value) as a function of the number of steps of the algorithm. This quantity keeps decreasing (usually) until convergence takes place. Once it has converged, the matrix deflates, and the NEW bottom subdiagonal element is plotted: this starts out bigger and then decreases like the first one did. That's why the graph is "saw-toothed": the plotted quantity jumps up m-2 times, and between jumps it goes down steadily. Using semilogy makes the y-axis logarithmic.

Don't forget part (e), commenting on the convergence rates you observe in your sawtooth plots. Also, compare the result of qreig with Matlab's eig both in terms of accuracy and time. Profile your code to see where the main cost is (type help profile), perhaps for a larger m.

In addition to the real symmetric codes, provide similar codes to compute the triangular factor T in the Schur decomposition QTQ* of a square nonsymmetric complex matrix. Do not worry about making things efficient in the real case (as LAPACK does). The code should be very similar to the code for the real symmetric case except that it does not enforce symmetry. For example, you will now be computing the Hessenberg factorization instead of the tridiagonal factorization, and you should find that the rates of convergence are noticeably different. Choose your own nonsymmetric test matrices (try something else in addition to random matrices).
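For the nonsymmetric code, the first step is the Hessenberg reduction. Here is a NumPy sketch of that reduction by complex Householder reflectors (an illustration only; the name hessenberg_reduce is my own, and your Matlab code would follow the book's formulation):

```python
import numpy as np

def hessenberg_reduce(A):
    """Sketch: reduce a square complex matrix to upper Hessenberg form
    by Householder similarity transformations (the nonsymmetric
    analogue of the tridiagonalization step).  Returns H = Q* A Q."""
    H = np.array(A, dtype=complex)
    m = H.shape[0]
    for k in range(m - 2):
        x = H[k + 1:, k].copy()
        # Householder vector: v = sign(x1)*||x||*e1 + x, normalized
        e1 = np.zeros_like(x)
        e1[0] = 1.0
        sign = x[0] / abs(x[0]) if x[0] != 0 else 1.0
        v = sign * np.linalg.norm(x) * e1 + x
        nv = np.linalg.norm(v)
        if nv == 0:
            continue                 # column already in Hessenberg form
        v = v / nv
        # apply P = I - 2 v v* from the left and the right (similarity)
        H[k + 1:, k:] -= 2.0 * np.outer(v, v.conj() @ H[k + 1:, k:])
        H[:, k + 1:] -= 2.0 * np.outer(H[:, k + 1:] @ v, v.conj())
    return H
```

Because the reflectors are applied as a similarity, H has the same eigenvalues as A, and everything below the first subdiagonal is (numerically) zero.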

Notice you are NOT being asked to compute eigenvectors or Schur vectors. However, it is not much additional effort to accumulate the product of the orthogonal transformations defining the matrix Q in the factorization A = QDQ' or QTQ', where D is diagonal in the symmetric case and T is triangular in the general nonsymmetric case. Note that the columns of Q are eigenvectors in the symmetric case and Schur vectors in the general nonsymmetric case. If you want, you can write your codes so they return Q as well as D or T.
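Accumulating Q is indeed cheap. The sketch below (NumPy, with a hypothetical name qr_iteration_with_Q) shows the idea for the unshifted iteration: multiply the per-step Q factors together, so that A = Q_total T Q_total* holds at every step:

```python
import numpy as np

def qr_iteration_with_Q(A, n_steps=50):
    """Sketch: accumulate the unitary similarity over unshifted QR
    steps, so that A = Q_total @ T @ Q_total* at every step, where T
    is the current iterate."""
    T = np.array(A, dtype=complex)
    m = T.shape[0]
    Q_total = np.eye(m, dtype=complex)
    for _ in range(n_steps):
        Q, R = np.linalg.qr(T)
        T = R @ Q                  # similarity: T <- Q* T Q
        Q_total = Q_total @ Q      # columns converge to Schur vectors
    return T, Q_total
```

With shifts the accumulation is identical, since each shifted step is still a unitary similarity T &lt;- Q* T Q.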

Important: although it is fine to consult with other students, as long as you acknowledge this in writing, you are EACH expected to write your OWN individually designed code. This is not a group project. This is the final major homework assignment, so please take the time to do a good job on it. I may ask you for details on how this code works during the oral final exam.

Here are a couple more things that have come up.

For the QR alg for general A, remember that it is intended for general COMPLEX A. Thus, there is no reason to expect conjugate pairs. You can get a random complex matrix from rand(n)+i*rand(n) or just crand(n).

However, even if the matrix is real, your method with shifts should still work, since the Wilkinson shift should shift toward one conjugate eigenvalue or the other.
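For reference, the Wilkinson shift in the nonsymmetric setting can be sketched as the eigenvalue of the trailing 2-by-2 block closest to the bottom corner entry (a NumPy illustration; in the symmetric case the book gives an equivalent explicit formula):

```python
import numpy as np

def wilkinson_shift(T):
    """Sketch: the eigenvalue of the trailing 2x2 block of T that is
    closest to the bottom corner entry T[m-1, m-1]."""
    a, b = T[-2, -2], T[-2, -1]
    c, d = T[-1, -2], T[-1, -1]
    # eigenvalues of the 2x2 block via the quadratic formula
    tr, det = a + d, a * d - b * c
    disc = np.sqrt(complex(tr * tr - 4 * det))
    lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
    return lam1 if abs(lam1 - d) <= abs(lam2 - d) else lam2
```

For a real matrix whose trailing 2-by-2 block has a complex conjugate eigenvalue pair, this returns one member of the pair, which is exactly why the shifted iteration still works on real input.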

A more clever implementation for real A WOULD take account of conjugate pairs and would use only real arithmetic, but this gets complicated.

Clearly, your unshifted implementation will fail on a real matrix with complex eigenvalues, since the assumption of distinct eigenvalue moduli fails; but we know that the unshifted QR alg can fail whenever this condition is violated, as you see in Ex 28.1. (By the way, in Ex 28.1 it seems to be implicit that the orthogonal matrix is symmetric, e.g. a Householder matrix, since the whole discussion, including Thm 28.4, is about symmetric matrices; however, in class we proved a generalization of this theorem to the nonsymmetric case, and the exact same question applies there.)