Confirming that is almost immediate. We will use the computer below to compute the eigenvalues and eigenvectors of but it’s useful to attempt this at least once by hand. We have
so the eigenvalues are and which are both real, as expected.
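For readers who want to see the same calculation done by machine, here is a minimal SymPy sketch. The matrix `A` below is a stand-in $2\times 2$ Hermitian example with illustrative entries (not the matrix of this discussion), but the steps carry over verbatim.

```python
# A minimal SymPy sketch using a stand-in 2x2 Hermitian matrix
# (illustrative entries, not the matrix from the text).
from sympy import Matrix, I, eye, symbols

lam = symbols('lambda')
A = Matrix([[2, 1 - I],
            [1 + I, 3]])

# Characteristic polynomial det(A - lambda*I), then its roots.
p = (A - lam * eye(2)).det().expand()
print(p)                 # lambda**2 - 5*lambda + 4
print(A.eigenvals())     # {1: 1, 4: 1} -- both eigenvalues real, as expected
```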
Finding eigenvectors can seem trickier than with real numbers, mostly because it is no longer immediately apparent when one row of a matrix is a multiple of another. But we know that the rows of must be parallel for a $2\times 2$ matrix, which lets us proceed nonetheless.
For we have
There are two ways one can proceed from here. We could use row operations to get to the reduced row-echelon form of If we take this approach, we multiply row 1 by and then take times the new row 1 and add it to row 2, to create a zero, and so on.
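For reference, here is how the row-reduction route might look in SymPy, continuing with the stand-in matrix from the sketch above and its hypothetical eigenvalue $\lambda = 1$.

```python
# Row-reduction approach for the stand-in matrix from the earlier sketch,
# using its eigenvalue lambda = 1 (hypothetical example values).
from sympy import Matrix, I, eye

A = Matrix([[2, 1 - I],
            [1 + I, 3]])
M = A - 1 * eye(2)       # the matrix whose kernel we want

# rref() carries out the row operations; the zero second row confirms
# that the kernel is nontrivial.
R, pivots = M.rref()
print(R)                 # Matrix([[1, 1 - I], [0, 0]])
print(pivots)            # (0,)
```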
Easier is to realize that if we haven’t made a mistake calculating our eigenvalues, then the above matrix can’t be invertible, so there must be some nonzero vector in the kernel. If then we must have
when we multiply by the first row of This suggests that we take and to get as our first eigenvector. To make sure we’ve done things correctly, we multiply by the second row of
Success! Now we move onto the second eigenvalue.
For we get
If we attempt to read off the answer like last time, the first row of suggests the vector Checking the second row to confirm, we find:
as before.
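Once again, we can let SymPy do the kernel computation and the check for us; the sketch below uses the same stand-in matrix as before, whose hypothetical eigenvalues are 1 and 4.

```python
# Kernel computation and eigenvector check for the stand-in matrix,
# whose (hypothetical) eigenvalues are 1 and 4.
from sympy import Matrix, I, eye, simplify

A = Matrix([[2, 1 - I],
            [1 + I, 3]])

for lam in (1, 4):
    v = (A - lam * eye(2)).nullspace()[0]   # a basis vector for the kernel
    print(lam, v.T)
    print(simplify(A * v - lam * v).T)      # the zero vector: A v = lam v
```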
Finally, we note that
so the two eigenvectors are orthogonal, as expected. We have
so our orthogonal matrix is
With a bit of effort, we can finally confirm that
as expected.
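To close the loop on the computer-checked version: for the stand-in matrix used in the sketches above, the normalized eigenvectors assemble into a unitary matrix $U$ (the complex analogue of the orthogonal matrix here), and conjugating $A$ by $U$ produces the expected diagonal matrix. The eigenvectors below are the hypothetical ones belonging to that example, not the ones computed above.

```python
# Assembling the (hypothetical) unit eigenvectors of the stand-in matrix
# into a unitary matrix U and confirming that U^* A U is diagonal.
from sympy import Matrix, I, sqrt, simplify

A = Matrix([[2, 1 - I],
            [1 + I, 3]])

v1 = Matrix([1 - I, -1]) / sqrt(3)   # unit eigenvector for lambda = 1
v2 = Matrix([1 - I, 2]) / sqrt(6)    # unit eigenvector for lambda = 4

U = v1.row_join(v2)                  # columns are the unit eigenvectors

print(simplify(v1.H * v2))           # Matrix([[0]]): columns are orthogonal
print(simplify(U.H * U))             # identity: U is unitary
print(simplify(U.H * A * U))         # Matrix([[1, 0], [0, 4]]): diagonal
```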