Section 5.2 The matrix of a linear operator
Recall that a linear transformation from a vector space to itself is referred to as a linear operator. Recall also that two matrices $A$ and $B$ are similar if there exists an invertible matrix $P$ such that $B = P^{-1}AP$, and that similar matrices have a lot of properties in common. In particular, if $A$ is similar to $B$, then $A$ and $B$ have the same trace, determinant, and eigenvalues. One way to understand this is to realize that two matrices are similar if they are representations of the same operator, with respect to different bases.
Since the domain and codomain of a linear operator are the same, we can consider the matrix of the operator with respect to a single ordered basis, used for both the domain and the codomain. This leads to the next definition.
The following result collects several useful properties of the $B$-matrix of an operator. Most of these were already encountered for the matrix of a transformation, although not all were stated formally.
Theorem 5.2.2.
Example 5.2.3.
Solution.
We compute
We now need to write each of these in terms of the basis $B$. We could do this by working out, one polynomial at a time, how to write each of the results above in terms of the polynomials in $B$. Or we can be systematic.
Let $P$ be the matrix whose columns are given by the coefficient representations of the polynomials in $B$ with respect to the standard basis $\{1, x, x^2\}$. For each of the polynomials computed above, we need to solve an equation expressing that polynomial as a linear combination of the polynomials in $B$, for some unknown scalars. But such an equation is equivalent to a system of linear equations in those scalars, which, in turn, is equivalent to a matrix equation of the form $P\vec{x} = \vec{y}$, where $\vec{y}$ is the standard coefficient vector of the polynomial in question and $\vec{x}$ is its unknown coefficient vector with respect to $B$. Thus, $\vec{x} = P^{-1}\vec{y}$.
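For instance, here is a minimal sketch of one such solve, using a made-up coefficient vector $\vec{y}$ (not one of the vectors from this example):

from sympy import Matrix
P = Matrix(3, 3, [1, 0, 2, -1, 1, 0, 0, 3, -1])
y = Matrix([1, 2, 3])  # hypothetical standard coefficient vector, not from the example
x = P**-1*y            # solves P*x = y
x, P*x == y            # the second entry displays as True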
The coefficient representations of the other two polynomials are found in the same way. In fact, we can handle all three at once: if we let $M$ be the matrix whose columns are the standard coefficient vectors of the polynomials computed above, then the columns of $P^{-1}M$ express those polynomials in terms of $B$. Using the computer, we find:
from sympy import Matrix, init_printing
init_printing()
# P: columns are the coefficient vectors of the basis B relative to the standard basis {1, x, x^2}
P = Matrix(3, 3, [1, 0, 2, -1, 1, 0, 0, 3, -1])
# M: columns are the standard coefficient vectors of the polynomials computed above
M = Matrix(3, 3, [1, 0, 2, 0, 4, 1, 1, 0, 2])
P**-1, P**-1*M
That is,
\[ P^{-1} = \frac{1}{7}\begin{bmatrix} 1 & -6 & 2 \\ 1 & 1 & 2 \\ 3 & 3 & -1 \end{bmatrix} \quad\text{and}\quad P^{-1}M = \frac{1}{7}\begin{bmatrix} 3 & -24 & 0 \\ 3 & 4 & 7 \\ 2 & 12 & 7 \end{bmatrix}. \]
The columns of $P^{-1}M$ are the coefficient vectors, with respect to $B$, of the polynomials computed above, so $P^{-1}M$ is the matrix of $T$ with respect to the basis $B$.
Let’s confirm that this works. Suppose we have
Then and we find
On the other hand,
The results agree, but possibly leave us a little confused.
then
As we saw above, this gives us the result, but doesn't shed much light on the problem, unless we have an easy way to write vectors in terms of the basis $B$. Let's revisit the problem. Instead of using the given basis $B$, let's use the standard basis $\{1, x, x^2\}$. We quickly find that, with respect to the standard basis, the matrix of $T$ is
\[ M_0 = \begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 1 \\ 1 & 0 & 0 \end{bmatrix}. \]
Now, recall that multiplying by $P$ converts coefficient vectors with respect to $B$ into coefficient vectors with respect to the standard basis, and that multiplying by $P^{-1}$ does the reverse, so we get
\[ P^{-1}M_0P \]
as the matrix of $T$ with respect to the basis $B$. Now we have a much more efficient method for arriving at this matrix: the matrix $M_0$ is easy to determine, the matrix $P$ is easy to determine, and with the help of the computer, it's easy to compute $P^{-1}M_0P$.
from sympy import Matrix, init_printing
init_printing()
# M0: the matrix of T with respect to the standard basis
M0 = Matrix(3, 3, [1, 0, 0, 1, 1, 1, 1, 0, 0])
# P: the change matrix from the earlier cell
P = Matrix(3, 3, [1, 0, 2, -1, 1, 0, 0, 3, -1])
P**-1*M0*P
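We can also check that this agrees with the matrix $P^{-1}M$ computed earlier. Indeed, the columns of $M$ are exactly $M_0$ applied to the columns of $P$, so the two computations must produce the same matrix:

from sympy import Matrix
P = Matrix(3, 3, [1, 0, 2, -1, 1, 0, 0, 3, -1])
M = Matrix(3, 3, [1, 0, 2, 0, 4, 1, 1, 0, 2])
M0 = Matrix(3, 3, [1, 0, 0, 1, 1, 1, 1, 0, 0])
# M collects the standard coefficient vectors of the images, so M = M0*P.
M == M0*P, P**-1*M0*P == P**-1*M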
Exercise 5.2.4.
The matrix $P$ used in the above examples is known as a change matrix. If the columns of $P$ are the coefficient vectors of the vectors in a basis $B$, written with respect to another basis $D$, then multiplying by $P$ converts the coefficient vector of a vector with respect to $B$ into its coefficient vector with respect to $D$.
In other words, $P$ is the matrix of the identity transformation, where we use the basis $B$ for the domain, and the basis $D$ for the codomain.
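Here is a minimal sketch of this point of view in $\mathbb{R}^3$, using two made-up bases (neither comes from this section), stored as the columns of the matrices MB and MD:

from sympy import Matrix
MB = Matrix([[1, 0, 1], [1, 1, 0], [0, 1, 1]])  # hypothetical basis B (as columns)
MD = Matrix([[1, 1, 0], [0, 1, 1], [0, 0, 1]])  # hypothetical basis D (as columns)
# Each column of the change matrix writes a vector of B in terms of D.
P_change = MD**-1 * MB
# The change matrix converts coefficient vectors with respect to B into
# coefficient vectors with respect to D; the underlying vector is unchanged.
v_B = Matrix([2, -1, 3])           # coordinates of some vector relative to B
v_std = MB * v_B                   # the same vector in standard coordinates
P_change * v_B == MD**-1 * v_std   # displays as True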
Definition 5.2.5.
Theorem 5.2.6.
Exercise 5.2.7.
Prove Theorem 5.2.6.
Hint.
The identity operator does nothing. Convince yourself that computing its matrix, using the basis $B$ for the domain and the basis $D$ for the codomain, amounts to taking the vectors in $B$ and writing them in terms of the vectors in $D$.
Example 5.2.8.
Solution.
Finding this matrix requires us to first write the vectors of one basis in terms of the vectors of the other. However, it's much easier to do this the other way around. We easily find
and by Theorem 5.2.6, we have
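The same pattern is easy to carry out with the computer: compute the change matrix in the easy direction, then invert it. A minimal sketch, using a made-up change matrix Q rather than the one from this example:

from sympy import Matrix
Q = Matrix([[1, 1, 0], [0, 1, 1], [1, 0, 1]])  # hypothetical change matrix (easy direction)
# The change matrix in the opposite direction is the inverse.
Q**-1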
Note that the change matrix notation is useful for linear transformations between different vector spaces as well. Recall Theorem 5.1.6, which gave the result
which seems more intuitive.
The above results give a straightforward procedure for determining the matrix of any operator, with respect to any basis, if we first determine its matrix with respect to the standard basis. The importance of these results is not just their computational simplicity, however. The most important outcome of the above is that if $M_1$ and $M_2$ give the matrix of the same operator with respect to two different bases, then
\[ M_2 = P^{-1}M_1P \]
for the appropriate change matrix $P$, so that the two matrices are similar.
Recall from Theorem 4.1.10 that similar matrices have the same determinant, trace, and eigenvalues. This means that we can unambiguously define the determinant and trace of an operator, and that we can compute eigenvalues of an operator using any matrix representation of that operator.
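For the operator of Example 5.2.3, we can confirm this with the matrices $M_0$ and $P$ from the cells above, writing MB for the matrix with respect to the basis $B$:

from sympy import Matrix
M0 = Matrix(3, 3, [1, 0, 0, 1, 1, 1, 1, 0, 0])   # matrix with respect to the standard basis
P = Matrix(3, 3, [1, 0, 2, -1, 1, 0, 0, 3, -1])  # change matrix
MB = P**-1 * M0 * P                              # matrix with respect to the basis B
# Similar matrices share trace, determinant, and eigenvalues.
M0.trace() == MB.trace(), M0.det() == MB.det(), M0.eigenvals() == MB.eigenvals()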
Exercises
1.
2.