We ended the last section with an important result. Exercise 2.2.17 showed that the existence of an injective linear map $T:V\to W$ is equivalent to $\dim V\leq \dim W$, and that the existence of a surjective linear map is equivalent to $\dim V\geq \dim W$. It's probably not surprising, then, that the existence of a bijective linear map is equivalent to $\dim V=\dim W$.
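To make this concrete, here is a small computational check (a sketch in Python with SymPy; the matrices are illustrative choices of mine, not taken from the text). A $3\times 2$ matrix defines a map $\mathbb{R}^2\to\mathbb{R}^3$ that is injective exactly when its rank is $2$, and a $2\times 3$ matrix defines a map $\mathbb{R}^3\to\mathbb{R}^2$ that is surjective exactly when its rank is $2$.

from sympy import Matrix

# A: R^2 -> R^3. Injective iff its columns are independent (rank 2, trivial null space).
A = Matrix([[1, 0],
            [0, 1],
            [2, 3]])
print(A.rank(), A.nullspace())   # 2, []  -> injective; note dim R^2 <= dim R^3

# B: R^3 -> R^2. Surjective iff its columns span R^2 (rank 2).
B = Matrix([[1, 0, 2],
            [0, 1, 3]])
print(B.rank())                  # 2 -> surjective; note dim R^3 >= dim R^2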
In a certain sense that we will now try to make precise, vector spaces of the same dimension are equivalent: they may look very different, but in fact, they contain exactly the same information, presented in different ways.
For any finite-dimensional vector spaces $V$ and $W$, there exists an isomorphism $T:V\to W$ if and only if $\dim V=\dim W$.
Strategy.
We again need to prove both directions of an “if and only if”. If an isomorphism exists, can you see how to use Exercise 2.2.17 to show the dimensions are equal?
If the dimensions are equal, you need to construct an isomorphism. Since $V$ and $W$ are finite-dimensional, you can choose a basis for each space. What can you say about the sizes of these bases? How can you use them to define a linear transformation? (You might want to remind yourself what Theorem 2.1.8 says.)
Proof.
If $T:V\to W$ is a bijection, then it is both injective and surjective. Since $T$ is injective, $\dim V\leq \dim W$, by Exercise 2.2.17. By this same exercise, since $T$ is surjective, we must have $\dim V\geq \dim W$. It follows that $\dim V=\dim W$.
Suppose now that $\dim V=\dim W=n$. Then we can choose bases $\{\mathbf{e}_1,\ldots,\mathbf{e}_n\}$ of $V$, and $\{\mathbf{f}_1,\ldots,\mathbf{f}_n\}$ of $W$. Theorem 2.1.8 then guarantees the existence of a linear map $T:V\to W$ such that $T(\mathbf{e}_i)=\mathbf{f}_i$ for each $i=1,\ldots,n$. Repeating the arguments of Exercise 2.2.17 shows that $T$ is a bijection.
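To illustrate the second half of the proof (a sketch in Python with SymPy; the bases below are my own choices): take $V=W=\mathbb{R}^3$, let $\mathbf{e}_1,\mathbf{e}_2,\mathbf{e}_3$ be the standard basis, and let $\mathbf{f}_1,\mathbf{f}_2,\mathbf{f}_3$ be another basis. The map defined by $T(\mathbf{e}_i)=\mathbf{f}_i$ has the $\mathbf{f}_i$ as the columns of its matrix, and that matrix is invertible, so $T$ is a bijection.

from sympy import Matrix

# Columns are f_1, f_2, f_3: the images of the standard basis vectors e_1, e_2, e_3.
f1, f2, f3 = Matrix([1, 1, 0]), Matrix([0, 1, 1]), Matrix([1, 0, 1])
T = Matrix.hstack(f1, f2, f3)

print(T.det())    # 2 (nonzero), so T is invertible
print(T.rank())   # 3: the f_i form a basis, and T is a bijection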
Buried in the theorem above is the following useful fact: an isomorphism $T:V\to W$ takes any basis of $V$ to a basis of $W$. Another remarkable result of the above theorem is that any two vector spaces of the same dimension are isomorphic! In particular, we have the following theorem.
Theorem 2.3.3 is a direct consequence of Theorem 2.3.2. But it’s useful to understand how it works in practice. Note that in the definition below, we use the term ordered basis. This just means that we fix the order in which the vectors in our basis are written.
Note that this is a well-defined map, since every vector in $V$ can be written uniquely in terms of the basis $B$. But also note that the ordering of the vectors in $B$ is important: changing the order changes the position of the coefficients in the resulting coefficient vector.
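For instance (a sketch in Python with SymPy; the basis is an arbitrary choice of mine): with the ordered basis $B=\{(1,1),(1,-1)\}$ of $\mathbb{R}^2$, the vector $(3,1)$ has coefficient vector $(2,1)$, since $(3,1)=2(1,1)+1(1,-1)$. Reversing the order of the basis vectors swaps the coefficients.

from sympy import Matrix

v = Matrix([3, 1])

# Basis vectors as columns; solving P*c = v gives the coefficient vector of v.
P = Matrix.hstack(Matrix([1, 1]), Matrix([1, -1]))
print(P.inv() * v)    # Matrix([[2], [1]])

# Same basis vectors, opposite order: the coefficients change position.
Q = Matrix.hstack(Matrix([1, -1]), Matrix([1, 1]))
print(Q.inv() * v)    # Matrix([[1], [2]])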
The coefficient isomorphism is especially useful when we want to analyze a linear map computationally. Suppose we're given $T:V\to W$, where $V$ and $W$ are finite-dimensional. Let us choose bases $B=\{\mathbf{e}_1,\ldots,\mathbf{e}_n\}$ of $V$ and $C=\{\mathbf{f}_1,\ldots,\mathbf{f}_m\}$ of $W$. The choice of these two bases determines scalars $a_{ij}$, for $1\leq i\leq m$ and $1\leq j\leq n$, such that
$T(\mathbf{e}_j)=a_{1j}\mathbf{f}_1+a_{2j}\mathbf{f}_2+\cdots+a_{mj}\mathbf{f}_m, \quad \text{for } j=1,\ldots,n.$
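As an illustration (a sketch in Python with SymPy; the example, differentiation of polynomials, is my own choice and not taken from the text): take $T=\frac{d}{dx}$ from the polynomials of degree at most $3$ to those of degree at most $2$, with bases $B=\{1,x,x^2,x^3\}$ and $C=\{1,x,x^2\}$. The scalars $a_{ij}$ are the $C$-coefficients of $T(\mathbf{e}_j)$, and collecting them column by column produces a matrix representing $T$.

from sympy import Matrix, S, diff, symbols

x = symbols('x')
B = [S(1), x, x**2, x**3]   # ordered basis of the domain
C = [S(1), x, x**2]         # ordered basis of the codomain

def coeffs(p):
    # C-coefficients of a polynomial of degree at most 2
    return [p.coeff(x, k) for k in range(len(C))]

# Column j holds the C-coefficients of T(e_j) = d/dx (e_j).
columns = [Matrix(coeffs(diff(b, x))) for b in B]
A = Matrix.hstack(*columns)
print(A)   # Matrix([[0, 1, 0, 0], [0, 0, 2, 0], [0, 0, 0, 3]])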
Recall that for any function $f:A\to B$, if $f$ is a bijection, then it has an inverse: a function $f^{-1}:B\to A$ that "undoes" the action of $f$. That is, if $f(a)=b$, then $f^{-1}(b)=a$; in other words, $f^{-1}(f(a))=a$, so the composition $f^{-1}\circ f$ is equal to the identity function on $A$.
The same is true for composition in the other order: $f\circ f^{-1}$ is the identity function on $B$. One way of interpreting this is to observe that just as $f^{-1}$ is the inverse of $f$, so $f$ is the inverse of $f^{-1}$; that is, $(f^{-1})^{-1}=f$.
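A quick symbolic check (a sketch in Python with SymPy; the function $f(x)=2x+1$ is just an arbitrary bijection of $\mathbb{R}$):

from sympy import symbols, simplify

x, y = symbols('x y')
f = 2*x + 1          # f(x) = 2x + 1, a bijection from R to R
finv = (y - 1) / 2   # its inverse, f^{-1}(y) = (y - 1)/2

print(simplify(finv.subs(y, f)))   # x: f^{-1}(f(x)) = x
print(simplify(f.subs(x, finv)))   # y: f(f^{-1}(y)) = y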
Since linear transformations are a special type of function, the above is true for a linear transformation as well. But if we want to keep everything under the umbrella of linear algebra, there are two things we should check: that the composition of two linear transformations is another linear transformation, and that the inverse of a linear transformation is a linear transformation.
With this connection between linear maps (in general) and matrices, it can be worthwhile to pause and consider invertibility in the context of matrices. Recall that an $n\times n$ matrix $A$ is invertible if there exists a matrix $B$ such that $AB=I_n$ and $BA=I_n$.
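For example (a sketch in Python with SymPy; the matrix is an arbitrary invertible choice):

from sympy import Matrix, eye

A = Matrix([[2, 1],
            [5, 3]])
B = A.inv()

print(B)                 # Matrix([[3, -1], [-5, 2]])
print(A * B == eye(2))   # True
print(B * A == eye(2))   # True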
The same definition can be made for linear maps. We’ve defined what it means for a map to be invertible as a function. In particular, we relied on the fact that any bijection has an inverse.
Note that the rules given in elementary linear algebra for the relative sizes of matrices that can be multiplied are simply a manifestation of the fact that to compose two functions, the range of the first must be contained in the domain of the second.
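Concretely (a sketch in Python with SymPy; the sizes are arbitrary): a $2\times 3$ matrix represents a map $\mathbb{R}^3\to\mathbb{R}^2$ and a $3\times 4$ matrix a map $\mathbb{R}^4\to\mathbb{R}^3$, so the product $AB$ (the composition $\mathbb{R}^4\to\mathbb{R}^3\to\mathbb{R}^2$) makes sense, while $BA$ does not.

from sympy import zeros

A = zeros(2, 3)   # represents a map R^3 -> R^2 (only the shape matters here)
B = zeros(3, 4)   # represents a map R^4 -> R^3

print((A * B).shape)   # (2, 4): the composition R^4 -> R^2
try:
    B * A                  # R^2 is not the domain of the map represented by B...
except Exception as e:     # ...so SymPy refuses to multiply (3x4 times 2x3)
    print(type(e).__name__)   # ShapeError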
Theorem 2.3.2 also tells us why we can only consider invertibility for square matrices: we know that invertible linear maps exist only between spaces of equal dimension. In analogy with matrices, some texts will define a linear map $T:V\to W$ to be invertible if there exists a linear map $S:W\to V$ such that $S\circ T=1_V$ and $T\circ S=1_W$.
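The two-sided requirement matters: for non-square matrices, one of the two products can be an identity while the other is not (a sketch in Python with SymPy; the matrices are my own illustrative choices).

from sympy import Matrix, eye

A = Matrix([[1, 0, 0],
            [0, 1, 0]])   # 2x3: a map R^3 -> R^2
B = Matrix([[1, 0],
            [0, 1],
            [0, 0]])      # 3x2: a map R^2 -> R^3

print(A * B == eye(2))   # True: A is a left inverse of B
print(B * A == eye(3))   # False: but not a two-sided inverse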
We know that the composition of two linear transformations is a linear transformation, and that the composition of two bijections is a bijection. It follows that the composition of two isomorphisms is an isomorphism!
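In matrix terms (a sketch in Python with SymPy; the invertible matrices are arbitrary choices): if $S$ and $T$ are both invertible, then so is their product.

from sympy import Matrix

S = Matrix([[1, 2],
            [0, 1]])
T = Matrix([[3, 0],
            [1, 1]])

print(S.det(), T.det(), (S * T).det())   # 1, 3, 3: all nonzero, so S*T is invertible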
With this observation, one can show that the relation of isomorphism is an equivalence relation. Two finite-dimensional vector spaces belong to the same equivalence class if and only if they have the same dimension. Here, we see again the importance of dimension in linear algebra.
If you got that last exercise incorrect, consider the following: given isomorphisms $T:U\to V$ and $S:V\to W$, we have $S\circ T:U\to W$. Since $S\circ T$ is an isomorphism, it has an inverse, which goes from $W$ to $U$. This inverse can be expressed in terms of the inverses of $S$ and $T$, but we're going backwards, so we have to apply them in the opposite order: $(S\circ T)^{-1}=T^{-1}\circ S^{-1}$.
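A quick check with matrices (a sketch in Python with SymPy, reusing arbitrary invertible matrices): the inverse of the composition is the composition of the inverses in the opposite order.

from sympy import Matrix

S = Matrix([[1, 2],
            [0, 1]])
T = Matrix([[3, 0],
            [1, 1]])

print((S * T).inv() == T.inv() * S.inv())   # True: inverses compose in the opposite order
print((S * T).inv() == S.inv() * T.inv())   # False: the same order does not work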