We begin with a set \(S\) containing seven vectors from \(\complex{4}\)
\begin{equation*}
S=\set{
\colvector{1\\2\\0\\-1},\,
\colvector{4\\8\\0\\-4},\,
\colvector{0\\-1\\2\\2},\,
\colvector{-1\\3\\-3\\4},\,
\colvector{0\\9\\-4\\8},\,
\colvector{7\\-13\\12\\-31},\,
\colvector{-9\\7\\-8\\37}
}
\end{equation*}
and define \(W=\spn{S}\text{.}\)
The set
\(S\) is obviously linearly dependent by
Theorem MVSLD, since we have
\(n=7\) vectors from
\(\complex{4}\text{,}\) and \(7>4\text{.}\) So we can slim down
\(S\) some, and still create
\(W\) as the span of a smaller set of vectors.
As a device for identifying relations of linear dependence among the vectors of \(S\text{,}\) we place the seven vectors of \(S\) into a matrix as its columns,
\begin{equation*}
A=\matrixcolumns{A}{7}=\begin{bmatrix}
1 & 4 & 0 & -1 & 0 & 7 & -9 \\
2 & 8 & -1 & 3 & 9 & -13 & 7\\
0 & 0 & 2 & -3 & -4 & 12 & -8\\
-1 & -4 & 2 & 4 & 8 & -31 & 37
\end{bmatrix}\text{.}
\end{equation*}
By
Theorem SLSLC a nontrivial solution to
\(\homosystem{A}\) will give us a nontrivial relation of linear dependence (
Definition RLDCV) on the columns of
\(A\) (which are the elements of the set
\(S\)). The row-reduced form for
\(A\) is the matrix
\begin{equation*}
B=\begin{bmatrix}
\leading{1} & 4 & 0 & 0 & 2 & 1 & -3\\
0 & 0 & \leading{1} & 0 & 1 & -3 & 5\\
0 & 0 & 0 & \leading{1} & 2 & -6 & 6\\
0 & 0 & 0 & 0 & 0 & 0 & 0
\end{bmatrix}\text{,}
\end{equation*}
so we can easily create solutions to the homogeneous system
\(\homosystem{A}\) using the free variables
\(x_2,\,x_5,\,x_6,\,x_7\text{.}\) Any such solution will provide a relation of linear dependence on the columns of
\(A\text{.}\) These solutions will allow us to solve for one column vector as a linear combination of some others, in the spirit of
Theorem DLDS, and remove that vector from the set. We will set about forming these linear combinations methodically.
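As a computational aside, the row-reduction above and the four solutions constructed below can be confirmed by machine. The following is a minimal sketch, assuming Python with the SymPy library is available; none of this is needed for the argument.
\begin{verbatim}
# Minimal check, assuming Python with SymPy.
# Confirms the row-reduced form B of A, and produces one basis vector
# for the null space of A per free variable (x2, x5, x6, x7).
from sympy import Matrix

A = Matrix([
    [ 1,  4,  0, -1,  0,   7,  -9],
    [ 2,  8, -1,  3,  9, -13,   7],
    [ 0,  0,  2, -3, -4,  12,  -8],
    [-1, -4,  2,  4,  8, -31,  37],
])

B, pivot_columns = A.rref()   # pivot_columns == (0, 2, 3), zero-based
for v in A.nullspace():       # four vectors, one per free variable;
    print(v.T)                # these match the four solutions below
\end{verbatim}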
Set the free variable \(x_2=1\text{,}\) and set the other free variables to zero. Then a solution to \(\linearsystem{A}{\zerovector}\) and the resulting nontrivial relation of linear dependence are
\begin{align*}
\vect{x}&=\colvector{-4\\1\\0\\0\\0\\0\\0}
&
(-4)\vect{A}_1+
1\vect{A}_2+
0\vect{A}_3+
0\vect{A}_4+
0\vect{A}_5+
0\vect{A}_6+
0\vect{A}_7
&=\zerovector\text{.}
\end{align*}
This can then be arranged and solved for \(\vect{A}_2\text{,}\) resulting in \(\vect{A}_2\) expressed as a linear combination of \(\set{\vect{A}_1,\,\vect{A}_3,\,\vect{A}_4}\text{,}\) \(\vect{A}_2=
4\vect{A}_1+
0\vect{A}_3+
0\vect{A}_4\text{.}\)
This means that \(\vect{A}_2\) is surplus, and we can create \(W\) just as well from a smaller set with this vector removed, \(W=\spn{\set{\vect{A}_1,\,\vect{A}_3,\,\vect{A}_4,\,\vect{A}_5,\,\vect{A}_6,\,\vect{A}_7}}\text{.}\)
Technically, this set equality for
\(W\) requires a proof, in the spirit of
Example RSC5, but we will bypass this requirement here, and in the next few paragraphs.
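As a quick arithmetic check of the relation just used,
\begin{equation*}
4\vect{A}_1=4\colvector{1\\2\\0\\-1}=\colvector{4\\8\\0\\-4}=\vect{A}_2\text{.}
\end{equation*}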
Now, set the free variable \(x_5=1\text{,}\) and set the other free variables to zero. Then a solution to \(\linearsystem{A}{\zerovector}\) and the resulting nontrivial relation of linear dependence are
\begin{align*}
\vect{x}&=\colvector{-2\\0\\-1\\-2\\1\\0\\0}
&
(-2)\vect{A}_1+
0\vect{A}_2+
(-1)\vect{A}_3+
(-2)\vect{A}_4+
1\vect{A}_5+
0\vect{A}_6+
0\vect{A}_7
&=\zerovector\text{.}
\end{align*}
This can then be arranged and solved for \(\vect{A}_5\text{,}\) resulting in \(\vect{A}_5\) expressed as a linear combination of \(\set{\vect{A}_1,\,\vect{A}_3,\,\vect{A}_4}\text{,}\) \(\vect{A}_5=
2\vect{A}_1+
1\vect{A}_3+
2\vect{A}_4\text{.}\)
This means that \(\vect{A}_5\) is surplus, and we can create \(W\) just as well from a smaller set with this vector removed, \(W=\spn{\set{\vect{A}_1,\,\vect{A}_3,\,\vect{A}_4,\,\vect{A}_6,\,\vect{A}_7}}\text{.}\)
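Again, the relation used to remove a vector can be checked directly,
\begin{equation*}
2\vect{A}_1+1\vect{A}_3+2\vect{A}_4=
\colvector{2\\4\\0\\-2}+\colvector{0\\-1\\2\\2}+\colvector{-2\\6\\-6\\8}=
\colvector{0\\9\\-4\\8}=\vect{A}_5\text{.}
\end{equation*}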
Do it again: set the free variable \(x_6=1\text{,}\) and set the other free variables to zero. Then a solution to \(\linearsystem{A}{\zerovector}\) and the resulting nontrivial relation of linear dependence are
\begin{align*}
\vect{x}&=\colvector{-1\\0\\3\\6\\0\\1\\0}
&
(-1)\vect{A}_1+
0\vect{A}_2+
3\vect{A}_3+
6\vect{A}_4+
0\vect{A}_5+
1\vect{A}_6+
0\vect{A}_7
&=\zerovector\text{.}
\end{align*}
This can then be arranged and solved for \(\vect{A}_6\text{,}\) resulting in \(\vect{A}_6\) expressed as a linear combination of \(\set{\vect{A}_1,\,\vect{A}_3,\,\vect{A}_4}\text{,}\) \(\vect{A}_6=1\vect{A}_1+ (-3)\vect{A}_3+ (-6)\vect{A}_4\text{.}\)
This means that \(\vect{A}_6\) is surplus, and we can create \(W\) just as well from a smaller set with this vector removed, \(W=\spn{\set{\vect{A}_1,\,\vect{A}_3,\,\vect{A}_4,\,\vect{A}_7}}\text{.}\)
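Once more, a direct check of the relation,
\begin{equation*}
1\vect{A}_1+(-3)\vect{A}_3+(-6)\vect{A}_4=
\colvector{1\\2\\0\\-1}+\colvector{0\\3\\-6\\-6}+\colvector{6\\-18\\18\\-24}=
\colvector{7\\-13\\12\\-31}=\vect{A}_6\text{.}
\end{equation*}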
Set the free variable \(x_7=1\text{,}\) and set the other free variables to zero. Then a solution to \(\linearsystem{A}{\zerovector}\) and the resulting nontrivial relation of linear dependence are
\begin{align*}
\vect{x}&=\colvector{3\\0\\-5\\-6\\0\\0\\1}
&
3\vect{A}_1+
0\vect{A}_2+
(-5)\vect{A}_3+
(-6)\vect{A}_4+
0\vect{A}_5+
0\vect{A}_6+
1\vect{A}_7
&=\zerovector\text{.}
\end{align*}
This can then be arranged and solved for \(\vect{A}_7\text{,}\) resulting in \(\vect{A}_7\) expressed as a linear combination of \(\set{\vect{A}_1,\,\vect{A}_3,\,\vect{A}_4}\text{,}\) \(\vect{A}_7=
(-3)\vect{A}_1+
5\vect{A}_3+
6\vect{A}_4\text{.}\)
This means that \(\vect{A}_7\) is surplus, and we can create \(W\) just as well from a smaller set with this vector removed, \(W=\spn{\set{\vect{A}_1,\,\vect{A}_3,\,\vect{A}_4}}\text{.}\)
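And a direct check of this final relation,
\begin{equation*}
(-3)\vect{A}_1+5\vect{A}_3+6\vect{A}_4=
\colvector{-3\\-6\\0\\3}+\colvector{0\\-5\\10\\10}+\colvector{-6\\18\\-18\\24}=
\colvector{-9\\7\\-8\\37}=\vect{A}_7\text{.}
\end{equation*}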
You might think we could keep this up, but we have run out of free variables. And not coincidentally, the set
\(\set{\vect{A}_1,\,\vect{A}_3,\,\vect{A}_4}\) is linearly independent (check this, or see the verification below!). It should be clear how each free variable was used to eliminate a column from the set used to span the column space, as this will be the essence of the proof of the next theorem. The column vectors in
\(S\) were not chosen entirely at random; they are the columns of Archetype
I. See if you can mimic this example using the columns of Archetype
J. Go ahead, we’ll go grab a cup of coffee and be back before you finish up.
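To back up the parenthetical claim that \(\set{\vect{A}_1,\,\vect{A}_3,\,\vect{A}_4}\) is linearly independent, place these three vectors as the columns of a matrix and row-reduce,
\begin{equation*}
\begin{bmatrix}
1 & 0 & -1\\
2 & -1 & 3\\
0 & 2 & -3\\
-1 & 2 & 4
\end{bmatrix}
\qquad\text{row-reduces to}\qquad
\begin{bmatrix}
\leading{1} & 0 & 0\\
0 & \leading{1} & 0\\
0 & 0 & \leading{1}\\
0 & 0 & 0
\end{bmatrix}\text{.}
\end{equation*}
Every column is a pivot column, so the corresponding homogeneous system has only the trivial solution, and by Theorem SLSLC and Definition RLDCV the only relation of linear dependence on \(\set{\vect{A}_1,\,\vect{A}_3,\,\vect{A}_4}\) is the trivial one.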
For extra credit, notice that the vector
\begin{equation*}
\vect{b}=\colvector{3\\9\\1\\4}
\end{equation*}
is the vector of constants in the definition of Archetype
I. Since the system
\(\linearsystem{A}{\vect{b}}\) is consistent, we know by
Theorem SLSLC that
\(\vect{b}\) is a linear combination of the columns of
\(A\text{,}\) or stated equivalently,
\(\vect{b}\in W\text{.}\) This means that
\(\vect{b}\) must also be a linear combination of just the three columns
\(\vect{A}_1,\,\vect{A}_3,\,\vect{A}_4\text{.}\) Can you find such a linear combination? Did you notice that there is just a single (unique) answer? Hmmmm.