Let
\(A=\set{\vectorlist{x}{k}}\) be a basis of the subspace
\(U\cap V\text{.}\) Because
\(U\cap V\) is a subspace of each of
\(U\) and
\(V\text{,}\) the set
\(A\) is a linearly independent subset of both, and so we can extend
\(A\) via repeated applications of
Theorem ELIS to form bases for
\(U\) and
\(V\text{.}\) To wit, let
\(\set{\vectorlist{u}{r}}\subseteq U\) be a set of vectors such that
\(C=\set{\vectorlist{x}{k},\,\vectorlist{u}{r}}\) is a basis for
\(U\text{.}\) Similarly, let
\(\set{\vectorlist{v}{s}}\subseteq V\) be a set of vectors such that
\(D=\set{\vectorlist{x}{k},\,\vectorlist{v}{s}}\) is a basis for
\(V\text{.}\) Note that we have implicitly determined (by
Definition D) that
\(\dimension{U\cap V}=k\text{,}\) \(\dimension{U}=k+r\text{,}\) and
\(\dimension{V}=k+s\text{.}\)
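For concreteness, here is a small instance of this setup; the particular subspaces are our own illustration and play no role in the general argument. Take
\(U\) and
\(V\) to be the subspaces of
\(\mathbb{C}^{3}\) spanned by the standard unit vectors
\(\set{\vect{e}_1,\,\vect{e}_2}\) and
\(\set{\vect{e}_1,\,\vect{e}_3}\text{,}\) respectively, so that
\(U\cap V\) is spanned by
\(\vect{e}_1\text{.}\) Then we may choose
\begin{equation*}
A=\set{\vect{e}_1}\qquad C=\set{\vect{e}_1,\,\vect{e}_2}\qquad D=\set{\vect{e}_1,\,\vect{e}_3}
\end{equation*}
giving \(k=r=s=1\text{.}\)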
With this setup, we claim that
\begin{equation*}
B=\set{\vectorlist{x}{k},\,\vectorlist{u}{r},\,\vectorlist{v}{s}}
\end{equation*}
is a basis for the subspace \(U+V\text{.}\) We establish spanning first. Suppose that \(\vect{x}\in U+V\text{,}\) so \(\vect{x} = \vect{u}+\vect{v}\) where \(\vect{u}\in U\) and \(\vect{v}\in V\text{.}\) Since \(\vect{u}\in U\) we can express \(\vect{u}\) as a linear combination of the vectors in \(C\text{,}\) and since \(\vect{v}\in V\) we can express \(\vect{v}\) as a linear combination of the vectors in \(D\text{.}\) So
\begin{align*}
\vect{x} &= \vect{u} + \vect{v}\\
&=\left(\lincombo{a}{x}{k}+\lincombo{c}{u}{r}\right) + \\
&\quad\quad\left(\lincombo{b}{x}{k}+\lincombo{d}{v}{s}\right)\\
&=\left(a_1+b_1\right)\vect{x}_1 + \left(a_2+b_2\right)\vect{x}_2 + \left(a_3+b_3\right)\vect{x}_3 + \cdots + \left(a_k+b_k\right)\vect{x}_k +\\
&\quad\quad\lincombo{c}{u}{r} + \lincombo{d}{v}{s}
\end{align*}
and we see that
\(\vect{x}\) is a linear combination of the vectors of
\(B\text{,}\) so
\(B\) spans
\(U+V\) by
Definition SSVS.
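In the small illustration above (again, our own example), the vector
\(\vect{x}=2\vect{e}_1+3\vect{e}_2+5\vect{e}_3\) of
\(U+V\) decomposes as
\(\vect{u}+\vect{v}\) with
\(\vect{u}=2\vect{e}_1+3\vect{e}_2\in U\) and
\(\vect{v}=5\vect{e}_3\in V\text{,}\) and collecting coefficients as in the computation above expresses
\(\vect{x}\) as a linear combination of
\(B=\set{\vect{e}_1,\,\vect{e}_2,\,\vect{e}_3}\text{.}\)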
To establish linear independence, we begin with a relation of linear dependence (
Definition RLD) on
\(B\text{,}\)
\begin{equation*}
\lincombo{a}{x}{k}+\lincombo{c}{u}{r} + \lincombo{d}{v}{s}=\zerovector\text{.}
\end{equation*}
We rearrange this equation, and give the name \(\vect{w}\) to the common vector expressed on either side of the resulting equality.
\begin{align*}
\vect{w} &= \lincombo{a}{x}{k}+\lincombo{c}{u}{r}\\
&= - \lincombo{d}{v}{s}\text{.}
\end{align*}
From the first expression we see that \(\vect{w}\) is a linear combination of the vectors of the basis \(C\text{,}\) and so \(\vect{w}\in U\text{.}\) The second expression shows that \(\vect{w}\) is a linear combination of vectors of the basis \(D\text{,}\) and so \(\vect{w}\in V\text{.}\) Thus, \(\vect{w}\in U\cap V\) and we can express \(\vect{w}\) as a linear combination of the vectors of \(A\text{,}\) say with scalars \(e_1,\,e_2,\,e_3,\,\ldots,\,e_k\text{.}\) We use this expression for \(\vect{w}\text{,}\) together with the second expression from just above, to form an equality that we then rearrange,
\begin{gather*}
\vect{w} = \lincombo{e}{x}{k} = -\lincombo{d}{v}{s}\\
\zerovector = \lincombo{e}{x}{k} + \lincombo{d}{v}{s}\text{.}
\end{gather*}
Aha! Finally, we have a relation of linear dependence on a linearly independent set (
\(D\)) and so by
Definition LI we conclude that
\begin{equation*}
e_1=e_2=e_3=\cdots=e_k=d_1=d_2=d_3=\cdots=d_s=0
\end{equation*}
and in particular, \(\vect{w}=\zerovector\text{.}\)
Now, if we return to the equation where we first introduced \(\vect{w}\text{,}\) it becomes
\begin{equation*}
\zerovector = \vect{w} = \lincombo{a}{x}{k}+\lincombo{c}{u}{r}\text{.}
\end{equation*}
This is a relation of linear dependence on a linearly independent set (
\(C\)) and so by
Definition LI we conclude that
\begin{equation*}
a_1=a_2=a_3=\cdots=a_k=c_1=c_2=c_3=\cdots=c_r=0
\end{equation*}
and we have determined that all of the scalars in our original relation of linear dependence on
\(B\) are zero, and so by
Definition LI, we see that
\(B\) is linearly independent.
Having established that \(B\) is a basis of \(U+V\text{,}\) Definition D gives \(\dimension{U+V}=k+r+s\text{.}\) So we have our result,
\begin{align*}
\dimension{U+V} &= k + r + s\\
&= \left(k + r\right) + \left(k + s\right) - k\\
&= \dimension{U} + \dimension{V} - \dimension{U\cap V}\text{.}
\end{align*}
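As a check against the small illustration above,
\(\dimension{U}+\dimension{V}-\dimension{U\cap V}=2+2-1=3\text{,}\) which agrees with
\(U+V=\mathbb{C}^{3}\) having the basis
\(B=\set{\vect{e}_1,\,\vect{e}_2,\,\vect{e}_3}\text{.}\)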