Linear Algebra Done Right Ch.3 Exercises

20 Aug 2018

Exercises 3.A

(3) Suppose $T\in\lnmpsb(\mathbb{F}^n,\mathbb{F}^m)$. Then there exist scalars $A_{j,k}\in\mathbb{F}$ for $j=1,…,m$ and $k=1,…,n$ such that

$$ T(x_1,...,x_n)=\Big(\sum_{k=1}^nA_{1,k}x_k,...,\sum_{k=1}^nA_{m,k}x_k\Big) $$

Proof We proved this in proposition W.3.1. And we proved the more general case in proposition W.3.3. $\blacksquare$

(4) Suppose $T\in\lnmpsb(V,W)$ and $v_1,…,v_m$ is a list of vectors in $V$ such that $Tv_1,…,Tv_m$ is linearly independent in $W$. Then $v_1,…,v_m$ is linearly independent.

Proof Suppose $0=\sum_{i=1}^ma_iv_i$. Then proposition 3.11, p.57 gives that

$$ 0=T(0)=T\Big(\sum_{i=1}^ma_iv_i\Big)=\sum_{i=1}^ma_iTv_i $$

The linear independence of $Tv_1,…,Tv_m$ gives that $0=a_1=\dotsb=a_m$. $\blacksquare$

(5) Prove the assertion in 3.7.

Proof We want to show that $\linmap{V}{W}$ is a vector space. By 1.34, it suffices to show that $0\in\linmap{V}{W}$ and that $\linmap{V}{W}$ is closed under addition and scalar multiplication. Note that

$$\align{ 0(u+v) &= 0\tag{by definition of zero map} \\ &= 0+0\tag{by 1.19.additive identity} \\ &= 0u+0v\tag{by definition of zero map} }$$

for any $u,v\in V$. Similarly, for any $\lambda\in\wF$, we have

$$\align{ 0(\lambda v) &= 0\tag{by definition of zero map} \\ &= \lambda0\tag{by 1.30} \\ &= \lambda(0v)\tag{by definition of zero map} }$$

Hence $0\in\linmap{V}{W}$. And in proposition W.3.19, we showed that $\linmap{V}{W}$ is closed under addition and scalar multiplication. $\blacksquare$

(7) Every linear map from a $1$-dimensional vector space to itself is multiplication by some scalar. More precisely, if $\dim{V}=1$ and $T\in\lnmpsb(V,V)$, then there exists $\lambda\in\mathbb{F}$ such that $Tv=\lambda v$ for all $v\in V$.

Proof Let $u$ be any nonzero vector in $V$. Since a list of one nonzero vector is always linearly independent, then $u$ is a basis for $V$, by proposition 2.39, p.45. Hence every vector in $V$ is a scalar multiple of $u$. In particular, $Tu=\lambda u$ for some $\lambda \in\mathbb{F}$.

Let $v\in V$. Then $v=bu$ for some $b\in\mathbb{F}$. Hence

$$ Tv=T(bu)=b(Tu)=b(\lambda u)=\lambda (bu)=\lambda v\quad\blacksquare $$

(8) Give an example of a function $\varphi:\mathbb{R}^2\mapsto\mathbb{R}$ such that

$$ \varphi(\lambda v)=\lambda\varphi(v) $$

for all $\lambda\in\mathbb{R}$ and all $v\in\mathbb{R}^2$ but $\varphi$ is not linear.

Solution For $v=(x,y)\in\mathbb{R}^2$, define

$$ \varphi(v)=\varphi(x,y)\equiv(x^3+y^3)^{1/3} $$

Then

$$\begin{align*} \varphi(\lambda v)&=\varphi(\lambda x,\lambda y) \\ &=[(\lambda x)^3+(\lambda y)^3]^{1/3} \\ &=[\lambda^3x^3+\lambda^3y^3]^{1/3} \\ &=[\lambda^3(x^3+y^3)]^{1/3} \\ &=(\lambda^3)^{1/3}(x^3+y^3)^{1/3} \\ &=\lambda\varphi(x,y) \\ &=\lambda\varphi(v) \end{align*}$$

Note that $\varphi(1,0)=(1^3+0^3)^{1/3}=1^{1/3}=1$ and similarly $\varphi(0,1)=1$. But

$$\begin{align*} \varphi\big((1,0)+(0,1)\big)&=\varphi(1,1) \\ &=(1^3+1^3)^{1/3} \\ &=(1+1)^{1/3} \\ &=2^{1/3} \\ &\neq2 \\ &=\varphi(1,0)+\varphi(0,1)\quad\blacksquare \end{align*}$$
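As a quick numeric sanity check (a sketch assuming Python with numpy, not part of the exercise), we can confirm that $\varphi$ is homogeneous but not additive:

```python
import numpy as np

def phi(v):
    x, y = v
    return np.cbrt(x**3 + y**3)  # (x^3 + y^3)^(1/3); cbrt handles negative inputs

v = np.array([2.0, -3.0])
lam = -1.7
print(np.isclose(phi(lam * v), lam * phi(v)))   # True: homogeneity holds

u, w = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(np.isclose(phi(u + w), phi(u) + phi(w)))  # False: 2^(1/3) != 2, additivity fails
```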

(10) Suppose $U$ is a subspace of $V$ with $U\neq V$. Suppose $S\in\lnmpsb(U,W)$ and $S\neq0$ (which means that $Su\neq0$ for some $u\in U$). Define $T:V\mapsto W$ as

$$ Tv=\begin{cases}Sv&v\in U\\0&v\in V\setminus U\end{cases} $$

Then $T$ isn’t linear.

Proof by contradiction Let $u\in U$ such that $Su\neq0$ and let $v\in V\setminus U$. Then $T(u+v)=Tu+Tv=Su+0=Su$. But $u+v\notin U$. If it were, then $v=(u+v)-u\in U$ since $U$ is a subspace. Hence $T(u+v)=0\neq Su=T(u+v)$. Contradiction. $\blacksquare$

(11) Suppose $V$ is finite-dimensional. Then every linear map on a subspace of $V$ can be extended to a linear map on $V$. That is, if $U$ is a subspace of $V$ and $S\in\lnmpsb(U,W)$, then there exists $T\in\lnmpsb(V,W)$ such that $Tu=Su$ for all $u\in U$.

Proof Let $u_1,…,u_m$ be a basis for $U$ and extend it to a basis $u_1,…,u_m,v_{m+1},…,v_n$ for $V$. Define $T\in\lnmpsb(V,W)$ as

$$ Tu_k=Su_k\quad\text{for }k=1,...,m\quad\quad\quad Tv_k=0\quad\text{for }k=m+1,...,n $$

Since $Su_1,…,Su_m,0\in W$ and $u_1,…,u_m,v_{m+1},…,v_n$ is a basis for $V$, then proposition 3.5, p.54 gives the existence of such a $T\in\lnmpsb(V,W)$. It suffices to show that $Tu=Su$ for every $u\in U$. Fix $u=\sum_{k=1}^ma_ku_k\in U$. Then

$$ Tu=T\Big(\sum_{k=1}^ma_ku_k\Big)=\sum_{k=1}^ma_kTu_k=\sum_{k=1}^ma_kSu_k=S\Big(\sum_{k=1}^ma_ku_k\Big)=Su\quad\blacksquare $$

Note Let’s compare this with the result from the previous problem (10). Define $\phi:V\mapsto W$ as

$$ \phi v=\begin{cases}Sv&v\in U\\0&v\in V\setminus U\end{cases} $$

Then $T\neq\phi$.

Proof & Intuition Proposition W.3.12 gives that there exists $u_k$ from the basis of $U$ such that $Su_k\neq0$. Fix this $u_k$ and fix $v\equiv u_k+v_{m+1}$. Then $v\in V\setminus U$. For if $v\notin V\setminus U$, then $v\in U$ and $v_{m+1}=v-u_k\in U$ since $U$ is a subspace. Contradiction since $v_{m+1}\notin\text{span}(u_1,…,u_m)=U$ and

$$ Tv=T(u_k+v_{m+1})=Tu_k+Tv_{m+1}=Su_k+0\neq0=\phi v $$

Let’s see what this looks like geometrically. Consider $V\equiv\mathbb{R}^2$ and

$$ U\equiv\big\{(x,y)\in\mathbb{R}^2:y=x\in\mathbb{R}\big\}=\big\{(x,x)\in\mathbb{R}^2:x\in\mathbb{R}\big\} $$

$U$ is a subspace since it contains the origin and

  • $(a,a),(b,b)\in U$ and $(a,a)+(b,b)=(a+b,a+b)\in U$
  • $\lambda(a,a)=(\lambda a,\lambda a)\in U$

Note that $(1,1)$ is a basis for $U$ since $(a,a)=a(1,1)$ and a single, nonzero vector is linearly independent. If we add any point in $\mathbb{R}^2\setminus U$ to $(1,1)$, then we get a point in $\mathbb{R}^2\setminus U$. Algebraically, $(1,1)+(x,y)=(1+x,1+y)\notin U$ if and only if $x\neq y$. Geometrically, we can verify that the point $(2,1)=(1,1)+(1,0)$ lies off the line $U$.

From proposition 2.34, p.42, we can construct a subspace $X$ of $V$ such that $V=U\oplus X$. The proof shows us how to do it. The basis $(1,1)$ can be extended to a basis of $V$. We want a vector that is linearly independent with $(1,1)$. Since $(-1,1)$ is not in the span of $(1,1)$, then they are linearly independent. Then $(1,1),(-1,1)$ is a basis for $\mathbb{R}^2$ since $\dim{\mathbb{R}^2}=2$. Define $X\equiv\text{span}\big((-1,1)\big)$.

Notice that we performed this same procedure in problem (11) when we extended $u_1,…,u_m$ to $u_1,…,u_m,v_{m+1},…,v_n$.

Now consider the point $(0,2)\in V\setminus U$. We can write this point as a linear combination of the basis vector $(1,1)$ from $U$ and the basis vector $(-1,1)$ from $X$: $(0,2)=(1,1)+(-1,1)$.

Now define $W\equiv\mathbb{R}^3$. Proposition 3.5, p.54 allows us to define $S\in\lnmpsb(U,W)$ and $T\in\lnmpsb(V,W)$ as

$$ S(1,1)=(1,0,0) \quad\quad T(1,1)=S(1,1)=(1,0,0) \quad\quad T(-1,1)=(0,0,0) $$

Then $\phi(0,2)=0$ since $(0,2)\notin U$ and

$$ T(0,2)=T\big((1,1)+(-1,1)\big)=T(1,1)+T(-1,1)=(1,0,0)+(0,0,0)=(1,0,0)\neq0=\phi(0,2)\quad\blacksquare $$
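Here's the same computation done numerically (a sketch assuming Python with numpy; the matrix of $T$ in standard coordinates is recovered from its values on the basis $(1,1),(-1,1)$):

```python
import numpy as np

B = np.column_stack([(1.0, 1.0), (-1.0, 1.0)])   # basis (1,1), (-1,1) of R^2
TB = np.column_stack([(1.0, 0, 0), (0, 0, 0)])   # T(1,1)=(1,0,0), T(-1,1)=(0,0,0)
M = TB @ np.linalg.inv(B)                        # matrix of T in standard coordinates

def phi(v):
    x, y = v                                     # phi is Sv on U (the line y=x), 0 off U
    return np.array([x, 0.0, 0.0]) if np.isclose(x, y) else np.zeros(3)

v = np.array([0.0, 2.0])
print(M @ v)    # [1. 0. 0.] -> Tv = (1,0,0)
print(phi(v))   # [0. 0. 0.] -> phi v = 0, so T != phi
```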

(12) Suppose $V$ is finite-dimensional with $\dim{V}>0$, and suppose $W$ is infinite-dimensional. Then $\lnmpsb(V,W)$ is infinite-dimensional.

Proof Let $v_1,…,v_n$ be a basis for $V$. Let $w_1,w_2,…$ be a sequence of vectors in $W$ such that $w_1,…,w_m$ is linearly independent for every positive integer $m$. In our ch.2 notes, W.2.10, we proved the existence of such a sequence. For $i=1,2,…$, define $T_i\in\lnmpsb(V,W)$ as

$$\begin{align*} T_iv_k\equiv\begin{cases}w_i&k=1\\0&k>1\end{cases} \end{align*}$$

Since $w_i,0\in W$, then proposition 3.5, p.54 gives the existence of these linear maps $T_i$ for $i=1,2,…$. We want to show that $T_1,…,T_m$ is linearly independent for every positive integer $m$. For any scalars $a_1,…,a_m$ such that

$$\begin{align*} a_1T_1+\dots+a_mT_m=0 \end{align*}$$

the map $a_1T_1+\dots+a_mT_m$ is the zero map, so

$$\begin{align*} 0=(a_1T_1+\dots+a_mT_m)(v_1)=a_1T_1v_1+\dots+a_mT_mv_1=a_1w_1+\dots+a_mw_m \end{align*}$$

But $w_1,…,w_m$ is linearly independent. Hence $0=a_1=\dots=a_m$ and $T_1,…,T_m$ is linearly independent for every positive integer $m$. Again applying W.2.10, $\lnmpsb(V,W)$ is infinite-dimensional. $\blacksquare$

(13) Suppose $v_1,…,v_m$ is a linearly dependent list in $V$. Suppose also that $W\neq\{0\}$. Then there exist $w_1,…,w_m\in W$ such that no $T\in\lnmpsb(V,W)$ satisfies $Tv_k=w_k$ for $k=1,…,m$.

Proof Because $v_1,…,v_m$ is linearly dependent, there exist $a_1,…,a_m\in\mathbb{F}$, not all zero, such that

$$ a_1v_1+\dots+a_mv_m=0 $$

Without loss of generality, assume $a_1\neq0$. Choose any $w_1\neq0$; such a $w_1$ exists since $W\neq\{0\}$. Define $w_i=0$ for $i\neq1$. Suppose there exists $T\in\lnmpsb(V,W)$ such that $Tv_k=w_k$ for $k=1,…,m$. Then

$$ 0=T(0)=T(a_1v_1+\dots+a_mv_m)=a_1w_1+\dots+a_mw_m=a_1w_1 $$

The first equality follows because linear maps take $0$ to $0$ (3.11, p.57). But $a_1w_1\neq0$ by construction. This is a contradiction, so no such $T\in\lnmpsb(V,W)$ exists. $\blacksquare$

(14) Suppose $V$ is finite-dimensional with $\dim{V}\geq2$. Then there exist $S,T\in\lnmpsb(V,V)$ such that $ST\neq TS$.

Proof Let $v_1,…,v_n$ be a basis of $V$ and define $S,T\in\lnmpsb(V,V)$ as

$$ Tv_k\equiv\begin{cases}v_2&k=1\\v_1&k=2\\v_k&\text{otherwise}\end{cases} \quad\quad\quad\quad Sv_k\equiv\begin{cases}2v_2&k=2\\v_k&\text{otherwise}\end{cases} $$

Proposition 3.5, p.54 gives the existence of $S$ and $T$ and we have

$$ STv_1=Sv_2=2v_2\neq v_2=Tv_1=TSv_1\quad\quad\blacksquare $$
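In matrix form (a sketch assuming Python with numpy, taking $n=2$ and the standard basis, so columns hold the images of $v_1,v_2$), the same computation reads:

```python
import numpy as np

T = np.array([[0.0, 1.0],   # Tv1 = v2, Tv2 = v1 (swap)
              [1.0, 0.0]])
S = np.array([[1.0, 0.0],   # Sv1 = v1, Sv2 = 2v2
              [0.0, 2.0]])
print(S @ T)                          # matrix of ST (apply T first, then S)
print(T @ S)                          # matrix of TS
print(np.array_equal(S @ T, T @ S))   # False: ST != TS
```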

Exercises 3.B

(1) Give an example of a linear map $T$ such that $\dim{\mathscr{N}(T)}=3$ and $\dim{\mathscr{R}(T)}=2$.

Solution Let $V$ be any $5$-dimensional vector space with basis $v_1,…,v_5$. Define $T\in\lnmpsb(V,V)$ by

$$ Tv_k\equiv\begin{cases}v_k&k=1,2\\0&k=3,4,5\end{cases} $$

Proposition 3.5, p.54 gives the existence of such a $T\in\lnmpsb(V,V)$. Let’s show that $\text{span}(v_3,v_4,v_5)=\mathscr{N}(T)$. Let $r\in\mathscr{N}(T)$. Then

$$ 0=Tr=T\Big(\sum_{k=1}^5r_kv_k\Big)=\sum_{k=1}^5r_kTv_k=r_1v_1+r_2v_2 $$

But any sublist of a linearly independent list is itself linearly independent. Hence $0=r_1=r_2$. Hence

$$ r=\sum_{k=1}^5r_kv_k=r_3v_3+r_4v_4+r_5v_5\in\text{span}(v_3,v_4,v_5) $$

Conversely, suppose that $r\in\text{span}(v_3,v_4,v_5)$. Then $r\in\mathscr{N}(T)$ since

$$ Tr=T(r_3v_3+r_4v_4+r_5v_5)=r_3Tv_3+r_4Tv_4+r_5Tv_5=0 $$

Hence $\text{span}(v_3,v_4,v_5)=\mathscr{N}(T)$ and $\dim{\mathscr{N}(T)}=3$. Then the Fundamental Theorem of Linear Maps, p.63 gives

$$ 5=\dim{V}=\dim{\mathscr{N}(T)}+\dim{\mathscr{R}(T)}=3+\dim{\mathscr{R}(T)} $$

Hence $\dim{\mathscr{R}(T)}=2$. $\blacksquare$
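A quick numeric check of this rank-nullity bookkeeping (a sketch assuming Python with numpy, writing $T$ as a matrix in the basis $v_1,…,v_5$):

```python
import numpy as np

# Matrix of T with respect to v1,...,v5: column k holds the coordinates of Tv_k
M = np.zeros((5, 5))
M[0, 0] = M[1, 1] = 1.0          # Tv1 = v1, Tv2 = v2; columns 3,4,5 are zero
rank = np.linalg.matrix_rank(M)
print(rank, 5 - rank)            # 2 3: dim range(T) = 2 and dim null(T) = 3
```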

(3) Suppose $v_1,…,v_m$ is a list of vectors in $V$. Define $T\in\lnmpsb(\mathbb{F}^m,V)$ by

$$ T(z_1,...,z_m)=z_1v_1+\dots+z_mv_m $$

(a) What property of $T$ corresponds to $v_1,…,v_m$ spanning $V$?

Solution Since $\text{span}(v_1,…,v_m)=\mathscr{R}(T)$, then $\text{span}(v_1,…,v_m)=V$ if and only if $\mathscr{R}(T)=V$. That is $\text{span}(v_1,…,v_m)=V$ if and only if $T$ is surjective.

(b) What property of $T$ corresponds to $v_1,…,v_m$ being linearly independent?

Solution $v_1,…,v_m$ is linearly independent if and only if

$$ 0=z_1v_1+\dots+z_mv_m=T(z_1,...,z_m) $$

implies that $0=z_1=\dots=z_m$. That is $v_1,…,v_m$ is linearly independent if and only if the only solution to $Tz=0$ is $z=0$. That is $v_1,…,v_m$ is linearly independent if and only if $\mathscr{N}(T)=\{0\}$. Hence $v_1,…,v_m$ is linearly independent if and only if $T$ is injective. $\blacksquare$

(4) The set

$$ U\equiv\{\Psi\in\lnmpsb(\mathbb{R}^5,\mathbb{R}^4):\dim{\mathscr{N}(\Psi)}>2\} $$

is not a subspace of $\lnmpsb(\mathbb{R}^5,\mathbb{R}^4)$.

Proof Let $v_1,…,v_5$ be a basis of $\mathbb{R}^5$ and let $u_1,…,u_4$ be a basis of $\mathbb{R}^4$. Define $S,T\in\lnmpsb(\mathbb{R}^5,\mathbb{R}^4)$ as

$$ Sv_k\equiv\begin{cases}0&k=1,2,3\\u_1&k=4\\u_2&k=5\end{cases} \quad\quad\quad Tv_k\equiv\begin{cases}0&k=1,2,4\\u_3&k=3\\u_4&k=5\end{cases} $$

Proposition 3.5, p.54 gives the existence of such $S,T\in\lnmpsb(\mathbb{R}^5,\mathbb{R}^4)$. Note that $\text{span}(v_1,v_2,v_3)=\mathscr{N}(S)$ and $\text{span}(v_1,v_2,v_4)=\mathscr{N}(T)$; we proved a similar result in problem (1) of exercises 3.B. Hence $3=\dim{\mathscr{N}(S)}=\dim{\mathscr{N}(T)}$ and $S,T\in U$. But

$$\begin{matrix} (S+T)v_1=0+0=0\quad\quad\quad(S+T)v_2=0+0=0 \\\\ (S+T)v_3=0+u_3=u_3\quad\quad\quad(S+T)v_4=u_1+0=u_1\quad\quad\quad(S+T)v_5=u_2+u_4 \end{matrix}$$

Then $\text{span}(v_1,v_2)=\mathscr{N}(S+T)$ and $2=\dim{\mathscr{N}(S+T)}$. Hence $S+T\notin U$, $U$ is not closed under addition, and $U$ is not a subspace of $\lnmpsb(\mathbb{R}^5,\mathbb{R}^4)$. $\blacksquare$
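Numerically (a sketch assuming Python with numpy, with $S$ and $T$ written as matrices in the bases above):

```python
import numpy as np

def nullity(A):
    return A.shape[1] - np.linalg.matrix_rank(A)

S = np.zeros((4, 5)); T = np.zeros((4, 5))      # columns are images of v1,...,v5
S[0, 3] = S[1, 4] = 1.0                         # Sv4 = u1, Sv5 = u2
T[2, 2] = T[3, 4] = 1.0                         # Tv3 = u3, Tv5 = u4
print(nullity(S), nullity(T), nullity(S + T))   # 3 3 2: S,T in U but S+T not in U
```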

(5) Give an example of $T\in\lnmpsb(\mathbb{R}^4,\mathbb{R}^4)$ such that $\mathscr{R}(T)=\mathscr{N}(T)$.

Solution Let $v_1,…,v_4$ be a basis of $\mathbb{R}^4$. By proposition 3.5, p.54, there exists $T\in\lnmpsb(\mathbb{R}^4,\mathbb{R}^4)$ such that

$$ Tv_k\equiv\begin{cases}v_3&k=1\\v_4&k=2\\0&k=3,4\end{cases} $$

Then $\mathscr{N}(T)=\text{span}(v_3,v_4)=\mathscr{R}(T)$. We proved a similar result to the first equality in problem (1) of this exercise set. Let’s prove the second equality $\text{span}(v_3,v_4)=\mathscr{R}(T)$. Let $r\in\mathscr{R}(T)$. Then there exists $x\in\mathbb{R}^4$ such that

$$ r=Tx=T(x_1v_1+x_2v_2+x_3v_3+x_4v_4)=x_1Tv_1+x_2Tv_2+x_3Tv_3+x_4Tv_4=x_1v_3+x_2v_4\in\text{span}(v_3,v_4) $$

Conversely, suppose $r\in\text{span}(v_3,v_4)$. Then there exist scalars $a,b\in\mathbb{R}$ such that

$$ r=av_3+bv_4=aTv_1+bTv_2=aTv_1+bTv_2+cTv_3+dTv_4=T(av_1+bv_2+cv_3+dv_4) $$

for any scalars $c,d\in\mathbb{R}$. Hence $r\in\mathscr{R}(T)$. $\blacksquare$

(7) Suppose $V$ and $W$ are finite-dimensional with $2\leq\dim{V}\leq\dim{W}$. Then the set

$$ U\equiv\{\Psi\in\lnmpsb(V,W):\Psi\text{ is not injective}\} $$

is not a subspace of $\lnmpsb(V,W)$.

Proof Let $v_1,…,v_n$ be a basis for $V$ and let $w_1,…,w_m$ be a basis for $W$ such that $2\leq n\leq m$. Define

$$ Sv_k\equiv\begin{cases}0&k=1\\w_2&k=2\\\frac12w_k&k=3,4,...,n\end{cases} \quad\quad\quad Tv_k\equiv\begin{cases}w_1&k=1\\0&k=2\\\frac12w_k&k=3,4,...,n\end{cases} $$

Proposition 3.5, p.54 gives the existence of $S,T\in\lnmpsb(V,W)$. And $S,T\in U$ since the null space of each contains a nonzero vector. But

$$ (S+T)v_1=0+w_1=w_1 \quad\quad (S+T)v_2=w_2+0=w_2 \quad\quad\dots\quad\quad (S+T)v_n=\frac12w_n+\frac12w_n=w_n $$

That is, $S+T$ maps a basis to a linearly independent list. Hence proposition W.3.9 in our ch.3 notes gives that $S+T$ is injective. Hence $S+T\notin U$, $U$ is not closed under addition, and $U$ is not a subspace of $\lnmpsb(V,W)$. $\blacksquare$

(8) Suppose $V$ and $W$ are finite-dimensional with $\dim{V}\geq\dim{W}\geq2$. Then the set

$$ U\equiv\{\Psi\in\lnmpsb(V,W):\Psi\text{ is not surjective}\} $$

is not a subspace of $\lnmpsb(V,W)$.

Proof Let $v_1,…,v_n$ be a basis for $V$ and let $w_1,…,w_m$ be a basis for $W$ such that $n\geq m\geq2$. Define

$$ Sv_k\equiv\begin{cases}0&k=1\\w_2&k=2\\\frac12w_k&k=3,4,...,m\\0&k=m+1,m+2,...,n\end{cases} \quad\quad\quad Tv_k\equiv\begin{cases}w_1&k=1\\0&k=2\\\frac12w_k&k=3,4,...,m\\0&k=m+1,m+2,...,n\end{cases} $$

Proposition 3.5, p.54 gives the existence of $S,T\in\lnmpsb(V,W)$. Proposition W.3.11 in our ch.3 notes gives that $S$ is surjective if and only if $Sv_1,…,Sv_n$ spans $W$. Note that $w_1\in W$ and $w_1,…,w_m$ are linearly independent. Hence

$$ w_1\notin\text{span}(0,w_2,\frac12w_3,...,\frac12w_m,0,...,0)=\text{span}(Sv_1,...,Sv_n) $$

Hence $S$ is not surjective and $S\in U$. Similarly $T\in U$. Now let’s look at $S+T$:

$$ (S+T)v_1=0+w_1=w_1 \quad\quad (S+T)v_2=w_2+0=w_2 \quad\quad (S+T)v_3=\frac12w_3+\frac12w_3=w_3 \quad\quad\dots $$

$$ \dots\quad\quad(S+T)v_m=\frac12w_m+\frac12w_m=w_m \quad\quad (S+T)v_{m+1}=0+0=0 \quad\quad\dots\quad\quad (S+T)v_n=0+0=0 $$

That is, $(S+T)v_1,…,(S+T)v_n=w_1,…,w_m,0,…,0$ spans $W$. Hence proposition W.3.11 gives that $S+T$ is surjective. Hence $S+T\notin U$, $U$ is not closed under addition, and $U$ is not a subspace of $\lnmpsb(V,W)$. $\blacksquare$

(9) Suppose $T\in\lnmpsb(V,W)$ is injective and $v_1,…,v_n$ is linearly independent in $V$. Then $Tv_1,…,Tv_n$ is linearly independent in $W$.

Proof See proposition W.3.8 in our ch.3 notes.

(10) Suppose $v_1,…,v_n$ spans $V$ and $T\in\lnmpsb(V,W)$. Then $Tv_1,…,Tv_n$ spans $\mathscr{R}(T)$.

Proof See proposition W.3.10 in our ch.3 notes.

(11) Suppose $S_1,S_2,…,S_n$ are injective linear maps such that $S_1S_2\dotsb S_n$ makes sense. Then $S_1S_2\dotsb S_n$ is injective.

Proof Let $v$ be a vector in the domain of $S_1\dotsb S_n$ (which equals the domain of $S_n$) such that $(S_1\dotsb S_n)v=0$. From definition 3.8, p.55 of linear map product (composition), we have

$$ 0=(S_1\dotsb S_n)v=S_1\big((S_2\dotsb S_n)v\big) $$

Since $S_1$ is injective, then this implies that $(S_2\dotsb S_n)v=0$. Hence

$$ 0=(S_2\dotsb S_n)v=S_2\big((S_3\dotsb S_n)v\big) $$

Since $S_2$ is injective, then this implies that $(S_3\dotsb S_n)v=0$… and so on until we reach

$$ 0=(S_{n-1}S_n)v=S_{n-1}(S_nv) $$

Since $S_{n-1}$ is injective, then this implies that $S_nv=0$. Since $S_n$ is injective, this implies that $v=0$. Hence $\mathscr{N}(S_1\dotsb S_n)=\{0\}$ and $S_1\dotsb S_n$ is injective. $\blacksquare$

(12) Suppose $V$ is finite-dimensional and $T\in\lnmpsb(V,W)$. Then there exists a subspace $U$ of $V$ such that $U\cap\mathscr{N}(T)=\{0\}$ and $\mathscr{R}(T)=\{Tu:u\in U\}$.

Proof Since $\mathscr{N}(T)$ is a subspace of $V$, proposition 2.34, p.42 gives that there exists a subspace $U$ of $V$ such that $V=\mathscr{N}(T)\oplus U$. Proposition 1.45, p.23 gives that $\mathscr{N}(T)\cap U=\{0\}$.

Let $\psi\in\{Tu:u\in U\}$. Then there exists $\mu\in U$ such that $\psi=T\mu$. Since $\mu\in U\subset V$, then $\psi\in\{Tv:v\in V\}$. Hence

$$ \{Tu:u\in U\}\subset\{Tv:v\in V\}\equiv\mathscr{R}(T) $$

In the other direction, let $r\in\mathscr{R}(T)$. There exists $v\in V$ such that $r=Tv$. Since $V=\mathscr{N}(T)\oplus U$, then $v=\phi+\mu$ for some $\phi\in\mathscr{N}(T)$ and some $\mu\in U$. Hence

$$ r=Tv=T(\phi+\mu)=T\phi+T\mu=T\mu\in\{Tu:u\in U\} $$

Hence $\mathscr{R}(T)\subset\{Tu:u\in U\}$. $\blacksquare$

(12+) Suppose $V$ is finite-dimensional and $T\in\lnmpsb(V,W)$. Then there exists a subspace $U$ of $V$ such that $U\cap\mathscr{N}(T)=\{0\}$ and $T\bar_U$ is injective.

Proof Take $U$ as in the previous problem. The condition $\mathscr{N}(T)\cap U=\{0\}$ says that the only vector in $U$ that $T$ maps to zero is the zero vector. That is, $\mathscr{N}(T\bar_U)=\{0\}$, so $T\bar_U$ is injective. $\blacksquare$

(13) Suppose $T$ is a linear map from $\mathbb{F}^4$ to $\mathbb{F}^2$ such that

$$ \mathscr{N}(T)=\big\{(x_1,x_2,x_3,x_4)\in\mathbb{F}^4:x_1=5x_2\text{ and }x_3=7x_4\big\} $$

Then $T$ is surjective.

Proof Let $(x_1,x_2,x_3,x_4)$ be any vector in $\mathscr{N}(T)$. Then

$$\begin{align*} (x_1,x_2,x_3,x_4) &= (5x_2,x_2,7x_4,x_4)\\ &= (5x_2,x_2,0,0)+(0,0,7x_4,x_4)\\ &= x_2(5,1,0,0)+x_4(0,0,7,1)\\ \end{align*}$$

Then $(5,1,0,0),(0,0,7,1)$ is a basis for $\mathscr{N}(T)$ and $\dim{\mathscr{N}(T)}=2$. Then the Fundamental Theorem of Linear Maps 3.22, p.63 gives

$$ \dim{\mathscr{R}(T)}=\dim{\mathbb{F}^4}-\dim{\mathscr{N}(T)}=4-2=2=\dim{\mathbb{F}^2} $$

Since $\mathscr{R}(T)$ is a subspace of $\mathbb{F}^2$, then proposition W.2.12 in our ch.2 notes gives $\mathscr{R}(T)=\mathbb{F}^2$ and $T$ is surjective. $\blacksquare$
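As a numeric check (a sketch assuming Python with numpy; the particular formula $T(x)=(x_1-5x_2,\,x_3-7x_4)$ is my own choice of a map with this null space, not given in the problem):

```python
import numpy as np

T = np.array([[1.0, -5.0, 0.0, 0.0],        # T(x) = (x1 - 5*x2, x3 - 7*x4)
              [0.0,  0.0, 1.0, -7.0]])
print(T @ np.array([5.0, 1.0, 0.0, 0.0]))   # [0. 0.]: first basis vector of null(T)
print(T @ np.array([0.0, 0.0, 7.0, 1.0]))   # [0. 0.]: second basis vector of null(T)
print(np.linalg.matrix_rank(T))             # 2 = dim F^2, so T is surjective
```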

(14) Suppose $U$ is a $3$-dimensional subspace of $\mathbb{R}^8$ and $T$ is a linear map from $\mathbb{R}^8$ to $\mathbb{R}^5$ such that $\mathscr{N}(T)=U$. Then $T$ is surjective.

Solution The Fundamental Theorem of Linear Maps 3.22, p.63 gives

$$ \dim{\mathscr{R}(T)}=\dim{\mathbb{R}^8}-\dim{\mathscr{N}(T)}=8-3=5=\dim{\mathbb{R}^5} $$

Since $\mathscr{R}(T)$ is a subspace of $\mathbb{R}^5$, then proposition W.2.12 in our ch.2 notes gives $\mathscr{R}(T)=\mathbb{R}^5$ and $T$ is surjective. $\blacksquare$

(15) There doesn’t exist a linear map from $\mathbb{F}^5$ to $\mathbb{F}^2$ whose null space equals

$$ U\equiv\big\{(x_1,x_2,x_3,x_4,x_5)\in\mathbb{F}^5:x_1=3x_2\text{ and }x_3=x_4=x_5\big\} $$

Proof Let $(x_1,x_2,x_3,x_4,x_5)$ be any vector in $U$. Then

$$\begin{align*} (x_1,x_2,x_3,x_4,x_5) &= (3x_2,x_2,x_5,x_5,x_5)\\ &= (3x_2,x_2,0,0,0)+(0,0,x_5,x_5,x_5)\\ &= x_2(3,1,0,0,0)+x_5(0,0,1,1,1)\\ \end{align*}$$

Then $(3,1,0,0,0),(0,0,1,1,1)$ is a basis for $U$ and $\dim{U}=2$. For any $T\in\lnmpsb(\mathbb{F}^5,\mathbb{F}^2)$, the Fundamental Theorem of Linear Maps 3.22, p.63 gives

$$ \dim{\mathscr{N}(T)}=\dim{\mathbb{F}^5}-\dim{\mathscr{R}(T)}=5-\dim{\mathscr{R}(T)}\geq3>\dim{U} $$

The first inequality holds because $\mathscr{R}(T)\subset\mathbb{F}^2$. Proposition W.2.12 in our ch.2 notes gives $\mathscr{N}(T)\neq U$. $\blacksquare$

(16) Suppose there exists a linear map $T$ on $V$ whose null space and range are both finite dimensional. Then $V$ is finite-dimensional.

Proof The hypothesis of the Fundamental Theorem of Linear Maps 3.22, p.63 is that $V$ is finite-dimensional. That assumption isn’t given here. So we can’t use that theorem.

Let $u_1,…,u_m$ span $\mathscr{N}(T)$ and let $w_1,…,w_n$ span $\mathscr{R}(T)$. Since $w_j\in\mathscr{R}(T)$, there exists $v_j\in V$ such that $w_j=Tv_j$.

Suppose $v\in V$. Let’s write $v$ as a linear combination of a list of vectors in $V$. Since $Tv\in\mathscr{R}(T)$, there exist $b_1,…,b_n\in\mathbb{F}$ such that

$$\begin{align*} Tv&=\sum_{j=1}^nb_jw_j=\sum_{j=1}^nb_jTv_j=T\Big(\sum_{j=1}^nb_jv_j\Big) \end{align*}$$

This is equivalent to $0=T(v-\sum_{j=1}^nb_jv_j)$. Hence $v-\sum_{j=1}^nb_jv_j\in\mathscr{N}(T)$ and there exist scalars $a_1,…,a_m$ such that

$$\begin{align*} v-\sum_{j=1}^nb_jv_j=\sum_{k=1}^ma_ku_k \end{align*}$$

Equivalently

$$\begin{align*} v=\sum_{k=1}^ma_ku_k+\sum_{j=1}^nb_jv_j \end{align*}$$

Hence we can write an arbitrary vector $v\in V$ as a linear combination of $u_1,…,u_m,v_1,…,v_n$. Hence $u_1,…,u_m,v_1,…,v_n$ spans $V$. By definition 2.10, p.30, $V$ is finite-dimensional. $\blacksquare$

(17) Suppose $V$ and $W$ are both finite-dimensional. Then there exists an injective linear map from $V$ to $W$ if and only if $\dim{V}\leq\dim{W}$.

Proof Suppose $T\in\lnmpsb(V,W)$ is injective. Proposition 3.16, p.61 gives that $\mathscr{N}(T)=\{0\}$. Then the Fundamental Theorem of Linear Maps 3.22, p.63 gives

$$ \dim{V}=\dim{\mathscr{N}(T)}+\dim{\mathscr{R}(T)}=\dim{\mathscr{R}(T)}\leq\dim{W} $$

Conversely, suppose $n=\dim{V}\leq\dim{W}=m$. Let $v_1,…,v_n$ and $w_1,…,w_m$ be bases for $V$ and $W$. Define $T\in\lnmpsb(V,W)$ as $Tv_k=w_k$ for $k=1,…,n$. This is well-defined because $n\leq m$, so the vectors $w_1,…,w_n$ all exist, and proposition 3.5, p.54 gives the existence of $T$. Proposition W.3.9 in our ch.3 notes gives that $T$ is injective since $Tv_1,…,Tv_n=w_1,…,w_n$ is linearly independent. $\blacksquare$

(18) Suppose $V$ and $W$ are both finite-dimensional. Then there exists a surjective linear map from $V$ to $W$ if and only if $\dim{V}\geq\dim{W}$.

We offer two different proofs.

Proof 1 Suppose $T\in\lnmpsb(V,W)$ is surjective. Then $\mathscr{R}(T)=W$ and proposition W.2.12 gives $\dim{\mathscr{R}(T)}=\dim{W}$. Hence

$$ \dim{W}=\dim{\mathscr{R}(T)}=\dim{V}-\dim{\mathscr{N}(T)}\leq\dim{V} $$

Conversely, suppose $\dim{W}\leq\dim{V}$. Let $v_1,…,v_n$ be a basis for $V$ and let $w_1,…,w_m$ be a basis for $W$ such that $m=\dim{W}\leq\dim{V}=n$. Then proposition 3.5, p.54 gives a unique $T\in\lnmpsb(V,W)$ such that

$$ Tv_k=\begin{cases}w_k&k=1,...,m\\0&k=m+1,...,n\end{cases} $$

Note that $Tv_1,…,Tv_n=w_1,…,w_m,0,…,0$ spans $W$. Then proposition $W.3.11$ in our ch.3 notes gives that $T$ is surjective. $\blacksquare$

Proof 2 Suppose $T\in\lnmpsb(V,W)$ is surjective. Then $\mathscr{R}(T)=W$ and proposition W.2.12 gives $\dim{\mathscr{R}(T)}=\dim{W}$. Hence

$$ \dim{W}=\dim{\mathscr{R}(T)}=\dim{V}-\dim{\mathscr{N}(T)}\leq\dim{V} $$

Conversely, suppose $\dim{W}\leq\dim{V}$. Let $v_1,…,v_n$ be a basis for $V$, let $w_1,…,w_m$ be a basis for $W$, and define $w_0\equiv0\in W$. For $v=\sum_{k=1}^na_kv_k\in V$, define $T$ as

$$ Tv=T\Big(\sum_{k=1}^na_kv_k\Big)\equiv\sum_{k=1}^ma_kw_k+\sum_{k=m+1}^na_kw_0=\sum_{k=1}^ma_kw_k $$

Note that $m=\dim{W}\leq\dim{V}=n$, so splitting the sum at index $m$ on the right side makes sense (if $m$ were bigger than $n$, it wouldn't). Proposition 3.5, p.54 gives the existence of such a $T\in\lnmpsb(V,W)$. In the proof of that proposition, the unique linear map is defined just as we have defined it here.

Let $w=\sum_{k=1}^mc_kw_k$ be any vector in $W$. Then for any scalars $c_{m+1},…,c_n$, we have

$$ w=\sum_{k=1}^mc_kw_k=\sum_{k=1}^mc_kw_k+\sum_{k=m+1}^nc_kw_0=T\Big(\sum_{k=1}^nc_kv_k\Big)\in\mathscr{R}(T)\quad\blacksquare $$

Proof 2 Original Suppose $T\in\lnmpsb(V,W)$ is surjective. Then $\mathscr{R}(T)=W$ and proposition W.2.12 gives $\dim{\mathscr{R}(T)}=\dim{W}$. Hence

$$ \dim{W}=\dim{\mathscr{R}(T)}=\dim{V}-\dim{\mathscr{N}(T)}\leq\dim{V} $$

Conversely, suppose $\dim{W}\leq\dim{V}$. Let $v_1,…,v_n$ be a basis for $V$ and let $w_1,…,w_m$ be a basis for $W$. For $v=\sum_{k=1}^na_kv_k\in V$, define $T$ as

$$ Tv=T\Big(\sum_{k=1}^na_kv_k\Big)\equiv\sum_{k=1}^ma_kw_k $$

Note that $m=\dim{W}\leq\dim{V}=n$ so that the $a_1,…,a_m$ and $w_1,…,w_m$ on the right side make sense (if $m$ were bigger than $n$, then these wouldn't make sense). Notice that the scalars $a_{m+1},…,a_n$ from the left side get lopped off and don't appear on the right side. Hence these scalars are irrelevant to the value that $Tv$ takes in $W$ and

$$ T\Big(\sum_{k=1}^ma_kv_k\Big)=\sum_{k=1}^ma_kw_k $$

Another way to see this: let $b_{m+1},…,b_n$ be any scalars. Note that

$$ T\Big(\sum_{k=1}^ma_kv_k+\sum_{k=m+1}^nb_kv_k\Big)=\sum_{k=1}^ma_kw_k=T\Big(\sum_{k=1}^na_kv_k\Big) $$

Then taking $0=b_{m+1}=\dotsb=b_n$, we again arrive at $T\big(\sum_{k=1}^ma_kv_k\big)=\sum_{k=1}^ma_kw_k$. Proposition 3.5, p.54 gives the existence of such a $T\in\lnmpsb(U,W)\subseteq\lnmpsb(V,W)$ where

$$ U\equiv\text{span}(v_1,...,v_m)\subseteq\text{span}(v_1,...,v_n)=V $$

In the proof of that proposition, the unique linear map is defined just as we have defined $T\in\lnmpsb(U,W)$.

Let $w=\sum_{k=1}^mc_kw_k$ be any vector in $W$. Then

$$ w=\sum_{k=1}^mc_kw_k=T\Big(\sum_{k=1}^mc_kv_k\Big)\in\mathscr{R}(T)\quad\blacksquare $$

(19) Suppose $V$ and $W$ are finite-dimensional and $U$ is a subspace of $V$. Then there exists $T\in\lnmpsb(V,W)$ such that $\mathscr{N}(T)=U$ if and only if $\dim{U}\geq\dim{V}-\dim{W}$.

For example, let $V=\mathbb{R}^3$, $W=\mathbb{R}^1$, and let $U=\text{span}(v_3)$ where $v_3=(0,0,1)$ is the third standard basis vector. Then

$$ \dim{U}=1<2=3-1=\dim{V}-\dim{W} $$

and there is no $T\in\lnmpsb(\mathbb{R}^3,\mathbb{R}^1)$ such that $\mathscr{N}(T)=\text{span}(v_3)$. But if we redefine $W=\mathbb{R}^2$, then we can define such a $T\in\lnmpsb(\mathbb{R}^3,\mathbb{R}^2)$ as

$$ Tv_1=w_1\quad\quad\quad Tv_2=w_2\quad\quad\quad Tv_3=0 $$

and proposition 3.5, p.54 gives the existence of a unique such linear map $T$. Let $u=cv_3\in U=\text{span}(v_3)$. Then $Tu=cTv_3=0$, so $U\subset\mathscr{N}(T)$. Conversely, write $v=av_1+bv_2+cv_3\in V$. Then $Tv=aw_1+bw_2$, which is zero only when $0=a=b$, that is, only when $v\in\text{span}(v_3)=U$. Thus $\mathscr{N}(T)=U$. Let's try the same thing with $W=\mathbb{R}^1$:

$$ Tv_1=w_1\quad\quad\quad Tv_2=w_1\quad\quad\quad Tv_3=0 $$

For $v=av_1-av_2$ with $a\neq0$, we get $Tv=aTv_1-aTv_2=aw_1-aw_1=0$. Hence $v\in\mathscr{N}(T)$ but $v=av_1-av_2\notin U$. Hence $U\neq\mathscr{N}(T)$.

Proof Suppose there exists $T\in\lnmpsb(V,W)$ such that $\mathscr{N}(T)=U$. Then

$$ \dim{U}=\dim{\mathscr{N}(T)}=\dim{V}-\dim{\mathscr{R}(T)}\geq\dim{V}-\dim{W} $$

Conversely, suppose $\dim{U}\geq\dim{V}-\dim{W}$. Let $u_1,…,u_m$ be a basis for $U$ and extend it to a basis $u_1,…,u_m,v_1,…,v_n$ for $V$. Let $w_1,…,w_p$ be a basis for $W$. Then for scalars $a_1,…,a_m,b_1,…,b_n$, define

$$ T\Big(\sum_{k=1}^ma_ku_k+\sum_{j=1}^nb_jv_j\Big)\equiv\sum_{k=1}^ma_kw_0+\sum_{j=1}^nb_jw_j=\sum_{j=1}^nb_jw_j $$

where we define $w_0\equiv0\in W$. Note that $w_n$ on the right side makes sense because

$$ m=\dim{U}\geq\dim{V}-\dim{W}=m+n-p\iff0\geq n-p\iff p\geq n $$

Since $w_0,w_1,…,w_n\in W$ and $u_1,…,u_m,v_1,…,v_n$ is a basis for $V$, then proposition 3.5, p.54 gives that $T$ is linear; the proof of that proposition defines $T$ in exactly this way.

Then it suffices to show that $\mathscr{N}(T)=U$. First suppose $r\in\mathscr{N}(T)$. Since $r\in V$, then for some scalars $c_1,…,c_m,d_1,…,d_n$, we have

$$ r=\sum_{k=1}^mc_ku_k+\sum_{j=1}^nd_jv_j $$

and

$$ 0=Tr=T\Big(\sum_{k=1}^mc_ku_k+\sum_{j=1}^nd_jv_j\Big)=\sum_{j=1}^nd_jw_j $$

But $w_1,…,w_n$ is linearly independent. Hence $0=d_1=\dotsb=d_n$ and $r=\sum_{k=1}^mc_ku_k\in U$.

Conversely suppose $u\in U$. Then for some scalars $\phi_1,…,\phi_m$, we have $u=\sum_{k=1}^m\phi_ku_k$ and

$$ Tu=T\Big(\sum_{k=1}^m\phi_ku_k\Big)=T\Big(\sum_{k=1}^m\phi_ku_k+\sum_{j=1}^n0v_j\Big)=\sum_{j=1}^n0w_j=0\quad\blacksquare $$

(20) Suppose $W$ is finite-dimensional and $T\in\lnmpsb(V,W)$. Then $T$ is injective if and only if there exists $S\in\lnmpsb(W,V)$ such that $ST$ is the identity map on $V$.

Proof Suppose $T$ is injective and define $S':\mathscr{R}(T)\mapsto V$ as

$$ S'(Tv)\equiv v $$

The injectivity of $T$ gives that each element in $\mathscr{R}(T)$ can be represented in the form $Tv$ for exactly one $v\in V$. For if $w=Tv$ and $w=Tu$, then $Tv=w=Tu$ and $T$’s injectivity implies that $v=u$.

Hence $S'$ is well-defined. (Conversely, if $S'$ is well-defined, then $T$ must be injective: if $u\neq v$ and $Tu=Tv$, then $u=S'(Tu)=S'(Tv)=v$, a contradiction.)

Note that $S'$ is linear. Suppose $w_1,w_2\in\mathscr{R}(T)$. Then $w_1=Tv_1$ for some $v_1\in V$, hence $S'w_1=S'(Tv_1)=v_1$. And $w_2=Tv_2$ for some $v_2\in V$, hence $S'w_2=S'(Tv_2)=v_2$. Note that $v_1+v_2\in V$, hence $T(v_1+v_2)\in\mathscr{R}(T)$ and

$$ S'(w_1+w_2)=S'(Tv_1+Tv_2)=S'\big(T(v_1+v_2)\big)=v_1+v_2=S'w_1+S'w_2 $$

Homogeneity follows similarly. In problem (11) from Exercises 3.A, we showed that $S'$ can be extended to $S\in\lnmpsb(W,V)$ such that $S=S'$ on $\mathscr{R}(T)$. Then, for $v\in V$, we get

$$ (ST)v=S(Tv)=S'(Tv)=v $$

Conversely, suppose there exists $S\in\lnmpsb(W,V)$ such that $ST$ is the identity map on $V$. If $u,v\in V$ are such that $Tu=Tv$, then

$$ u=(ST)u=S(Tu)=S(Tv)=(ST)v=v\quad\blacksquare $$
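For concrete matrices, the Moore-Penrose pseudoinverse produces one such left inverse $S$ when $T$ is injective (a sketch assuming Python with numpy; the random $T$ is a placeholder):

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.normal(size=(5, 3))                # a random injective map R^3 -> R^5
assert np.linalg.matrix_rank(T) == 3       # full column rank <=> injective
S = np.linalg.pinv(T)                      # one linear extension of S' to all of W
print(np.allclose(S @ T, np.eye(3)))       # True: ST is the identity on V
```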

(21) Suppose $V$ is finite-dimensional and $T\in\lnmpsb(V,W)$. Then $T$ is surjective if and only if there exists $S\in\lnmpsb(W,V)$ such that $TS$ is the identity map on $W$.

Proof Suppose $T$ is surjective so that $\mathscr{R}(T)=W$. By the Fundamental Theorem of Linear Maps, the subspace $\mathscr{R}(T)=W$ is finite-dimensional. Let $w_1,…,w_n$ be a basis for $W$. Since $T$ is surjective, there exist $v_k\in V$ such that $Tv_k=w_k$ for $k=1,…,n$. Proposition 3.5, p.54 gives the existence of a unique $S\in\lnmpsb(W,V)$ such that

$$ Sw_k=v_k\quad\text{for }k=1,...,n $$

Then for any $w=\sum_{k=1}^na_kw_k\in W$, we have

$$ (TS)w=T(Sw)=T\Big(S\Big[\sum_{k=1}^na_kw_k\Big]\Big)=T\Big(\sum_{k=1}^na_kSw_k\Big)=T\Big(\sum_{k=1}^na_kv_k\Big)=\sum_{k=1}^na_kTv_k=\sum_{k=1}^na_kw_k=w $$

Conversely, suppose there exists $S\in\lnmpsb(W,V)$ such that $TS$ is the identity map on $W$. For any $w\in W$, we have

$$ w=(TS)w=T(Sw) $$

Since $Sw\in V$, then $w\in\mathscr{R}(T)$ and $W=\mathscr{R}(T)$. Hence $T$ is surjective. $\blacksquare$
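Dually, the pseudoinverse produces a right inverse when $T$ is surjective (again a sketch assuming Python with numpy):

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.normal(size=(3, 5))                # a random surjective map R^5 -> R^3
assert np.linalg.matrix_rank(T) == 3       # full row rank <=> surjective
S = np.linalg.pinv(T)                      # sends each basis vector to one preimage
print(np.allclose(T @ S, np.eye(3)))       # True: TS is the identity on W
```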

(22) Suppose $U$ and $V$ are finite-dimensional vector spaces and $S\in\lnmpsb(V,W)$ and $T\in\lnmpsb(U,V)$. Then

$$ \dim{\mathscr{N}(ST)}\leq\dim{\mathscr{N}(S)}+\dim{\mathscr{N}(T)} $$

Proof Note that $ST:U\mapsto W$. Hence $\mathscr{N}(ST)\subset U$. Define $T':\mathscr{N}(ST)\mapsto V$ as

$$ T'u=Tu\quad\text{for }u\in\mathscr{N}(ST) $$

Then $T'$ is linear: let $u_1,u_2\in\mathscr{N}(ST)$. Since $\mathscr{N}(ST)$ is a subspace, $u_1+u_2\in\mathscr{N}(ST)$ and

$$ T'(u_1+u_2)=T(u_1+u_2)=Tu_1+Tu_2=T'u_1+T'u_2 $$

Homogeneity follows similarly. Note that $\mathscr{N}(ST)\subset U$ and we're given that $U$ is finite-dimensional. Hence $\mathscr{N}(ST)$ is finite-dimensional. We also showed that $T'\in\lnmpsb(\mathscr{N}(ST),V)$. So we're justified in applying the Fundamental Theorem of Linear Maps to $T'$:

$$\begin{align*} \dim{\mathscr{N}(ST)}&=\dim{\mathscr{N}(T')}+\dim{\mathscr{R}(T')}\\ &\leq\dim{\mathscr{N}(T)}+\dim{\mathscr{R}(T')}\\ &\leq\dim{\mathscr{N}(T)}+\dim{\mathscr{N}(S)}\\ \end{align*}$$

The first inequality holds because $\mathscr{N}(T')\subset\mathscr{N}(T)$: Let $r\in\mathscr{N}(T')$. Then $0=T'r=Tr$ so $r\in\mathscr{N}(T)$.

The second inequality holds because $\mathscr{R}(T')\subset\mathscr{N}(S)$: Let $t\in\mathscr{R}(T')$. Then there exists $u\in\mathscr{N}(ST)$ such that $t=T'u$ and

$$ St=S(T'u)=S(Tu)=(ST)u=0 $$

Note that $St$ makes sense because $t\in\mathscr{R}(T')\subset V$. And note that $(ST)u=0$ since $u\in\mathscr{N}(ST)$. Hence $t\in\mathscr{N}(S)$. $\blacksquare$
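A small deterministic check of the inequality (a sketch assuming Python with numpy; the matrices are chosen so the inequality is strict):

```python
import numpy as np

def nullity(A):
    return A.shape[1] - np.linalg.matrix_rank(A)

S = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])        # S: R^3 -> R^2, dim null S = 2
T = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 0.0]])             # T: R^2 -> R^3, dim null T = 1
print(nullity(S @ T), "<=", nullity(S) + nullity(T))   # 2 <= 3
```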

(26) Suppose $D\in\lnmpsb\big(\mathscr{P}(\mathbb{R}),\mathscr{P}(\mathbb{R})\big)$ is such that $\deg{Dp}=(\deg{p})-1$ for every nonconstant polynomial $p\in\mathscr{P}(\mathbb{R})$. Then $D$ is surjective.

The notation $D$ is used above to remind you of the differentiation map that sends a polynomial $p$ to $p'$. Without knowing the formula for the derivative of a polynomial (except that it reduces the degree by $1$), you can use the exercise above to show that for every polynomial $q\in\mathscr{P}(\mathbb{R})$, there exists a polynomial $p\in\mathscr{P}(\mathbb{R})$ such that $p'=q$.

Proof Definition 2.5, p.29 defines the span of a list as the set of all linear combinations of that list. A list is finite by definition, p.5. So how do we define span for infinite dimensional vector spaces? I found this definition online; it seems reasonable for now:

Definition Let $V$ be an infinite dimensional vector space. We say the sequence $v_1,v_2,…$ spans $V$ if each vector $v\in V$ is a finite linear combination $\phi_1v_1+\dotsb+\phi_nv_n$ of the $v_i$’s for some $n$. We also denote this as $\text{span}(v_1,v_2,…)=V$.

Note that if $\text{span}(v_1,…,v_n)=\text{span}(w_1,…,w_n)$ for every $n\in\mathbb{N}$, then $\text{span}(v_1,v_2,…)=\text{span}(w_1,w_2,…)$. Proof: let $v\in\text{span}(v_1,v_2,…)$. Then for some $n$, we have

$$ v=\phi_1v_1+\dotsb+\phi_nv_n\in\text{span}(v_1,...,v_n)=\text{span}(w_1,...,w_n)\subset\text{span}(w_1,w_2,...) $$

Example 2.16, p.32 showed that $\mathscr{P}(\mathbb{R})$ is an infinite dimensional vector space. But each polynomial $p\in\mathscr{P}(\mathbb{R})$ is a finite linear combination $p=\sum_{j=0}^m\phi_jx^j$ of the polynomials $1,x,x^2,x^3,…,x^m$ for some $m$. Hence the sequence $1,x,x^2,…$ spans $\mathscr{P}(\mathbb{R})$.

Suppose $p_0,p_1,p_2,…,p_m\in\mathscr{P}(\mathbb{R})$ are such that each $p_j$ has degree $j$. In exercise 2.C.10, we proved that

$$ \text{span}(p_0,p_1,p_2,...,p_m)=\text{span}(1,x,x^2,...,x^m) $$

We can take $p_j\equiv Dx^{j+1}$ since the right side has degree $j+1-1=j$. Then for every $m=1,2,…$, we have

$$ \text{span}(Dx^1,Dx^2,Dx^3,...,Dx^{m+1})=\text{span}(1,x,x^2,...,x^m) $$

and it follows that

$$ \text{span}(Dx^1,Dx^2,Dx^3,...)=\text{span}(1,x,x^2,...)=\mathscr{P}(\mathbb{R}) $$

Let $q\in\text{span}(Dx^1,Dx^2,Dx^3,…)$. Then for some $n$, we have $q=\sum_{j=1}^n\psi_jDx^j=D\big(\sum_{j=1}^n\psi_jx^j\big)$. Hence $q\in\mathscr{R}(D)$ and $\text{span}(Dx^1,Dx^2,Dx^3,…)\subset\mathscr{R}(D)$. Hence

$$ \mathscr{P}(\mathbb{R})=\text{span}(Dx^1,Dx^2,Dx^3,...)\subset\mathscr{R}(D)\subset\mathscr{P}(\mathbb{R}) $$

and we have $\mathscr{P}(\mathbb{R})=\mathscr{R}(D)$. $\blacksquare$

(27) Suppose $p\in\mathscr{P}(\mathbb{R})$. Then there exists a polynomial $q\in\mathscr{P}(\mathbb{R})$ such that $5q^{\prime\prime}+3q^\prime=p$.

This exercise can be done without linear algebra, but it’s more fun to do it using linear algebra.

Proof Let $D$ be the differentiation operator as defined in example 3.4, p.53. For any $a\in\mathscr{P}(\mathbb{R})$ with $\deg{a}\neq0$ (nonconstant), we have

$$ Da=D\Big(\sum_{i=0}^{\deg{a}}c_ix^i\Big)=\sum_{i=0}^{\deg{a}}c_iDx^i=\sum_{i=0}^{\deg{a}}c_i(x^i)^\prime=\sum_{i=1}^{\deg{a}}ic_ix^{i-1} $$

Hence $\deg{Da}=(\deg{a})-1$ for every nonconstant polynomial $a\in\mathscr{P}(\mathbb{R})$. Hence $D$ satisfies the hypothesis from problem (26).

From definition 3.8, p.55 of the product (composite) of linear maps, we can easily verify that the product of two linear maps is linear. Hence $5D^2+3D\in\lnmpsb\big(\mathscr{P}(\mathbb{R}),\mathscr{P}(\mathbb{R})\big)$. And we can show that this map also satisfies the hypothesis from problem (26):

$$\begin{align*} (5D^2+3D)a&=(5D^2+3D)\Big(\sum_{i=0}^{\deg{a}}c_ix^i\Big) \\ &=\sum_{i=0}^{\deg{a}}c_i(5D^2+3D)x^i \\ &=\sum_{i=0}^{\deg{a}}c_i\big[5D^2(x^i)+3D(x^i)\big] \\ &=\sum_{i=0}^{\deg{a}}c_i\big[5(x^i)^{\prime\prime}+3(x^i)^{\prime}\big] \\ &=\sum_{i=0}^{\deg{a}}c_i5(x^i)^{\prime\prime}+\sum_{i=0}^{\deg{a}}c_i3(x^i)^{\prime} \\ &=\sum_{i=2}^{\deg{a}}i(i-1)c_i5x^{i-2}+\sum_{i=1}^{\deg{a}}ic_i3x^{i-1} \end{align*}$$

Hence $\deg{(5D^2+3D)a}=(\deg{a})-1$ for every nonconstant polynomial $a\in\mathscr{P}(\mathbb{R})$. Hence $5D^2+3D$ satisfies the hypothesis from problem (26), and we proved there that such maps are surjective. Hence there exists $q\in\mathscr{P}(\mathbb{R})$ such that

$$ p=(5D^2+3D)q=5D^2q+3Dq=5q^{\prime\prime}+3q^\prime\quad\blacksquare $$
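We can also solve $5q''+3q'=p$ concretely by writing $5D^2+3D$ as a matrix from $\mathscr{P}_{n+1}(\mathbb{R})$ to $\mathscr{P}_n(\mathbb{R})$ in the monomial basis (a sketch assuming Python with numpy; the example $p$ is an arbitrary choice):

```python
import numpy as np
from numpy.polynomial import polynomial as P

p = np.array([1.0, 2.0, 3.0])           # p(x) = 1 + 2x + 3x^2, so deg p = n = 2
n = len(p) - 1
# Matrix of q |-> 5q'' + 3q' from P_{n+1} to P_n, acting on monomial coefficients:
A = np.zeros((n + 1, n + 2))
for j in range(1, n + 2):               # (5D^2+3D)x^j = 5j(j-1)x^{j-2} + 3jx^{j-1}
    A[j - 1, j] += 3 * j
    if j >= 2:
        A[j - 2, j] += 5 * j * (j - 1)
q = np.linalg.lstsq(A, p, rcond=None)[0]   # one solution (constant term of q is free)
check = P.polyadd(5 * P.polyder(q, 2), 3 * P.polyder(q))
print(np.allclose(check, p))               # True: 5q'' + 3q' = p
```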

Exercises 3.C

(1) Suppose $V$ and $W$ are finite-dimensional and $T\in\lnmpsb(V,W)$. Then, with respect to each choice of bases of $V$ and $W$, the matrix of $T$ has at least $\dim{\mathscr{R}(T)}$ nonzero entries.

Proof Fix the bases $v_1,…,v_n$ for $V$ and $w_1,…,w_m$ for $W$. Suppose $\mtrxof{T}$ has fewer than $\dim{\mathscr{R}(T)}$ nonzero entries. Note that the number of nonzero columns in $\mtrxof{T}$ is less than or equal to the number of nonzero entries. Hence the number of nonzero columns in $\mtrxof{T}$ is less than $\dim{\mathscr{R}(T)}$.

Proposition W.3.14 gives that column $k$ is zero if and only if $Tv_k$ is zero. Hence the number of nonzero $Tv_1,…,Tv_n$ is less than $\dim{\mathscr{R}(T)}$. Let $N$ denote the number of nonzero $Tv_1,…,Tv_n$ so that $N<\dim{\mathscr{R}(T)}$.

Let $L$ denote the length of a maximal linearly independent sublist of $Tv_1,…,Tv_n$. Since a linearly independent list cannot contain the zero vector, $L\leq N$. Proposition W.3.13 gives that $L=\dim{\text{span}(Tv_1,…,Tv_n)}$. Hence

$$ \dim{\text{span}(Tv_1,...,Tv_n)}=L\leq N $$

But proposition W.3.10 gives that $\mathscr{R}(T)=\text{span}(Tv_1,…,Tv_n)$. Hence

$$ \dim{\mathscr{R}(T)}=\dim{\text{span}(Tv_1,...,Tv_n)}\leq N<\dim{\mathscr{R}(T)} $$

Contradiction. $\blacksquare$

(2) Suppose $D\in\lnmpsb\big(\mathscr{P}_3(\mathbb{R}),\mathscr{P}_2(\mathbb{R})\big)$ is the differentiation map defined by $Dp=p'$. Find a basis of $\mathscr{P}_3(\mathbb{R})$ and a basis of $\mathscr{P}_2(\mathbb{R})$ such that the matrix of $D$ with respect to these bases is

$$ \begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\\\end{bmatrix} $$

Compare this with our detailed implementation of example 3.34 in our ch.3 notes. The next problem generalizes this.

Solution We want to find bases $v_1,v_2,v_3,v_4$ of $\mathscr{P}_3(\mathbb{R})$ and $w_1,w_2,w_3$ of $\mathscr{P}_2(\mathbb{R})$ such that

$$ \mtrxof{D}\equiv\begin{bmatrix}A_{1,1}&A_{1,2}&A_{1,3}&A_{1,4}\\A_{2,1}&A_{2,2}&A_{2,3}&A_{2,4}\\A_{3,1}&A_{3,2}&A_{3,3}&A_{3,4}\end{bmatrix}=\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\\\end{bmatrix} $$

and

$$\begin{matrix} Dv_1=\sum_{j=1}^3A_{j,1}w_j=1\cdot w_1+0\cdot w_2+0\cdot w_3=w_1 \\ Dv_2=\sum_{j=1}^3A_{j,2}w_j=0\cdot w_1+1\cdot w_2+0\cdot w_3=w_2 \\ Dv_3=\sum_{j=1}^3A_{j,3}w_j=0\cdot w_1+0\cdot w_2+1\cdot w_3=w_3 \\ Dv_4=\sum_{j=1}^3A_{j,4}w_j=0\cdot w_1+0\cdot w_2+0\cdot w_3=0 \end{matrix}$$

We first see that $v_4$ must be a constant polynomial since $Dv_4=0$. Set $v_4\equiv1$.

Since $w_1=Dv_1\in\mathscr{P}_2(\mathbb{R})$, then $w_1=Dv_1=\sum_{i=0}^2b_ix^i$. The only constraint is that $v_1$ be nonconstant so that $v_1,v_4$ is linearly independent. Set $v_1\equiv x$, the degree 1 polynomial. Then $w_1=Dv_1=1$.

Since $w_2=Dv_2\in\mathscr{P}_2(\mathbb{R})$, then $w_2=Dv_2=\sum_{i=0}^2c_ix^i$. The constraints are that $v_1,v_2,v_4$ be linearly independent and that $w_1,w_2$ be linearly independent. So we must pick $v_2$ such that $x,v_2,1$ is linearly independent and $1,Dv_2$ is linearly independent. Set $v_2\equiv x^2$ so that $x,x^2,1$ is linearly independent, $w_2=Dv_2=2x$, and $1,2x$ is linearly independent.

So far we have $v_1=x,v_2=x^2,v_4=1$ and $w_1=1,w_2=2x$. Hence we want to pick $v_3$ such that $x,x^2,1,v_3$ is linearly independent and $1,2x,Dv_3$ is linearly independent. Set $v_3\equiv x^3$ so that $w_3=Dv_3=3x^2$.

$$ \begin{matrix} v_1=x\\v_2=x^2\\v_3=x^3\\v_4=1 \end{matrix} \quad\quad \begin{matrix} w_1=1\\w_2=2x\\w_3=3x^2 \end{matrix} $$
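As a sanity check (a sketch assuming Python with numpy), we can compute $\mtrxof{D}$ with respect to these bases and confirm it equals the required matrix:

```python
import numpy as np
from numpy.polynomial import polynomial as P

v_basis = [[0, 1], [0, 0, 1], [0, 0, 0, 1], [1]]         # x, x^2, x^3, 1 as coefficients
w_basis = np.array([[1.0, 0, 0], [0, 2, 0], [0, 0, 3]])  # 1, 2x, 3x^2 as coefficients

M = np.zeros((3, 4))
for k, v in enumerate(v_basis):
    dv = np.zeros(3)
    d = P.polyder(np.array(v, dtype=float))     # coefficients of Dv_k = v_k'
    dv[: len(d)] = d
    M[:, k] = np.linalg.solve(w_basis.T, dv)    # coordinates of Dv_k in the w-basis
print(M)   # [[1. 0. 0. 0.], [0. 1. 0. 0.], [0. 0. 1. 0.]]
```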

(3) Suppose $V$ and $W$ are finite-dimensional and $T\in\lnmpsb(V,W)$. Then there exists a basis of $V$ and a basis of $W$ such that, with respect to these bases, all entries of $\mtrxof{T}$ are $0$ except that the entries in row $j$, column $j$, equal $1$ for $1\leq j\leq\dim{\mathscr{R}(T)}$.

Proof Similar to the proof of the Fundamental Theorem of Linear Maps (3.22, p.63), let $v_{n+1},…,v_{n+m}$ be a basis of $\mathscr{N}(T)$. This is a subspace of finite-dimensional $V$, so such a basis exists. The linearly independent list $v_{n+1},…,v_{n+m}$ can be extended to a basis

$$ v_{n+1},...,v_{n+m},v_1,...,v_n $$

of $V$, by proposition 2.33, p.41. Axler goes on to prove that $Tv_1,…,Tv_n$ is a basis of $\mathscr{R}(T)$. This implies that $n=\dim{\mathscr{R}(T)}$. This also implies that we can extend this to a basis for $W$:

$$ Tv_1,...,Tv_n,\phi_{n+1},...,\phi_{n+p} $$

Note that $0=Tv_k$ for $k=n+1,…,n+m$ since $v_k\in\mathscr{N}(T)$ for $k=n+1,…,n+m$. Also note that $Tv_k\in W$ hence $Tv_k$ has a unique representation in $W$ for the basis we just constructed:

$$ 0=Tv_k=\sum_{j=1}^{n}A_{j,k}Tv_j+\sum_{j=n+1}^{n+p}A_{j,k}\phi_j $$

We can prove in two different ways that $A_{j,k}=0$ for $j=1,…,n+p$ and $k=n+1,…,n+m$. The Criterion for a Basis (2.29, p.39) gives that these $A_{j,k}$’s are unique. And obviously $0=A_{j,k}$ is a solution. Alternatively, we can recognize that the constructed basis for $W$ is linearly independent hence $0=A_{j,k}$ for $j=1,…,n+p$ and $k=n+1,…,n+m$.

So we have shown that the last $m=\dim{\mathscr{N}(T)}$ columns of $\mtrxof{T}$ are zero.

Next we want to show that $A_{j,k}=0$ for $j=1,…,n+p$ and $k=1,…,n$ except that $A_{j,j}=1$ for $j=1,…,n=\dim{\mathscr{R}(T)}$. We see that this is indeed a solution to

$$ Tv_k=\sum_{j=1}^nA_{j,k}Tv_j+\sum_{j=n+1}^{n+p}A_{j,k}\phi_j $$

The Criterion for a Basis gives that the $A_{j,k}$ are unique. Hence the first $n$ columns of $\mtrxof{T}$ are zero except for the $1$’s in entries $A_{j,j}$. $\blacksquare$

(6) Suppose $V$ and $W$ are finite-dimensional and $T\in\lnmpsb(V,W)$. Then $\dim{\mathscr{R}(T)}=1$ if and only if there exist a basis of $V$ and a basis of $W$ such that, with respect to these bases, all entries of $\mtrxof{T}$ equal $1$.

Proof Suppose there exist a basis $v_1,…,v_n$ of $V$ and a basis $w_1,…,w_m$ of $W$ such that, with respect to these bases, all entries of $\mtrxof{T}$ equal $1$. That is $A_{j,k}=1$ for $k=1,…,n$ and $j=1,…,m$. Then, for $k=1,…,n$, we have

$$ Tv_k=\sum_{j=1}^mA_{j,k}w_j=\sum_{j=1}^mw_j\neq0 $$

That is $Tv_1=Tv_2=\dotsb=Tv_n=\sum_{j=1}^mw_j$. Proposition W.3.10 gives that $Tv_1,…,Tv_n$ spans $\mathscr{R}(T)$ hence

$$ \mathscr{R}(T)=\text{span}(Tv_1,...,Tv_n)=\text{span}(Tv_1)=\text{span}\Big(\sum_{j=1}^mw_j\Big) $$

Since $w_1,…,w_m$ is linearly independent, then $0\neq\sum_{j=1}^mw_j$ and $\mathscr{R}(T)$ is the span of one nonzero vector. Hence $\dim{\mathscr{R}(T)}=1$.

Conversely, suppose $\dim{\mathscr{R}(T)}=1$. Let $\phi_1,…,\phi_n$ be a basis of $V$ such that $\phi_2,…,\phi_n\in\mathscr{N}(T)$ and $T\phi_1\neq0$. Note that such a basis exists because

$$ \dim{\mathscr{N}(T)}=\dim{V}-\dim{\mathscr{R}(T)}=n-1 $$

Hence $\mathscr{N}(T)$ has a basis $\phi_2,…,\phi_n$ of length $n-1$; extending it to a basis $\phi_1,…,\phi_n$ of $V$ forces $\phi_1\notin\mathscr{N}(T)$, that is, $T\phi_1\neq0$. Also, proposition W.3.10 gives that

$$ \mathscr{R}(T)=\text{span}(T\phi_1,...,T\phi_n)=\text{span}(T\phi_1) $$

This agrees with our supposition that $\dim{\mathscr{R}(T)}=1$ since $0\neq T\phi_1$ so that $T\phi_1$ is a basis of $\mathscr{R}(T)$.

Since $T\phi_1$ is a basis of $\mathscr{R}(T)$, we can extend it to a basis $T\phi_1,w_2,…,w_m$ of $W$. Define $w_1\equiv T\phi_1-\sum_{j=2}^mw_j$. Proposition W.2.18 gives that $w_1,…,w_m$ is a basis of $W$.

Also define

$$ v_1=\phi_1\quad\quad\text{and}\quad\quad v_k=\phi_k+\phi_1\quad\text{for }k=2,...,n $$

Again proposition W.2.18 gives that $v_1,…,v_n$ is a basis of $V$. And we see that

$$ Tv_1=T\phi_1=w_1+\sum_{j=2}^mw_j=\sum_{j=1}^mw_j $$

and

$$ Tv_k=T\phi_k+T\phi_1=0+\sum_{j=1}^mw_j\quad\text{for }k=2,...,n $$

Hence the entries $A_{j,k}$ of $\mtrxof{T}$ are all $1$. $\blacksquare$

(7) Suppose $S,T\in\lnmpsb(V,W)$. Then $\mtrxof{S+T}=\mtrxof{S}+\mtrxof{T}$.

Proof Let $[A_{j,k}]_{j=1,k=1}^{m,n}$ be $\mtrxof{S}$ and let $[C_{j,k}]_{j=1,k=1}^{m,n}$ be $\mtrxof{T}$. By definition 3.35, $\mtrxof{S+T}=[A_{j,k}+C_{j,k}]_{j=1,k=1}^{m,n}$. We need to verify that the definition of the matrix of a linear map (3.32, p.70) is satisfied. That is, the entries $A_{j,k}+C_{j,k}$ should be the coefficients $B_{j,k}$ in

$$ (S+T)v_k=\sum_{j=1}^mB_{j,k}w_j $$

Note that $[A_{j,k}]_{j=1,k=1}^{m,n}=\mtrxof{S}$ gives

$$ Sv_k=\sum_{j=1}^mA_{j,k}w_j $$

And note that $[C_{j,k}]_{j=1,k=1}^{m,n}=\mtrxof{T}$ gives

$$ Tv_k=\sum_{j=1}^mC_{j,k}w_j $$

Hence

$$ (S+T)v_k=Sv_k+Tv_k=\sum_{j=1}^mA_{j,k}w_j+\sum_{j=1}^mC_{j,k}w_j=\sum_{j=1}^m(A_{j,k}+C_{j,k})w_j\quad\blacksquare $$

(9) Suppose $A$ is an $m\times n$ matrix and $c$ is an $n\times1$ matrix:

$$ A\equiv\begin{bmatrix}A_{1,1}&A_{1,2}&\dots&A_{1,n}\\A_{2,1}&A_{2,2}&\dots&A_{2,n}\\\vdots&\vdots&\ddots&\vdots\\A_{m,1}&A_{m,2}&\dots&A_{m,n}\end{bmatrix} \quad\quad\quad\quad\quad c\equiv\begin{bmatrix}c_1\\\vdots\\c_n\end{bmatrix} $$

Then

$$ Ac=\sum_{k=1}^nc_kA_{:,k} $$

Proof Definition 3.41, p.75 of matrix multiplication gives that $Ac$ is an $m\times 1$ matrix with

$$ (Ac)_{j,1}=\sum_{k=1}^nA_{j,k}c_{k,1} $$

Hence

$$\begin{align*} Ac&=\begin{bmatrix}(Ac)_{1,1}\\\vdots\\(Ac)_{m,1}\end{bmatrix} \\\\ &=\begin{bmatrix}\sum_{k=1}^nA_{1,k}c_{k,1}\\\vdots\\\sum_{k=1}^nA_{m,k}c_{k,1}\end{bmatrix} \\\\ &=\begin{bmatrix}\sum_{k=1}^nc_{k}A_{1,k}\\\vdots\\\sum_{k=1}^nc_{k}A_{m,k}\end{bmatrix} \\\\ &=\sum_{k=1}^n\begin{bmatrix}c_{k}A_{1,k}\\\vdots\\c_{k}A_{m,k}\end{bmatrix} \\\\ &=\sum_{k=1}^nc_{k}\begin{bmatrix}A_{1,k}\\\vdots\\A_{m,k}\end{bmatrix} \\\\ &=\sum_{k=1}^nc_kA_{:,k}\quad\blacksquare \end{align*}$$
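In code the identity is one line (a sketch assuming Python with numpy; the random $A$ and $c$ are placeholders):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(4, 3))                     # a random 4x3 matrix
c = rng.normal(size=3)                          # a random 3x1 column
cols = sum(c[k] * A[:, k] for k in range(3))    # the linear combination of columns
print(np.allclose(A @ c, cols))                 # True: Ac = sum_k c_k A_{:,k}
```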

Exercises 3.D

(1) Suppose $T\in\lnmpsb(U,V)$ and $S\in\lnmpsb(V,W)$ are both invertible. Then $ST\in\lnmpsb(U,W)$ is invertible and $(ST)^{-1}=T^{-1}S^{-1}$.

Proof From definition 3.53, p.80, $ST$ is invertible if there exists $(ST)^{-1}\in\lnmpsb(W,U)$ such that $(ST)^{-1}ST=I\in\lnmpsb(U,U)$ and $ST(ST)^{-1}=I\in\lnmpsb(W,W)$. Set $(ST)^{-1}\equiv T^{-1}S^{-1}$. This makes sense since $S$ and $T$ are both invertible. Then

$$ (ST)(T^{-1}S^{-1})w=S(TT^{-1})S^{-1}w=SIS^{-1}w=SS^{-1}w=Iw=w $$

and

$$ (T^{-1}S^{-1})(ST)u=T^{-1}(S^{-1}S)Tu=T^{-1}ITu=T^{-1}Tu=Iu=u $$

Proposition 3.54, p.80 gives that $(ST)^{-1}=T^{-1}S^{-1}$ is unique. $\blacksquare$
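A quick numeric check (a sketch assuming Python with numpy; random square matrices are invertible with probability $1$):

```python
import numpy as np

rng = np.random.default_rng(4)
S = rng.normal(size=(3, 3))
T = rng.normal(size=(3, 3))
inv = np.linalg.inv
print(np.allclose(inv(S @ T), inv(T) @ inv(S)))   # True: (ST)^{-1} = T^{-1} S^{-1}
```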

(2) Suppose $V$ is finite-dimensional and $\dim{V}>1$. Then the set of noninvertible operators on $V$ is not a subspace of $\lnmpsb(V)$.

Proof Let $v_1,…,v_n$ be a basis of $V$ and define $S,T\in\lnmpsb(V)$ by

$$ Sv_k\equiv\begin{cases}v_1&k=1\\0&k=2,...,n\end{cases}\quad\quad\quad Tv_k\equiv\begin{cases}0&k=1\\v_k&k=2,...,n\end{cases} $$

Then $S$ is not injective because $Sv_2=0$ (this is where we use the hypothesis that $\dim{V}>1$) and $T$ is not injective because $Tv_1=0$. Hence proposition 3.69, p.87 gives that both $S$ and $T$ are not invertible and hence belong to the set of noninvertible operators on $V$. But $(S+T)v_k=v_k$ hence

$$ (S+T)v=(S+T)\Big(\sum_{k=1}^nc_kv_k\Big)=\sum_{k=1}^nc_k(S+T)v_k=\sum_{k=1}^nc_kv_k=v $$

Hence $S+T=I$ and $S+T$ is invertible. Hence the set of noninvertible operators on $V$ is not closed under addition and hence it’s not a subspace of $\lnmpsb(V)$. $\blacksquare$

Note If $\dim{V}=1$, then the set of noninvertible operators on $V$ equals $\{0\}$, which is a subspace of $\lnmpsb(V)$.
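The $\dim{V}=2$ construction from the proof, numerically (a sketch assuming Python with numpy):

```python
import numpy as np

S = np.diag([1.0, 0.0])   # Sv1 = v1, Sv2 = 0: not injective, hence not invertible
T = np.diag([0.0, 1.0])   # Tv1 = 0, Tv2 = v2: likewise not invertible
print(S + T)              # the 2x2 identity, which certainly is invertible
```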

(3) Suppose $V$ is finite-dimensional, $U$ is a subspace of $V$, and $S\in\lnmpsb(U,V)$. Then there exists an invertible operator $T\in\lnmpsb(V)$ such that $Tu=Su$ for every $u\in U$ if and only if $S$ is injective.

Proof Suppose $S$ is injective. Let $u_1,…,u_m$ be a basis of $U$ and extend it to a basis $u_1,…,u_m,v_{m+1},…,v_n$ of $V$. Proposition W.3.8 gives that $Su_1,…,Su_m$ is linearly independent in $V$. Hence we can extend it to a basis $Su_1,…,Su_m,w_{m+1},…,w_n$ of $V$. Proposition 3.5, p.54 gives the existence of $T\in\lnmpsb(V)$ such that

$$ Tu_k\equiv Su_k\quad\text{for }k=1,...,m\quad\quad\quad\quad Tv_k\equiv w_k\quad\text{for }k=m+1,...,n $$

Let $u=\sum_{k=1}^ma_ku_k\in U$. Then

$$ Tu=T\Big(\sum_{k=1}^ma_ku_k\Big)=\sum_{k=1}^ma_kTu_k=\sum_{k=1}^ma_kSu_k=S\Big(\sum_{k=1}^ma_ku_k\Big)=Su $$

Note that $u_1,…,u_m,v_{m+1},…,v_n$ is a basis of $V$ and hence spans it. And note that $Tu_1,…,Tu_m,Tv_{m+1},…,Tv_n$ is a basis of $V$ and hence spans it. Then proposition W.3.11 gives that $T$ is surjective. And proposition 3.69, p.87 gives that $T$ is invertible.

Conversely, suppose there exists an invertible operator $T\in\lnmpsb(V)$ such that $Tu=Su$ for every $u\in U$. Proposition 3.69 gives that $T$ is injective. Hence $S=T\bar_U$ is injective. $\blacksquare$

(4) Suppose $W$ is finite-dimensional and $T_1,T_2\in\lnmpsb(V,W)$. Then $\mathscr{N}(T_1)=\mathscr{N}(T_2)$ if and only if there exists an invertible operator $S\in\lnmpsb(W)$ such that $T_1=ST_2$.

Proof First suppose that $\mathscr{N}(T_1)=\mathscr{N}(T_2)$. Since $W$ is finite-dimensional, then so is $\mathscr{R}(T_2)\subset W$. Let $w_1,…,w_n$ be a basis of $\mathscr{R}(T_2)$. Since $w_1,…,w_n\in\mathscr{R}(T_2)$, then there exist $v_1,…,v_n\in V$ such that

$$ T_2v_k=w_k\quad\quad k=1,...,n $$

We want to show that $V=\mathscr{N}(T_2)\oplus\text{span}(v_1,…,v_n)$. Let $v\in V$. The Criterion for a Basis (p.39) gives that $T_2v\in\mathscr{R}(T_2)$ has a unique representation in terms of the basis $w_1,…,w_n$:

$$ T_2v=\sum_{k=1}^na_kw_k=\sum_{k=1}^na_kT_2v_k=T_2\Big(\sum_{k=1}^na_kv_k\Big) $$

Subtracting the right side from both sides, we get

$$ 0=T_2v-T_2\Big(\sum_{k=1}^na_kv_k\Big)=T_2\Big(v-\sum_{k=1}^na_kv_k\Big) $$

Hence

$$ v-\sum_{k=1}^na_kv_k\in\mathscr{N}(T_2) \dq\text{and}\dq \sum_{k=1}^na_kv_k\in\text{span}(v_1,...,v_n) $$

Now we see that the arbitrary vector $v\in V$ can be written as the sum of a vector from $\mathscr{N}(T_2)$ and a vector from $\text{span}(v_1,…,v_n)$:

$$ v=v-0=v-\sum_{k=1}^na_kv_k+\sum_{k=1}^na_kv_k $$

Hence $V=\mathscr{N}(T_2)+\text{span}(v_1,…,v_n)$. To show this is a direct sum, let $\phi\in\mathscr{N}(T_2)\cap\text{span}(v_1,…,v_n)$. Then $\phi\in\text{span}(v_1,…,v_n)$ and it can be written as a linear combination $\phi=\sum_{k=1}^n\phi_kv_k$. And $\phi\in\mathscr{N}(T_2)$ so $\sum_{k=1}^n\phi_kv_k=\phi\in\mathscr{N}(T_2)$. Hence

$$ 0=T_2\phi=T_2\Big(\sum_{k=1}^n\phi_kv_k\Big)=\sum_{k=1}^n\phi_kT_2v_k=\sum_{k=1}^n\phi_kw_k $$

The linear independence of $w_1,…,w_n$ implies that $0=\phi_1=\dotsb=\phi_n$. Hence $0=\phi$ and $\{0\}=\mathscr{N}(T_2)\cap\text{span}(v_1,…,v_n)$. Hence

$$ V=\mathscr{N}(T_2)\oplus\text{span}(v_1,...,v_n) $$

Next let’s show that $T_1v_1,…,T_1v_n$ is linearly independent:

$$ 0=\sum_{k=1}^n\psi_kT_1v_k=T_1\Big(\sum_{k=1}^n\psi_kv_k\Big) $$

Hence $\sum_{k=1}^n\psi_kv_k\in\mathscr{N}(T_1)=\mathscr{N}(T_2)$:

$$ 0=T_2\Big(\sum_{k=1}^n\psi_kv_k\Big)=\sum_{k=1}^n\psi_kT_2v_k=\sum_{k=1}^n\psi_kw_k $$

The linear independence of $w_1,…,w_n$ implies that $0=\psi_1=\dotsb=\psi_n$. Hence $T_1v_1,…,T_1v_n$ is linearly independent.

Next, let's extend $w_1,…,w_n$ to a basis $w_1,…,w_n,\alpha_1,…,\alpha_m$ of $W$. And let's extend $T_1v_1,…,T_1v_n$ to a basis $T_1v_1,…,T_1v_n,\beta_1,…,\beta_m$ of $W$. Proposition 3.5, p.54 gives the existence of $S\in\lnmpsb(W)$ such that

$$ Sw_k\equiv T_1v_k\quad k=1,...,n\quad\quad\quad\quad S\alpha_k\equiv\beta_k\quad k=1,...,m $$

Since $S$ maps an arbitrary spanning list of the domain to a spanning list of the codomain, proposition W.3.11 gives that $S$ is surjective. Then proposition 3.69, p.87 gives that $S$ is invertible.

Let $\epsilon\in V=\mathscr{N}(T_2)\oplus\text{span}(v_1,…,v_n)$. Then

$$ \epsilon=\epsilon_{\mathscr{N}}+\sum_{k=1}^n\epsilon_kv_k $$

where $\epsilon_{\mathscr{N}}\in\mathscr{N}(T_2)=\mathscr{N}(T_1)$ and $\epsilon_k\in\mathbb{F}$. Then

$$\begin{align*} (ST_2)\epsilon &= (ST_2)\Big(\epsilon_{\mathscr{N}}+\sum_{k=1}^n\epsilon_kv_k\Big) \\ &= (ST_2)(\epsilon_{\mathscr{N}}) + (ST_2)\Big(\sum_{k=1}^n\epsilon_kv_k\Big) \\ &= S\big(T_2(\epsilon_{\mathscr{N}})\big) + S\Big[T_2\Big(\sum_{k=1}^n\epsilon_kv_k\Big)\Big] \\ &= S(0) + S\Big(\sum_{k=1}^n\epsilon_kT_2v_k\Big) \\ &= S\Big(\sum_{k=1}^n\epsilon_kw_k\Big) \\ &= \sum_{k=1}^n\epsilon_kSw_k \\ &= \sum_{k=1}^n\epsilon_kT_1v_k \\ &= T_1\Big(\sum_{k=1}^n\epsilon_kv_k\Big) \\ &= 0+T_1\Big(\sum_{k=1}^n\epsilon_kv_k\Big) \\ &= T_1(\epsilon_{\mathscr{N}})+T_1\Big(\sum_{k=1}^n\epsilon_kv_k\Big) \\ &= T_1\Big(\epsilon_{\mathscr{N}}+\sum_{k=1}^n\epsilon_kv_k\Big) \\ &= T_1(\epsilon) \end{align*}$$

Conversely, suppose there exists an invertible operator $S\in\lnmpsb(W)$ such that $ST_2=T_1$. Then for any $\mu\in\mathscr{N}(T_1)$, we have

$$ 0=T_1\mu=(ST_2)\mu=S(T_2\mu) $$

The invertibility of $S$ implies that $\mathscr{N}(S)=\{0\}$. Hence $T_2\mu=0$. Hence $\mu\in\mathscr{N}(T_2)$ and $\mathscr{N}(T_1)\subset\mathscr{N}(T_2)$.

In the other direction, note that the invertibility of $S$ gives

$$ S^{-1}T_1=S^{-1}ST_2=T_2 $$

Let $\mu\in\mathscr{N}(T_2)$. Then

$$ 0=T_2\mu=(S^{-1}T_1)\mu=S^{-1}(T_1\mu) $$

The invertibility of $S^{-1}$ implies that $\mathscr{N}(S^{-1})=\{0\}$. Hence $T_1\mu=0$. Hence $\mu\in\mathscr{N}(T_1)$ and $\mathscr{N}(T_2)\subset\mathscr{N}(T_1)$. $\blacksquare$

(5) Suppose $V$ is finite-dimensional and $T_1,T_2\in\lnmpsb(V,W)$. Then $\mathscr{R}(T_1)=\mathscr{R}(T_2)$ if and only if there exists an invertible operator $S\in\lnmpsb(V)$ such that $T_1=T_2S$.

Proof Suppose $\mathscr{R}(T_1)=\mathscr{R}(T_2)$. Since $V$ is finite-dimensional, so is the subspace $\mathscr{N}(T_1)$. Let $\phi_1,…,\phi_m$ be a basis of $\mathscr{N}(T_1)$ and extend it to a basis $\phi_1,…,\phi_m,u_1,…,u_n$ of $V$. Hence $\dim{V}=m+n$ and $\dim{\mathscr{N}(T_1)}=m$. Proposition W.3.10 gives that

$$ \mathscr{R}(T_1)=\text{span}(T_1\phi_1,...,T_1\phi_m,T_1u_1,...,T_1u_n)=\text{span}(T_1u_1,...,T_1u_n) $$

The last equality holds because $\phi_k\in\mathscr{N}(T_1)$, hence $T_1\phi_k=0$ for $k=1,…,m$. Hence the Linear Dependence Lemma, p.34 gives that we can remove $T_1\phi_1,…,T_1\phi_m$ from the full list without changing the span.

The proof of the Fundamental Theorem of Linear Maps, p.63 goes on to show that $T_1u_1,…,T_1u_n$ is linearly independent.

Since $T_1u_k\in\mathscr{R}(T_1)=\mathscr{R}(T_2)$, then there exist $v_1,…,v_n\in V$ such that $T_2v_k=T_1u_k$ for $k=1,…,n$. We proved in problem (4) of Exercise 3.A that $v_1,…,v_n$ is linearly independent.

Note that $\mathscr{R}(T_1)=\mathscr{R}(T_2)$ implies that $\dim{\mathscr{N}(T_2)}=\dim{\mathscr{N}(T_1)}=m$. Let $\psi_1,…,\psi_m$ be a basis for $\mathscr{N}(T_2)$ and let’s show that $\psi_1,…,\psi_m,v_1,…,v_n$ is linearly independent:

$$ 0=\sum_{k=1}^m\alpha_k\psi_k+\sum_{k=1}^n\beta_kv_k $$

Applying $T_2$ to both sides, we get

$$\begin{align*} 0=T_2(0)&=T_2\Big(\sum_{k=1}^m\alpha_k\psi_k+\sum_{k=1}^n\beta_kv_k\Big) \\ &=\sum_{k=1}^m\alpha_kT_2\psi_k+\sum_{k=1}^n\beta_kT_2v_k \\ &=0+\sum_{k=1}^n\beta_kT_1u_k \end{align*}$$

The first equality follows because a linear map maps $0$ to $0$, p.57. The last equality follows because $\psi_k\in\mathscr{N}(T_2)$ and because $T_2v_k=T_1u_k$.

Then the linear independence of $T_1u_1,…,T_1u_n$ implies that $0=\beta_1=\dotsb=\beta_n$. Hence $0=\sum_{k=1}^n\beta_kv_k$, so $0=\sum_{k=1}^m\alpha_k\psi_k$, and the linear independence of $\psi_1,…,\psi_m$ implies that $0=\alpha_1=\dotsb=\alpha_m$. Hence $\psi_1,…,\psi_m,v_1,…,v_n$ is indeed linearly independent.

Note that $\text{len}(\psi_1,…,\psi_m,v_1,…,v_n)=m+n=\dim{V}$. Since a linearly independent list of the right length is a basis (p.45), this list is a basis of $V$.

Proposition 3.5, p.54 gives the existence of the unique $S\in\lnmpsb(V)$ such that

$$ S\phi_k\equiv\psi_k\quad k=1,...,m\quad\quad\quad\quad Su_k\equiv v_k\quad k=1,...,n $$

Then

$$ T_1\phi_k=0=T_2\psi_k=(T_2S)\phi_k\quad k=1,...,m\quad\quad\quad\quad\quad T_1u_k=T_2v_k=(T_2S)u_k\quad k=1,...,n $$

The composition $T_2S\in\lnmpsb(V,W)$ thus agrees with $T_1$ on a basis of $V$. Explicitly, let $v=\sum_{k=1}^ma_k\phi_k+\sum_{k=1}^nb_ku_k\in V$. Then

$$\begin{align*} T_1v&=T_1\Big(\sum_{k=1}^ma_k\phi_k+\sum_{k=1}^nb_ku_k\Big) \\ &=\sum_{k=1}^ma_kT_1\phi_k+\sum_{k=1}^nb_kT_1u_k \\ &=\sum_{k=1}^ma_kT_2S\phi_k+\sum_{k=1}^nb_kT_2Su_k \\ &=T_2S\Big(\sum_{k=1}^ma_k\phi_k+\sum_{k=1}^nb_ku_k\Big) \\ &= T_2Sv \end{align*}$$

Since $S$ maps a spanning list of $V$ (the basis $\phi_1,…,\phi_m,u_1,…,u_n$) to a spanning list of $V$ (the basis $\psi_1,…,\psi_m,v_1,…,v_n$), proposition W.3.11 gives that $S$ is surjective. Then proposition 3.69, p.87 gives that $S$ is invertible.

Conversely, suppose that there exists an invertible operator $S\in\lnmpsb(V)$ such that $T_1=T_2S$. Let $\mu\in\mathscr{R}(T_1)$. Then there exists $v\in V$ such that $T_1v=\mu$. Hence

$$ \mu=T_1v=T_2Sv\in\mathscr{R}(T_2) $$

Hence $\mathscr{R}(T_1)\subset\mathscr{R}(T_2)$.

Since $S$ is invertible, then $T_2=T_2SS^{-1}=T_1S^{-1}$. Let $\gamma\in\mathscr{R}(T_2)$. Then there exists $u\in V$ such that $T_2u=\gamma$. Hence

$$ \gamma=T_2u=T_1S^{-1}u\in\mathscr{R}(T_1) $$

Hence $\mathscr{R}(T_2)\subset\mathscr{R}(T_1)$. $\blacksquare$
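
Again a numerical illustration under random-matrix assumptions: with $T_1=T_2S$ and $S$ invertible, the ranges (column spaces) agree, which we test by checking that stacking the columns of $T_1$ and $T_2$ together adds no new directions.

```python
import numpy as np

rng = np.random.default_rng(1)
T2 = rng.standard_normal((4, 6))
S = rng.standard_normal((6, 6))    # invertible with probability 1
T1 = T2 @ S                        # the hypothesis T1 = T2 S

r1 = np.linalg.matrix_rank(T1)
r2 = np.linalg.matrix_rank(T2)
r12 = np.linalg.matrix_rank(np.hstack([T1, T2]))
# Equal ranges iff rank(T1) = rank(T2) = rank([T1 T2]): prints True.
print(r1 == r2 == r12)
```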

(8) Suppose $V$ is finite-dimensional and $T\in\lnmpsb(V,W)$ is surjective. Then there exists a subspace $U$ of $V$ such that $T|_U$ is an isomorphism onto $W$.

$T|_U$ means the function $T$ restricted to $U$. In other words, $T|_U$ is the function whose domain is $U$, with $T|_U(u)=Tu$ for all $u\in U$.

Proof Let $w_1,…,w_n$ be a basis of $W$. Since $T$ is surjective, there exist $v_1,…,v_n\in V$ such that $Tv_i=w_i$. Problem (4) from Exercise 3.A gives that $v_1,…,v_n$ is linearly independent. Define $U\equiv\text{span}(v_1,…,v_n)$. Then $T|_U$ maps a basis of $U$ to a basis of $W$, and proposition W.3.17 gives that $T|_U$ is an isomorphism. $\blacksquare$
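
The construction can be carried out numerically; in this sketch a random $3\times5$ matrix plays the role of the surjective $T$ (surjective with probability 1), and the preimages $v_i$ are obtained by least squares.

```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.standard_normal((3, 5))    # a surjective map F^5 -> F^3 (generically)

# Columns of V are preimages v_1, v_2, v_3 of the standard basis of F^3.
V = np.linalg.lstsq(T, np.eye(3), rcond=None)[0]

# The matrix of T restricted to U = span(v_1, v_2, v_3), with respect to
# v_1, v_2, v_3 and the standard basis of W, is T @ V: here the identity,
# which is invertible, so the restriction is an isomorphism.
print(np.allclose(T @ V, np.eye(3)))   # True
```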

(9) Suppose $V$ is finite-dimensional and $S,T\in\lnmpsb(V)$. Then $ST$ is invertible if and only if both $S$ and $T$ are invertible.

Proof Suppose $ST$ is invertible. Then there exists $R\in\lnmpsb(V)$ such that $R(ST)=(ST)R=I$.

Let $v\in\mathscr{N}(T)\subset V$. Then

$$ v=Iv=R(ST)v=(RS)Tv=(RS)0=0 $$

Hence $\mathscr{N}(T)=\{0\}$ and $T$ is injective. Then proposition 3.69 gives that $T$ is invertible.

Let $u\in V$. Then

$$ u=Iu=(ST)Ru=S(TR)u\in\mathscr{R}(S) $$

Hence $V\subset\mathscr{R}(S)$, hence $V=\mathscr{R}(S)$, and hence $S$ is surjective. Then proposition 3.69 gives that $S$ is invertible.

Conversely, suppose $S$ and $T$ are both invertible. Then

$$ (ST)(T^{-1}S^{-1})=S(TT^{-1})S^{-1}=SS^{-1}=I $$

and

$$ (T^{-1}S^{-1})(ST)=T^{-1}(S^{-1}S)T=T^{-1}T=I $$

Hence $T^{-1}S^{-1}$ satisfies the properties required for an inverse of $ST$. Hence $ST$ is invertible and $(ST)^{-1}=T^{-1}S^{-1}$. $\blacksquare$
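
A quick numerical check of the formula $(ST)^{-1}=T^{-1}S^{-1}$ on random matrices (our stand-ins for the operators; random square matrices are invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(3)
S = rng.standard_normal((4, 4))
T = rng.standard_normal((4, 4))

lhs = np.linalg.inv(S @ T)                  # (ST)^{-1}
rhs = np.linalg.inv(T) @ np.linalg.inv(S)   # T^{-1} S^{-1}
print(np.allclose(lhs, rhs))                # True
```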

(10) Suppose $V$ is finite-dimensional and $S,T\in\lnmpsb(V)$. Then $ST=I$ if and only if $TS=I$.

Proof Suppose $ST=I$. Note that $I$ is invertible ($I^{-1}=I$ because $II=I$). Hence $ST$ is invertible. Then the previous exercise implies that both $S$ and $T$ are invertible. Multiplying both sides of $ST=I$ by $T^{-1}$ on the right, we get

$$ S=STT^{-1}=IT^{-1}=T^{-1} $$

Now multiply both sides of this equation by $T$ on the left:

$$ TS=TT^{-1}=I $$

Conversely, we can simply reverse the roles of $S$ and $T$ and walk through the same steps. $\blacksquare$
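
In matrix terms the exercise says that for square matrices a one-sided inverse is automatically two-sided. A small sketch (with a random $S$ of our choosing) that finds $T$ from $ST=I$ and then checks $TS=I$:

```python
import numpy as np

rng = np.random.default_rng(4)
S = rng.standard_normal((4, 4))        # invertible with probability 1
T = np.linalg.solve(S, np.eye(4))      # the unique T with S @ T = I
print(np.allclose(T @ S, np.eye(4)))   # True: T S = I as well
```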

(11) Suppose $V$ is finite-dimensional and $S,T,U\in\lnmpsb(V)$ and $STU=I$. Then $T$ is invertible and $T^{-1}=US$.

Proof By problem (9), we have that $TU$ and $S$ are both invertible. Again by problem (9), we have $T$ is invertible. Multiplying both sides of $STU=I$ by $S^{-1}$ on the left, we get

$$ TU=S^{-1}STU=S^{-1}(STU)=S^{-1}I=S^{-1} $$

Then multiplying both sides of this by $S$ on the right, we get

$$ TUS=S^{-1}S=I $$

And multiplying both sides of this by $T^{-1}$ on the left, we get

$$ US=T^{-1}TUS=T^{-1}I=T^{-1}\quad\blacksquare $$
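
Checking $T^{-1}=US$ numerically, with random invertible $S,T$ and $U$ defined so that the hypothesis $STU=I$ holds:

```python
import numpy as np

rng = np.random.default_rng(5)
S = rng.standard_normal((4, 4))
T = rng.standard_normal((4, 4))
U = np.linalg.inv(S @ T)                      # forces S @ T @ U = I

print(np.allclose(S @ T @ U, np.eye(4)))      # True: the hypothesis
print(np.allclose(U @ S, np.linalg.inv(T)))   # True: the conclusion
```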

(12) The result in the previous exercise can fail without the hypothesis that $V$ is finite-dimensional.

Counterexample Let $V\equiv\mathbb{C}^{\infty}$ and define

$$ T(z_1,z_2,z_3,...)\equiv(0,z_1,z_2,z_3,...)\quad\quad\text{Forward Shift}\\ S(z_1,z_2,z_3,...)\equiv(z_2,z_3,z_4,...)\quad\quad\text{Backward Shift}\\ U=I $$

Then for $z=(z_1,z_2,z_3,…)\in\mathbb{C}^\infty$, we have

$$ STUz=STU(z_1,z_2,z_3,...)=ST(z_1,z_2,z_3,...)=S(0,z_1,z_2,...)=(z_1,z_2,z_3,...)=z $$

Hence $STU=I$. But $T$ is not surjective: let $\phi\in\mathbb{C}^\infty$ with $\phi_1\neq0$. Then there exists no $\psi\in\mathbb{C}^\infty$ such that $T\psi=\phi$, since the first component of $T\psi$ is $0$ for any $\psi\in\mathbb{C}^\infty$. Hence $T$ is not invertible, even though $STU=I$. $\blacksquare$
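
The counterexample is easy to play with in code. The sketch below works on finite truncations of sequences purely for illustration (the forward shift lengthens a list, the backward shift shortens it); the infinite-dimensionality of $\mathbb{C}^\infty$ is what makes the example possible.

```python
def T(z):      # forward shift: (z1, z2, ...) -> (0, z1, z2, ...)
    return [0] + list(z)

def S(z):      # backward shift: (z1, z2, ...) -> (z2, z3, ...)
    return list(z[1:])

z = [1, 2, 3, 4]
print(S(T(z)) == z)    # True: S T = I, and U = I gives S T U = I
print(T(S(z)) == z)    # False: T S sends (1,2,3,4) to (0,2,3,4)
# T is not surjective: the first coordinate of T(z) is always 0.
```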

(13) Suppose $V$ is finite-dimensional and $R,S,T\in\lnmpsb(V)$ are such that $RST$ is surjective. Then $S$ is injective.

Proof Proposition 3.69, p.87 gives that $RST$ is invertible. Then problem (9) gives that $R$ and $ST$ are both invertible. And again, problem (9) gives that $S$ and $T$ are both invertible. Then again, proposition 3.69 gives that $S$ is injective. $\blacksquare$

(14) Suppose $v_1,…,v_n$ is a basis of $V$. Then the map $T:V\mapsto\mathbb{F}^{n,1}$ defined by $Tv\equiv\mtrxof{v}$ is an isomorphism of $V$ onto $\mathbb{F}^{n,1}$; here $\mtrxof{v}$ is the matrix of $v\in V$ with respect to the basis $v_1,…,v_n$ (as defined in 3.62, p.84).

Proof First let’s show that $T$ is linear. Let $u,w\in V$ and write $u=\sum_{k=1}^na_kv_k$ and $w=\sum_{k=1}^nb_kv_k$. Then

$$ u+w=\sum_{k=1}^n(a_k+b_k)v_k $$

Hence the definition of the matrix of a vector (3.62, p.84) gives

$$\begin{align*} T(u+w)&=\mtrxof{u+w} \\ &=\mtrxofsb\Big(\sum_{k=1}^n(a_k+b_k)v_k\Big) \\ &=\begin{bmatrix}a_1+b_1\\\vdots\\a_n+b_n\end{bmatrix} \\ &=\begin{bmatrix}a_1\\\vdots\\a_n\end{bmatrix}+\begin{bmatrix}b_1\\\vdots\\b_n\end{bmatrix}\tag{3.35, matrix addition} \\ &=\mtrxofsb\Big(\sum_{k=1}^na_kv_k\Big)+\mtrxofsb\Big(\sum_{k=1}^nb_kv_k\Big) \\ &=\mtrxof{u}+\mtrxof{w}\\ &=Tu+Tw \end{align*}$$

Hence $T$ satisfies the additive property for linearity. The proof of homogeneity is similar. Hence $T$ is linear.

If $Tu=0\in\mathbb{F}^{n,1}$, then using again the definition of the matrix of a vector (3.62, p.84), we get

$$ \begin{bmatrix}0\\\vdots\\0\end{bmatrix}=0=Tu=\mtrxof{u}=\mtrxofsb\Big(\sum_{k=1}^na_kv_k\Big)=\begin{bmatrix}a_1\\\vdots\\a_n\end{bmatrix} $$

That is, $0=a_1=\dotsb=a_n$ and $u=\sum_{k=1}^na_kv_k=0$. Hence $T$ is injective. Also

$$ \text{if}\quad c=\begin{bmatrix}c_1\\\vdots\\c_n\end{bmatrix}\in\mathbb{F}^{n,1}\quad\quad\text{then}\quad c=\begin{bmatrix}c_1\\\vdots\\c_n\end{bmatrix}=\mtrxofsb\Big(\sum_{k=1}^nc_kv_k\Big)=T\Big(\sum_{k=1}^nc_kv_k\Big)\in\mathscr{R}(T) $$

Hence $T$ is surjective. Then proposition 3.56, p.80 gives that $T$ is invertible. Hence the linear map $T$ is an isomorphism of $V$ onto $\mathbb{F}^{n,1}$. $\blacksquare$
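
Concretely, the coordinate map $Tv=\mtrxof{v}$ amounts to solving a linear system against the basis. A sketch (the basis matrix `B` is an arbitrary choice of ours) that checks additivity numerically:

```python
import numpy as np

rng = np.random.default_rng(6)
B = rng.standard_normal((3, 3))   # columns: a basis v_1, v_2, v_3 (generically)

def M(v):
    """Coordinates of v with respect to the basis, i.e. T v in the text."""
    return np.linalg.solve(B, v)

u, w = rng.standard_normal(3), rng.standard_normal(3)
print(np.allclose(M(u + w), M(u) + M(w)))   # True: T(u+w) = Tu + Tw
```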

(15) Every linear map from $\mathbb{F}^{n,1}$ to $\mathbb{F}^{m,1}$ is given by a matrix multiplication. That is, if $T\in\lnmpsb\big(\mathbb{F}^{n,1},\mathbb{F}^{m,1}\big)$, then there exists an $m\times n$ matrix $A\in\mathbb{F}^{m,n}$ such that $Tx=Ax$ for every $x\in\mathbb{F}^{n,1}$.

Proof Let $v_1,\dots,v_n$ be the standard basis of $\mathbb{F}^{n,1}$ and let $w_1,\dots,w_m$ be the standard basis of $\mathbb{F}^{m,1}$, as mentioned in the proof of 3.40, p.74. Note that any $x\in\mathbb{F}^{n,1}$ is the linear combination $x=\sum_{k=1}^nx_kv_k$ of this standard basis:

$$ x=\sum_{k=1}^nx_kv_k=x_1\begin{bmatrix}1\\0\\\vdots\\0\end{bmatrix}+x_2\begin{bmatrix}0\\1\\\vdots\\0\end{bmatrix}+\dots+x_n\begin{bmatrix}0\\0\\\vdots\\1\end{bmatrix} =\begin{bmatrix}x_1\\0\\\vdots\\0\end{bmatrix}+\begin{bmatrix}0\\x_2\\\vdots\\0\end{bmatrix}+\dots+\begin{bmatrix}0\\0\\\vdots\\x_n\end{bmatrix}=\begin{bmatrix}x_1\\x_2\\\vdots\\x_n\end{bmatrix} $$

Define $A\equiv\mtrxofsb\big(T,(v_1,\dots,v_n),(w_1,\dots,w_m)\big)$ so that the entries $A_{j,k}$ of $A$ are given by

$$ Tv_k=\sum_{j=1}^mA_{j,k}w_j=\begin{bmatrix}A_{1,k}\\0\\\vdots\\0\end{bmatrix}+\begin{bmatrix}0\\A_{2,k}\\\vdots\\0\end{bmatrix}+\dotsb+\begin{bmatrix}0\\0\\\vdots\\A_{m,k}\end{bmatrix}=\begin{bmatrix}A_{1,k}\\A_{2,k}\\\vdots\\A_{m,k}\end{bmatrix}=A_{:,k} $$

Such $A_{j,k}$’s exist because $Tv_k\in\mathbb{F}^{m,1}$ and $w_1,\dots,w_m$ is a basis for $\mathbb{F}^{m,1}$. Then for $x\in\mathbb{F}^{n,1}$, we get

$$ Tx=T\Big(\sum_{k=1}^nx_kv_k\Big)=\sum_{k=1}^nx_kTv_k=\sum_{k=1}^nx_kA_{:,k}= \begin{bmatrix}A_{1,1}&A_{1,2}&\dots&A_{1,n}\\A_{2,1}&A_{2,2}&\dots&A_{2,n}\\\vdots&\vdots&\ddots&\vdots\\A_{m,1}&A_{m,2}&\dots&A_{m,n}\end{bmatrix}\begin{bmatrix}x_1\\x_2\\\vdots\\x_n\end{bmatrix}=Ax $$

The next-to-last equality follows from proposition 3.52, p.77. We proved this proposition in exercise 3.C.9. $\blacksquare$
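
The proof is constructive: column $k$ of $A$ is $T$ applied to the $k$-th standard basis vector. A sketch with a sample linear map of our own choosing:

```python
import numpy as np

def T(x):      # a sample linear map F^3 -> F^2
    return np.array([x[0] + 2 * x[1], 3 * x[2] - x[0]])

n = 3
A = np.column_stack([T(e) for e in np.eye(n)])   # A[:, k] = T(v_k)

x = np.array([1.0, -2.0, 5.0])
print(np.allclose(T(x), A @ x))                  # True: Tx = Ax
```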

(16) Suppose $V$ is finite-dimensional and $T\in\lnmpsb(V)$. Then $T$ is a scalar multiple of the identity if and only if $ST=TS$ for every $S\in\lnmpsb(V)$.

Proof Suppose $T$ is a scalar multiple of the identity, say $T=\lambda I$ for some $\lambda\in\mathbb{F}$. Let $S\in\lnmpsb(V)$ and $v\in V$. Then

$$ (ST)v=S(Tv)=S\big((\lambda I)v\big)=S\big(\lambda(Iv)\big)=S(\lambda v)=\lambda(Sv)=\lambda I(Sv)=T(Sv)=(TS)v $$

Conversely, suppose $ST=TS$ for every $S\in\lnmpsb(V)$. We will first prove that $v,Tv$ is linearly dependent for every $v\in V$. Fix $v\in V$ and suppose that $v,Tv$ is linearly independent. Then $v,Tv$ can be extended to a basis $v,Tv,u_1,…,u_n$ of $V$. Define $S\in\lnmpsb(V)$ by

$$ S(av+bTv+c_1u_1+\dotsb+c_nu_n)\equiv bv $$

In the proof of proposition 3.5, p.54, it’s proven that $S\in\lnmpsb(V)$. Then

$$ S(Tv)=S(0v+1\cdot Tv+0u_1+\dotsb+0u_n)=1v=v $$

and

$$ Sv=S(1v+0\cdot Tv+0u_1+\dotsb+0u_n)=0v=0 $$

Since $ST=TS$, we have

$$ v=S(Tv)=(ST)v=(TS)v=T(Sv)=T(0)=0 $$

This is a contradiction because $v,Tv$ is assumed to be linearly independent. Hence $v,Tv$ is linearly dependent for every $v\in V$. This implies that for each $v\in V$, the vector $Tv$ is a scalar multiple of $v$: if $av+bTv=0$ with $b\neq0$, then $Tv=-(a/b)v$; and if $b=0$, then $a\neq0$ forces $v=0$, so $Tv=0=0v$. Hence we may write

$$ Tv=a_vv $$

We put a subscript on $a_v$ because right now we cannot assume that there exists a single $a\in\mathbb{F}$ such that $Tv=av$ for all $v\in V$. To show that $T$ is a scalar multiple of the identity, we must show that $a_v$ is independent of $v$. To do this, let $v,w\in V\setminus\{0\}$. We want to show that $a_v=a_w$.

First consider the case where $v,w$ is linearly dependent. Then there exists $b\in\mathbb{F}$ such that $w=bv$. Hence

$$ a_ww=Tw=T(bv)=bTv=b(a_vv)=(ba_v)v=(a_vb)v=a_v(bv)=a_vw $$

Hence $a_v=a_w$. Now consider the case where $v,w$ is linearly independent. We have

$$ a_{v+w}v+a_{v+w}w=a_{v+w}(v+w)=T(v+w)=Tv+Tw=a_vv+a_ww $$

Subtracting the right side from both sides, we get

$$ 0=a_{v+w}v-a_vv+a_{v+w}w-a_ww=(a_{v+w}-a_v)v+(a_{v+w}-a_w)w $$

The linear independence of $v,w$ gives

$$ a_v=a_{v+w}=a_w\quad\blacksquare $$
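
A numerical probe of both directions (the matrices are our examples): a scalar multiple of $I$ commutes with a random $S$, while a non-scalar diagonal $T$ fails to commute with a matrix unit, much as the $S$ constructed in the proof detects $Tv\notin\text{span}(v)$.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 3
S = rng.standard_normal((n, n))

T_scalar = 2.5 * np.eye(n)                       # a scalar multiple of I
print(np.allclose(S @ T_scalar, T_scalar @ S))   # True

T_other = np.diag([1.0, 2.0, 3.0])               # not a multiple of I
E = np.zeros((n, n)); E[0, 1] = 1.0              # a matrix unit
print(np.allclose(E @ T_other, T_other @ E))     # False: E detects it
```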

(18) $V$ and $\lnmpsb(\mathbb{F},V)$ are isomorphic vector spaces.

Proof For $v\in V$, define $\varphi_v:\mathbb{F}\mapsto V$ by $\varphi_v(\lambda)=\lambda v$. Then $\varphi_v\in\lnmpsb(\mathbb{F},V)$:

$$ \varphi_v(\lambda+\beta)=(\lambda+\beta)v=\lambda v+\beta v=\varphi_v(\lambda)+\varphi_v(\beta) $$

$$ \varphi_v(\psi\lambda)=(\psi\lambda)v=\psi(\lambda v)=\psi\varphi_v(\lambda) $$

Define $\varphi:V\mapsto\lnmpsb(\mathbb{F},V)$ by $\varphi(v)=\varphi_v$. Then $\varphi\in\lnmpsb\big(V,\lnmpsb(\mathbb{F},V)\big)$:

$$ \varphi(u+v)(\lambda)=\varphi_{u+v}(\lambda)=\lambda(u+v)=\lambda u+\lambda v=\varphi_u(\lambda)+\varphi_v(\lambda)=\varphi(u)(\lambda)+\varphi(v)(\lambda) $$

$$ \varphi(\beta u)(\lambda)=\varphi_{\beta u}(\lambda)=\lambda(\beta u)=(\lambda\beta)u=(\beta\lambda)u=\beta(\lambda u)=\beta\varphi_{u}(\lambda)=\beta\varphi(u)(\lambda) $$

And $\varphi$ is injective:

$$ \varphi(v)=0\implies\varphi_v=0\implies0=\varphi_v(\lambda)=\lambda v\quad\text{for all }\lambda\in\mathbb{F}\implies v=0 $$

And $\varphi$ is surjective: let $f\in\lnmpsb(\mathbb{F},V)$. Then for any $\lambda\in\mathbb{F}$, the linearity of $f$ gives

$$ f(\lambda)=f(\lambda\cdot1)=\lambda f(1)=\varphi_{f(1)}(\lambda)=\varphi\big(f(1)\big)(\lambda) $$

That is, for any $f\in\lnmpsb(\mathbb{F},V)$, there exists $f(1)\in V$ such that $\varphi\big(f(1)\big)=f$. Hence $f\in\mathscr{R}(\varphi)$.

The equivalence of invertibility with injectivity and surjectivity implies $\varphi$ is invertible. Hence there exists an isomorphism between $V$ and $\lnmpsb(\mathbb{F},V)$ and these two spaces are isomorphic. $\blacksquare$
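
The isomorphism is easy to mirror in code: $\varphi(v)$ is the map $\lambda\mapsto\lambda v$, and a map $f\in\lnmpsb(\mathbb{F},V)$ is recovered from the single vector $f(1)$. A minimal sketch:

```python
import numpy as np

def phi(v):
    """phi(v) is the linear map lambda -> lambda * v in L(F, V)."""
    return lambda lam: lam * v

v = np.array([1.0, 2.0, 3.0])
f = phi(v)
print(np.allclose(f(1.0), v))          # True: f(1) recovers v
print(np.allclose(f(2.0), 2.0 * v))    # True: f(lambda) = lambda * f(1)
```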

(20) Suppose $n$ is a positive integer and $A_{j,k}\in\mathbb{F}$ for $j,k=1,\dots,n$. Then (A) and (B) are equivalent (note that in both parts below, the number of equations equals the number of variables):

(A) The trivial solution $0=x_1=\dotsb=x_n$ is the only solution to the homogeneous system of equations

$$\begin{matrix} \sum_{k=1}^nA_{1,k}x_k=0 \\ \vdots \\ \sum_{k=1}^nA_{n,k}x_k=0 \end{matrix}$$

(B) For every $c_1,\dots,c_n\in\mathbb{F}$, there exists a solution to the system of equations

$$\begin{matrix} \sum_{k=1}^nA_{1,k}x_k=c_1 \\ \vdots \\ \sum_{k=1}^nA_{n,k}x_k=c_n \end{matrix}$$

Proof Define $T\in\lnmpsb(\mathbb{F}^n)$ by

$$ T(x_1,\dots,x_n)\equiv\Big(\sum_{k=1}^nA_{1,k}x_k,\dots,\sum_{k=1}^nA_{n,k}x_k\Big) $$

Then (A) is the assertion that $T$ is injective and (B) is the assertion that $T$ is surjective. Proposition 3.69, p.87 gives their equivalence. $\blacksquare$
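
In matrix language, both (A) and (B) say that the square matrix $A$ has full rank. A short numerical reading (with a random $A$, full rank with probability 1):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 4
A = rng.standard_normal((n, n))

print(np.linalg.matrix_rank(A) == n)   # True: (A) and (B) both hold

c = rng.standard_normal(n)
x = np.linalg.solve(A, c)              # a solution exists for every c
print(np.allclose(A @ x, c))           # True
```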