Homework 3 Solutions


Section 1.5

Problem 1

a. If $S$ is a linearly dependent set, then each vector in $S$ is a linear combination of other vectors in $S$. False: one possible counterexample is $\{v, 0\}$ with $v\neq 0$. The vectors $\{v, 0\}$ are dependent, but $v$ is not a linear combination of $0$.

b. Any set containing the zero vector is linearly dependent. True: if $0\in S$, then $1\cdot 0=0$ is a nontrivial linear combination of vectors in $S$ that equals zero.

d. Subsets of linearly dependent sets are linearly dependent. False. Same counterexample as in (a): $\{v, 0\}$ is dependent, but $\{v\}$ is independent.

e. Subsets of linearly independent sets are linearly independent. True.

f. If $a_1 x_1 + a_2 x_2 + \cdots + a_n x_n = 0$ and $x_1, x_2, \dots, x_n$ are linearly independent, then all the scalars $a_i$ are zero. True. This is the definition of linear independence.

Problem 2a

If $M=\begin{pmatrix}1 &-3 \\ -2 &4 \end{pmatrix}$, $N=\begin{pmatrix} -2 & 6 \\ 4 & -8\end{pmatrix}$, then $\{M, N\}$ is dependent, because $N=-2M$, i.e. $2M+N=0$.
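
As an optional sanity check (not part of the required solution), the relation $2M+N=0$ can be verified numerically; this is a minimal sketch assuming NumPy is available.

```python
import numpy as np

# Verify the dependence relation 2M + N = 0 claimed above.
M = np.array([[1, -3], [-2, 4]])
N = np.array([[-2, 6], [4, -8]])
print(np.allclose(2 * M + N, np.zeros((2, 2))))  # True
```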

Problem 3

Suppose

$$a\begin{pmatrix} 1 & 1 \\ 0 & 0 \\ 0 & 0 \end{pmatrix} + b\begin{pmatrix} 0 & 0 \\ 1 & 1 \\ 0 & 0 \end{pmatrix} + c\begin{pmatrix} 0 & 0 \\ 0 & 0 \\ 1 & 1 \end{pmatrix} + d\begin{pmatrix} 1 & 0 \\ 1 & 0 \\ 1 & 0 \end{pmatrix} + e\begin{pmatrix} 0 & 1 \\ 0 & 1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \\ 0 & 0 \end{pmatrix}$$

Then

$$\begin{pmatrix} a+d & a+e \\ b+d & b+e \\ c+d & c+e \end{pmatrix} = \begin{pmatrix} 0&0\\0&0\\0&0 \end{pmatrix}\implies \begin{aligned} a+d &=0 & a+e&=0 \\ b+d &=0 & b+e&=0 \\ c+d &=0 & c+e&=0 \end{aligned}$$

Solving this, you find $d=e$, $a=b=c$, and $a=-d$. One solution is $a=b=c=1$ and $d=e=-1$, i.e. we have

$$\begin{pmatrix} 1 & 1 \\ 0 & 0 \\ 0 & 0 \end{pmatrix} + \begin{pmatrix} 0 & 0 \\ 1 & 1 \\ 0 & 0 \end{pmatrix} + \begin{pmatrix} 0 & 0 \\ 0 & 0 \\ 1 & 1 \end{pmatrix} - \begin{pmatrix} 1 & 0 \\ 1 & 0 \\ 1 & 0 \end{pmatrix} - \begin{pmatrix} 0 & 1 \\ 0 & 1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \\ 0 & 0 \end{pmatrix}$$

which implies that the matrices are linearly dependent.
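
As an optional numerical check (the names A1 through A5 are just labels introduced here for the five matrices, in the order they appear above), the dependence relation with coefficients $1,1,1,-1,-1$ can be confirmed, assuming NumPy:

```python
import numpy as np

# The five matrices from Problem 3, in the order they appear above.
A1 = np.array([[1, 1], [0, 0], [0, 0]])
A2 = np.array([[0, 0], [1, 1], [0, 0]])
A3 = np.array([[0, 0], [0, 0], [1, 1]])
A4 = np.array([[1, 0], [1, 0], [1, 0]])
A5 = np.array([[0, 1], [0, 1], [0, 1]])

# Check that A1 + A2 + A3 - A4 - A5 is the zero matrix, i.e. a nontrivial
# dependence relation with coefficients (1, 1, 1, -1, -1).
print(np.allclose(A1 + A2 + A3 - A4 - A5, np.zeros((3, 2))))  # True
```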

Problem 7

The set of diagonal matrices is

$$V = \left\{\begin{pmatrix} a_1 & 0 \\ 0 & a_2 \end{pmatrix} \;\Big|\; a_1, a_2\in\mathbb{F}\right\}$$

A basis for this vector space is given by $\{M_1, M_2\}$, where

$$M_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \qquad M_2 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$$

There are many other bases.

Problem 12: write a detailed proof.

We have to prove:

Theorem 1.6. Let $V$ be a vector space, and let $S_1 \subset S_2 \subset V$. If $S_1$ is linearly dependent, then $S_2$ is linearly dependent.

Proof. Since $S_1$ is linearly dependent, there are vectors $v_1, \dots, v_k\in S_1$ and scalars $a_1, \dots, a_k\in\mathbb{F}$, at least one of which is not zero, for which $a_1v_1+\cdots+a_kv_k=0$. We are given that $S_1\subset S_2$, so each of the vectors $v_i$ also belongs to $S_2$. Therefore we have vectors $v_1, \dots, v_k\in S_2$ and the same scalars $a_1, \dots, a_k$, at least one of which is not zero, with $a_1v_1+\cdots+a_kv_k=0$. This implies that $S_2$ is dependent. $\square$

Extra problem

If $u, v, w\in V$ are given, and if $p=u+v$, $q=u+w$, $r=v+w$, then express $u$, $v$, $w$ in terms of $p$, $q$, $r$. Solution:

$$\begin{alignedat}{4} &u&+v&&=p \\ &u&&+w&=q \\ &&v&+w&=r \end{alignedat}\implies \begin{alignedat}{4} &u&+v&&&=p \\ &&-v&+w&&=-p+q \\ &&v&+w&&=r \end{alignedat}\implies \begin{alignedat}{4} &u&+v&&&=p \\ &&-v&&+w&=-p+q \\ &&&&2w&=-p+q+r \end{alignedat}\implies$$

$$\begin{alignedat}{4} &u&+v&&&=p \\ &&-v&&+w&=-p+q \\ &&&&w&=-\tfrac12p+\tfrac12q+\tfrac12r \end{alignedat}\implies \begin{alignedat}{4} &u&+v&&&=p \\ &&-v&&\;\;\;&=-\tfrac12p+\tfrac12q -\tfrac12r\\ &&&&w&=-\tfrac12p+\tfrac12q+\tfrac12r \end{alignedat}\implies \begin{alignedat}{4} &u&=\tfrac12p+\tfrac12q-\tfrac12r \\ &v&=\tfrac12p-\tfrac12q+\tfrac12r \\ &w&=-\tfrac12p+\tfrac12q+\tfrac12r \end{alignedat}$$
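
As an optional check of these formulas, one can pick arbitrary vectors $u, v, w$, form $p, q, r$, and confirm that the expressions above recover $u, v, w$; a minimal sketch assuming NumPy:

```python
import numpy as np

# Pick three arbitrary vectors (here in R^4; the dimension is irrelevant).
rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 4))

# Form p, q, r as in the problem, then apply the formulas derived above.
p, q, r = u + v, u + w, v + w
print(np.allclose((p + q - r) / 2, u))   # True
print(np.allclose((p - q + r) / 2, v))   # True
print(np.allclose((-p + q + r) / 2, w))  # True
```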

Problem 13b

We are given that $\{u,v,w\}$ is linearly independent. To see whether $\{u+v, u+w, v+w\}$ is independent, assume that for $a,b,c\in\mathbb{F}$ one has

$$a(u+v)+b(u+w)+c(v+w)=0.$$

Then

$$(a+b)u+(a+c)v+(b+c)w=0.$$

Independence of $\{u,v,w\}$ implies

$$a+b=a+c=b+c=0.$$

Subtracting the second equation from the first gives $b=c$; combined with $b+c=0$ this forces $b=c=0$, and then $a=0$. So the only linear combination of $\{u+v, u+w, v+w\}$ that adds up to zero is the trivial one. Therefore $\{u+v, u+w, v+w\}$ is linearly independent.
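
Optionally, the fact that $a+b=a+c=b+c=0$ has only the trivial solution can be double-checked by noting that the coefficient matrix of this system is invertible; a quick sketch assuming NumPy:

```python
import numpy as np

# Coefficient matrix of the system a+b = 0, a+c = 0, b+c = 0 in (a, b, c).
C = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1]])

# A nonzero determinant means the only solution is a = b = c = 0.
print(np.linalg.det(C))  # -2.0 (up to rounding), so only the trivial solution
```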

Section 1.6

Problem 1

b. Every vector space that is generated by a finite set has a basis. True.

c. Every vector space has a finite basis. False. Example: $\mathcal{P}(\mathbb{R})$ does not have a finite basis.

d. A vector space cannot have more than one basis. False. For example, $\{(1,0), (0,1)\}$ and $\{(1,1), (1,-1)\}$ are both bases of $\mathbb{R}^2$.

e. If a vector space has a finite basis, then the number of vectors in every basis is the same. True. This follows from the dimension theorem. The number of vectors in a basis is called the dimension of the vector space.

f. The dimension of $\mathcal{P}_n(\mathbb{F})$ is $n$. False. $\{1, x, x^2, \dots, x^n\}$ is a basis for $\mathcal{P}_n(\mathbb{F})$, so the dimension is $n+1$.

g. The dimension of $\mathcal{M}_{m\times n}(\mathbb{F})$ is $m + n$. False. The dimension of $\mathcal{M}_{m\times n}(\mathbb{F})$ is $mn$.

Problem 4

Do the polynomials $x^3-2x^2+1$, $4x^2-x+3$, and $3x-2$ generate $\mathcal{P}_3(\mathbb{R})$?

Answer: no. This follows from the dimension theorem: if the three polynomials $x^3-2x^2+1$, $4x^2-x+3$, and $3x-2$ spanned $\mathcal{P}_3(\mathbb{R})$, then the four vectors $\{1, x, x^2, x^3\}$ would be linear combinations of $x^3-2x^2+1$, $4x^2-x+3$, and $3x-2$. By the dimension theorem, $\{1, x, x^2, x^3\}$ would then be linearly dependent. Since $\{1, x, x^2, x^3\}$ is independent, we have a contradiction.
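
The same conclusion can be checked concretely by writing each polynomial as its coefficient vector with respect to $\{1, x, x^2, x^3\}$: three vectors span at most a 3-dimensional subspace of the 4-dimensional space $\mathcal{P}_3(\mathbb{R})$. A minimal sketch, assuming NumPy:

```python
import numpy as np

# Coefficient vectors (constant, x, x^2, x^3) of the three polynomials.
P = np.column_stack([(1, 0, -2, 1),    # x^3 - 2x^2 + 1
                     (3, -1, 4, 0),    # 4x^2 - x + 3
                     (-2, 3, 0, 0)])   # 3x - 2

# The span of the columns has dimension equal to the rank of P.
print(np.linalg.matrix_rank(P))  # 3 < 4 = dim P_3(R), so they do not generate P_3(R)
```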

Problem 5

Is $\{p=(1, 4, -6),\ q=(1, 5, 8),\ r=(2, 1, 1),\ s=(0, 1, 0)\}$ a linearly independent subset of $\mathbb{R}^3$? Answer: no. The four given vectors $\{p,q,r,s\}$ are linear combinations of the three standard basis vectors $e_1=\left(\begin{smallmatrix} 1 \\ 0 \\ 0\end{smallmatrix}\right)$, $e_2=\left(\begin{smallmatrix} 0 \\ 1 \\ 0\end{smallmatrix}\right)$, $e_3=\left(\begin{smallmatrix} 0 \\ 0 \\ 1\end{smallmatrix}\right)$. By the dimension theorem, $\{p,q,r,s\}$ is linearly dependent.
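
An optional numerical check, assuming NumPy: four vectors in $\mathbb{R}^3$ span a subspace of dimension at most 3, which a rank computation confirms.

```python
import numpy as np

# The vectors p, q, r, s as columns of a 3x4 matrix.
A = np.column_stack([(1, 4, -6), (1, 5, 8), (2, 1, 1), (0, 1, 0)])

# Rank at most 3 < 4 columns, so the columns are linearly dependent.
print(np.linalg.matrix_rank(A))  # 3
```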

Problem 7

Given:

$$u_1 = \left(\begin{smallmatrix} 2 \\ -3 \\ 1 \end{smallmatrix}\right),\quad u_2 = \left(\begin{smallmatrix} 1 \\ 4 \\ -2 \end{smallmatrix}\right),\quad u_3 = \left(\begin{smallmatrix} -8 \\ 12 \\ -4 \end{smallmatrix}\right),\quad u_4 = \left(\begin{smallmatrix} 1 \\ 37 \\ -17 \end{smallmatrix}\right),\quad u_5 = \left(\begin{smallmatrix} 3 \\ -5 \\ 8 \end{smallmatrix}\right)$$

We have to choose three vectors from this collection and verify that they are independent.
Note that $u_3=-4u_1$ and $u_4=-3u_1+7u_2$, so neither $\{u_1, u_2, u_3\}$ nor $\{u_1, u_2, u_4\}$ will work. Let's try $\{u_1, u_2, u_5\}$. Suppose $au_1+bu_2+cu_5=0$. Then $a,b,c$ satisfy

$$\begin{alignedat}{4} 2a&+b&&+3c&=0\\ -3a&+4b&&-5c&=0 \\ a&-2b&&+8c&=0 \end{alignedat}\implies \begin{alignedat}{4} &&5b&&-13c&=0\\ &&-2b&&+19c&=0 \\ a&&-2b&&+8c&=0 \end{alignedat}\implies \begin{alignedat}{4} &&&&69c&=0\\ &&-2b&&+19c&=0 \\ a&&-2b&&+8c&=0 \end{alignedat}$$

which implies $c=0$, then $b=0$, and then $a=0$. Hence $\{u_1, u_2, u_5\}$ is linearly independent, and since $\dim\mathbb{R}^3=3$, it is a basis.
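
The relations used above and the final choice can also be verified numerically; a minimal sketch assuming NumPy:

```python
import numpy as np

u1 = np.array([2, -3, 1])
u2 = np.array([1, 4, -2])
u3 = np.array([-8, 12, -4])
u4 = np.array([1, 37, -17])
u5 = np.array([3, -5, 8])

# The dependence relations that rule out u3 and u4.
print(np.allclose(u3, -4 * u1))           # True: u3 = -4 u1
print(np.allclose(u4, -3 * u1 + 7 * u2))  # True: u4 = -3 u1 + 7 u2

# A nonzero determinant shows {u1, u2, u5} is independent, hence a basis of R^3.
print(np.linalg.det(np.column_stack([u1, u2, u5])))  # 69.0 (up to rounding)
```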

Problem 9

For any given $a = (a_1, a_2, a_3, a_4)\in\mathbb{F}^4$ we are asked to find $(c_1, c_2, c_3, c_4)$ such that $c_1u_1+c_2u_2+c_3u_3+c_4u_4=a$, i.e.

$$c_1\begin{pmatrix} 1 \\ 1\\ 1\\ 1 \end{pmatrix}+ c_2\begin{pmatrix} 0 \\ 1\\ 1\\ 1 \end{pmatrix}+ c_3\begin{pmatrix} 0 \\ 0\\ 1\\ 1 \end{pmatrix}+ c_4\begin{pmatrix} 0 \\ 0\\ 0\\ 1 \end{pmatrix}= \begin{pmatrix} a_1\\a_2\\a_3\\a_4 \end{pmatrix}.$$

This leads to the following linear equations:

$$\begin{alignedat}{5} c_1 &&&&=a_1\\ c_1 &+c_2&&&=a_2\\ c_1 &+c_2&+c_3&&=a_3\\ c_1 &+c_2&+c_3&+c_4&=a_4 \end{alignedat}\implies \begin{alignedat}{2} c_1&=a_1\\ c_2&=a_2-c_1=a_2-a_1\\ c_3&=a_3-c_1-c_2=a_3-a_2\\ c_4&=a_4-c_1-c_2-c_3=a_4-a_3 \end{alignedat}\implies \begin{alignedat}{2} c_1&=a_1\\ c_2&=a_2-a_1\\ c_3&=a_3-a_2\\ c_4&=a_4-a_3 \end{alignedat}$$
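
As an optional check, assuming NumPy, one can solve the triangular system for an arbitrary right-hand side $a$ and compare with the closed form found above:

```python
import numpy as np

# Columns are u1, u2, u3, u4 from the problem.
U = np.column_stack([(1, 1, 1, 1), (0, 1, 1, 1), (0, 0, 1, 1), (0, 0, 0, 1)])

# Solve U c = a for a random a and compare with c = (a1, a2-a1, a3-a2, a4-a3).
a = np.random.default_rng(1).standard_normal(4)
c = np.linalg.solve(U, a)
print(np.allclose(c, [a[0], a[1] - a[0], a[2] - a[1], a[3] - a[2]]))  # True
```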

Problem 12

Since $\{u,v,w\}$ is a basis for $V$, we know that $V$ is three-dimensional. To show that $\{u+v+w, v+w, w\}$ is a basis, it therefore suffices to show that $\{u+v+w, v+w, w\}$ is independent.

Suppose

$$a(u+v+w)+b(v+w)+cw=0.$$

Then

$$au+(a+b)v+(a+b+c)w=0.$$

Since $\{u,v,w\}$ is independent, it follows that $a=a+b=a+b+c=0$. This implies $a=b=c=0$. Therefore $\{u+v+w, v+w, w\}$ is indeed linearly independent.
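
Equivalently, the coordinate vectors of $u+v+w$, $v+w$, $w$ with respect to the basis $\{u,v,w\}$ form a triangular matrix with determinant 1; a quick check, assuming NumPy:

```python
import numpy as np

# Columns: coordinates of u+v+w, v+w, w in the basis {u, v, w}.
T = np.array([[1, 0, 0],
              [1, 1, 0],
              [1, 1, 1]])

# Nonzero determinant: the three vectors are independent, hence a basis of V.
print(np.linalg.det(T))  # 1.0
```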

Problem 13

The problem states that the set of all $(x_1, x_2, x_3)\in\mathbb{R}^3$ that satisfy

$$\begin{alignedat}{4} x_1&-2x_2&+x_3 &= 0 \\ 2x_1&-3x_2&+x_3 &= 0 \end{alignedat}$$

is a linear subspace of $\mathbb{R}^3$. Solve the equations for $x_1, x_2$ by row reduction:

$$\begin{alignedat}{4} x_1&-2x_2&+x_3 &= 0 \\ 2x_1&-3x_2&+x_3 &= 0 \end{alignedat}\implies \begin{alignedat}{4} &x_1&-2x_2&+x_3 &= 0 \\ & & x_2&-x_3 &= 0 \end{alignedat}\implies \begin{alignedat}{4} &x_1& \qquad &-x_3 &= 0 \\ & & x_2&-x_3 &= 0 \end{alignedat}$$

We see that the solution set consists of all $\left(\begin{smallmatrix} x_1 \\ x_2 \\ x_3 \end{smallmatrix}\right)\in\mathbb{R}^3$ that satisfy $x_1=x_2=x_3$. Hence the solution set is

$$\Bigl\{x_3\left(\begin{smallmatrix} 1 \\ 1 \\ 1 \end{smallmatrix}\right) \;\Big|\; x_3\in\mathbb{R}\Bigr\}.$$

Therefore $\Bigl\{\left(\begin{smallmatrix} 1 \\ 1 \\ 1 \end{smallmatrix}\right)\Bigr\}$ is a basis for the solution space.
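
An optional check, assuming NumPy: the vector $(1,1,1)$ solves both equations, and the coefficient matrix has rank 2, so the solution space is indeed one-dimensional.

```python
import numpy as np

# Coefficient matrix of the two homogeneous equations.
A = np.array([[1, -2, 1],
              [2, -3, 1]])

print(A @ np.array([1, 1, 1]))   # [0 0]: (1, 1, 1) is a solution
print(np.linalg.matrix_rank(A))  # 2, so the solution space has dimension 3 - 2 = 1
```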

Problem 14

The vector space $W_1$ consists of all $a\in\mathbb{F}^5$ that satisfy $a_1=a_3+a_4$. $W_1$ is a linear subspace of $\mathbb{F}^5$, so $\dim W_1\leq 5$. We also have $W_1\neq \mathbb{F}^5$, because $e_1=(1, 0, 0, 0, 0)\in\mathbb{F}^5$ but $e_1\not\in W_1$. Therefore $\dim W_1\leq 4$. The vectors

$$\begin{pmatrix} 0\\1\\0\\0\\0 \end{pmatrix},\quad \begin{pmatrix} 1\\0\\1\\0\\0 \end{pmatrix},\quad \begin{pmatrix} 1\\0\\0\\1\\0 \end{pmatrix},\quad \begin{pmatrix} 0\\0\\0\\0\\1 \end{pmatrix}$$

all belong to $W_1$ and are linearly independent. Since $\dim W_1\leq 4$, these four vectors form a basis for $W_1$, and $\dim W_1=4$.
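
These claims about $W_1$ can also be checked numerically; a minimal sketch assuming NumPy:

```python
import numpy as np

# The four candidate basis vectors as columns of a 5x4 matrix.
B = np.column_stack([(0, 1, 0, 0, 0), (1, 0, 1, 0, 0), (1, 0, 0, 1, 0), (0, 0, 0, 0, 1)])

# Each column satisfies a1 = a3 + a4, so it lies in W1.
print(all(v[0] == v[2] + v[3] for v in B.T))  # True

# Rank 4 means the four vectors are independent, hence a basis and dim W1 = 4.
print(np.linalg.matrix_rank(B))  # 4
```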

The space $W_2$ consists of all $a\in\mathbb{F}^5$ with $a_2=a_3=a_4$ and $a_1=-a_5$.
It follows that each $a\in W_2$ is given by

$$a = \begin{pmatrix} a_1 \\ a_2 \\ a_2 \\ a_2 \\ -a_1 \end{pmatrix} = a_1 \begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \\ -1 \end{pmatrix} + a_2 \begin{pmatrix} 0 \\ 1 \\ 1 \\ 1 \\ 0 \end{pmatrix}$$

The vectors $\left(\begin{smallmatrix} 1 \\ 0 \\ 0 \\ 0 \\ -1 \end{smallmatrix}\right)$ and $\left(\begin{smallmatrix} 0 \\ 1 \\ 1 \\ 1 \\ 0 \end{smallmatrix}\right)$ are independent, and since the computation above shows that they span $W_2$, they form a basis for $W_2$. It follows that $W_2$ is two-dimensional.
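
Finally, an optional check for $W_2$, assuming NumPy: both vectors satisfy the defining conditions and are independent.

```python
import numpy as np

b1 = np.array([1, 0, 0, 0, -1])
b2 = np.array([0, 1, 1, 1, 0])

def in_W2(a):
    # Membership test for W2: a2 = a3 = a4 and a1 = -a5.
    return a[1] == a[2] and a[2] == a[3] and a[0] == -a[4]

print(in_W2(b1), in_W2(b2))                              # True True
print(np.linalg.matrix_rank(np.column_stack([b1, b2])))  # 2, so dim W2 = 2
```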