Definition and Examples of Linear Independence/Solutions

Problem 1
Decide whether each subset of  \mathbb{R}^3  is linearly dependent or linearly independent.
  1.  \{\begin{pmatrix} 1 \\ -3 \\ 5 \end{pmatrix},
\begin{pmatrix} 2 \\ 2 \\ 4 \end{pmatrix},
\begin{pmatrix} 4 \\ -4 \\ 14 \end{pmatrix} \}
  2.  \{\begin{pmatrix} 1 \\ 7 \\ 7 \end{pmatrix},
\begin{pmatrix} 2 \\ 7 \\ 7 \end{pmatrix},
\begin{pmatrix} 3 \\ 7 \\ 7 \end{pmatrix} \}
  3.  \{\begin{pmatrix} 0 \\ 0 \\ -1 \end{pmatrix},
\begin{pmatrix} 1 \\ 0 \\ 4 \end{pmatrix} \}
  4.  \{\begin{pmatrix} 9 \\ 9 \\ 0 \end{pmatrix},
\begin{pmatrix} 2 \\ 0 \\ 1 \end{pmatrix},
\begin{pmatrix} 3 \\ 5 \\ -4 \end{pmatrix},
\begin{pmatrix} 12 \\ 12 \\ -1 \end{pmatrix} \}
Answer
For each of these, when the subset is independent it must be proved, and when the subset is dependent an example of a dependence must be given.
  1. It is dependent. Considering
    
c_1\begin{pmatrix} 1 \\ -3 \\ 5 \end{pmatrix}
+c_2\begin{pmatrix} 2 \\ 2 \\ 4 \end{pmatrix}
+c_3\begin{pmatrix} 4 \\ -4 \\ 14 \end{pmatrix}
=\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}
    gives rise to this linear system.
    
\begin{array}{*{3}{rc}r}
c_1 &+ &2c_2 &+ &4c_3 &= &0 \\
-3c_1&+ &2c_2 &- &4c_3 &= &0 \\
5c_1 &+ &4c_2 &+ &14c_3 &= &0
\end{array}
    Gauss' method
    
\left(\begin{array}{*{3}{c}|c}
1 &2 &4 &0 \\
-3 &2 &-4 &0 \\
5 &4 &14 &0
\end{array}\right)
\xrightarrow[-5\rho_1+\rho_3]{3\rho_1+\rho_2}
\;\xrightarrow[]{(3/4)\rho_2+\rho_3}
\left(\begin{array}{*{3}{c}|c}
1 &2 &4 &0 \\
0 &8 &8 &0 \\
0 &0 &0 &0
\end{array}\right)
    yields a free variable, so there are infinitely many solutions. For an example of a particular dependence we can set c_3 to be, say, 1. Then we get  c_2=-1  and  c_1=-2 .
  2. It is dependent. The linear system that arises here
    
\left(\begin{array}{*{3}{c}|c}
1 &2 &3 &0 \\
7 &7 &7 &0 \\
7 &7 &7 &0
\end{array}\right)
\;\xrightarrow[-7\rho_1+\rho_3]{-7\rho_1+\rho_2}
\;\xrightarrow[]{-\rho_2+\rho_3}\;
\left(\begin{array}{*{3}{c}|c}
1 &2 &3  &0 \\
0 &-7 &-14 &0 \\
0 &0 &0  &0
\end{array}\right)
    has infinitely many solutions. We can get a particular solution by taking c_3 to be, say, 1, and back-substituting to get the resulting c_2 and c_1.
  3. It is linearly independent. The system
    
\left(\begin{array}{*{2}{c}|c}
0 &1 &0 \\
0 &0 &0 \\
-1 &4 &0
\end{array}\right)
\;\xrightarrow[]{\rho_1\leftrightarrow\rho_2}
\;\xrightarrow[]{\rho_3\leftrightarrow\rho_1}\;
\left(\begin{array}{*{2}{c}|c}
-1 &4 &0 \\
0 &1 &0 \\
0 &0 &0
\end{array}\right)
    has only the solution c_1=0 and c_2=0. (We could also have gotten the answer by inspection— the second vector is obviously not a multiple of the first, and vice versa.)
  4. It is linearly dependent. The linear system
    
\left(\begin{array}{*{4}{c}|c}
9 &2 &3 &12 &0 \\
9 &0 &5 &12 &0 \\
0 &1 &-4 &-1 &0
\end{array}\right)
    has more unknowns than equations, and so Gauss' method must end with at least one variable free (there can't be a contradictory equation because the system is homogeneous, and so has at least the solution of all zeroes). To exhibit a combination, we can do the reduction
    
\xrightarrow[]{-\rho_1+\rho_2}
\;\xrightarrow[]{(1/2)\rho_2+\rho_3}\;
\left(\begin{array}{*{4}{c}|c}
9 &2 &3 &12 &0 \\
0 &-2 &2 &0  &0 \\
0 &0 &-3 &-1 &0
\end{array}\right)
    and take, say, c_4=1. Then we have that c_3=-1/3, c_2=-1/3, and c_1=-31/27.
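The dependences found above are easy to double-check by computing the combinations directly. The following Python sketch is purely a sanity check, not part of the proofs; it uses exact rational arithmetic from the standard library to verify parts 1 and 4.

```python
from fractions import Fraction as F

def combo(coeffs, vectors):
    """The linear combination sum_i c_i * v_i, computed componentwise."""
    return tuple(sum(c * v[i] for c, v in zip(coeffs, vectors))
                 for i in range(len(vectors[0])))

# Part 1: the dependence c1=-2, c2=-1, c3=1.
v1 = [(1, -3, 5), (2, 2, 4), (4, -4, 14)]
print(combo([-2, -1, 1], v1) == (0, 0, 0))  # True

# Part 4: taking c4=1 gives c3=-1/3, c2=-1/3, c1=-31/27.
v4 = [(9, 9, 0), (2, 0, 1), (3, 5, -4), (12, 12, -1)]
c4 = [F(-31, 27), F(-1, 3), F(-1, 3), F(1)]
print(combo(c4, v4) == (0, 0, 0))  # True
```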
Problem 2
Which of these subsets of  \mathcal{P}_3  are linearly dependent and which are independent?
  1.  \{3-x+9x^2,5-6x+3x^2,1+1x-5x^2\}
  2.  \{-x^2,1+4x^2\}
  3.  \{2+x+7x^2,3-x+2x^2,4-3x^2\}
  4.  \{8+3x+3x^2,x+2x^2,2+2x+2x^2,8-2x+5x^2\}
Answer
In the cases of independence, that must be proved. Otherwise, a specific dependence must be produced. (Of course, dependences other than the ones exhibited here are possible.)
  1. This set is independent. Setting up the relation  c_1(3-x+9x^2)+c_2(5-6x+3x^2)+c_3(1+1x-5x^2)=0+0x+0x^2 gives a linear system
    
\left(\begin{array}{*{3}{c}|c}
3 &5 &1 &0 \\
-1 &-6 &1 &0 \\
9 &3 &-5 &0
\end{array}\right)
\;\xrightarrow[-3\rho_1+\rho_3]{(1/3)\rho_1+\rho_2}
\;\xrightarrow[]{3\rho_2}
\;\xrightarrow[]{-(12/13)\rho_2+\rho_3}\;
\left(\begin{array}{*{3}{c}|c}
3 &5  &1    &0 \\
0 &-13 &4    &0 \\
0 &0  &-152/13 &0
\end{array}\right)
    with only one solution: c_1=0, c_2=0, and c_3=0.
  2. This set is independent. We can see this by inspection, straight from the definition of linear independence. Obviously neither is a multiple of the other.
  3. This set is linearly independent. The linear system reduces in this way
    
\left(\begin{array}{*{3}{c}|c}
2 &3 &4 &0 \\
1 &-1 &0 &0 \\
7 &2 &-3 &0
\end{array}\right)
\;\xrightarrow[-(7/2)\rho_1+\rho_3]{-(1/2)\rho_1+\rho_2}
\;\xrightarrow[]{-(17/5)\rho_2+\rho_3}\;
\left(\begin{array}{*{3}{c}|c}
2 &3  &4   &0 \\
0 &-5/2 &-2   &0 \\
0 &0  &-51/5 &0
\end{array}\right)
    to show that there is only the solution c_1=0, c_2=0, and c_3=0.
  4. This set is linearly dependent. The linear system
    
\left(\begin{array}{*{4}{c}|c}
8 &0 &2 &8 &0 \\
3 &1 &2 &-2 &0 \\
3 &2 &2 &5 &0
\end{array}\right)
    must, after reduction, end with at least one variable free (there are more variables than equations, and there is no possibility of a contradictory equation because the system is homogeneous). We can take the free variables as parameters to describe the solution set. We can then set the parameter to a nonzero value to get a nontrivial linear relation.
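Deciding independence in each part amounts to computing the rank of the matrix whose columns are the coefficient vectors of the polynomials: the set is independent exactly when the rank equals the number of columns. The following Python sketch of exact Gaussian reduction is an illustrative check, not part of the formal solutions, applied to parts 1 and 4.

```python
from fractions import Fraction as F

def rank(rows):
    """Rank via exact Gauss reduction (Fractions avoid roundoff)."""
    m = [[F(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][col] / m[r][col]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Rows index the coefficients (constant, x, x^2); columns are the polynomials.
part1 = [[3, 5, 1], [-1, -6, 1], [9, 3, -5]]
part4 = [[8, 0, 2, 8], [3, 1, 2, -2], [3, 2, 2, 5]]
print(rank(part1))  # 3: three columns, full rank, so independent
print(rank(part4))  # 3: four columns but rank 3, so dependent
```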

Problem 3
Prove that each set  \{f,g\}  is linearly independent in the vector space of all functions from  \mathbb{R}^+  to  \mathbb{R} .
  1.  f(x)=x  and  g(x)=1/x
  2.  f(x)=\cos(x)  and  g(x)=\sin(x)
  3.  f(x)=e^x  and  g(x)=\ln(x)
Answer
Let Z be the zero function Z(x)=0, which is the additive identity in the vector space under discussion.
  1. This set is linearly independent. Consider  c_1\cdot f(x)+c_2\cdot g(x)=Z(x) . Plugging in  x=1  and  x=2  gives a linear system
    
\begin{array}{*{2}{rc}r}
c_1\cdot 1 &+ &c_2\cdot 1   &= &0 \\
c_1\cdot 2 &+ &c_2\cdot (1/2) &= &0
\end{array}
    with the unique solution  c_1=0  and  c_2=0 .
  2. This set is linearly independent. Consider  c_1\cdot f(x)+c_2\cdot g(x)=Z(x)  and plug in  x=0  and  x=\pi/2  to get
    
\begin{array}{*{2}{rc}r}
c_1\cdot 1 &+ &c_2\cdot 0   &= &0 \\
c_1\cdot 0 &+ &c_2\cdot 1   &= &0
\end{array}
    which obviously gives that  c_1=0  and  c_2=0 .
  3. This set is also linearly independent. Considering  c_1\cdot f(x)+c_2\cdot g(x)=Z(x)  and plugging in  x=1 and  x=e
    
\begin{array}{*{2}{rc}r}
c_1\cdot e  &+ &c_2\cdot 0   &= &0 \\
c_1\cdot e^e &+ &c_2\cdot 1   &= &0
\end{array}
    gives that  c_1=0  and  c_2=0 .
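The sample-point technique used in all three parts reduces each case to a 2x2 system; when the determinant of the matrix of sampled values is nonzero, the only solution is c_1=0, c_2=0. A short Python check (illustrative only, using the same sample points as the solutions above):

```python
import math

def sample_det(f, g, x1, x2):
    """Determinant of the 2x2 system from plugging two sample points
    into c1*f(x) + c2*g(x) = 0; nonzero forces c1 = c2 = 0."""
    return f(x1) * g(x2) - f(x2) * g(x1)

pairs = [
    (lambda x: x, lambda x: 1 / x, 1.0, 2.0),   # part 1, at x=1 and x=2
    (math.cos,    math.sin,        0.0, math.pi / 2),  # part 2
    (math.exp,    math.log,        1.0, math.e),       # part 3
]
for f, g, x1, x2 in pairs:
    print(sample_det(f, g, x1, x2) != 0)  # True each time
```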

Problem 4
Which of these subsets of the space of real-valued functions of one real variable is linearly dependent and which is linearly independent? (Note that we have abbreviated some constant functions; e.g., in the first item, the "2" stands for the constant function f(x)=2.)
  1.  \{2,4\sin^2(x),\cos^2(x)\}
  2.  \{1,\sin(x),\sin(2x)\}
  3.  \{x,\cos(x)\}
  4.  \{(1+x)^2,x^2+2x,3\}
  5.  \{\cos(2x),\sin^2(x),\cos^2(x)\}
  6.  \{0,x,x^2\}
Answer
In each case, that the set is independent must be proved, and that it is dependent must be shown by exhibiting a specific dependence.
  1. This set is dependent. The familiar relation \sin^2(x)+\cos^2(x)=1 shows that 2=c_1\cdot(4\sin^2(x))+c_2\cdot(\cos^2(x)) is satisfied by c_1=1/2 and c_2=2.
  2. This set is independent. Consider the relationship c_1\cdot 1+c_2\cdot\sin(x)+c_3\cdot\sin(2x)=0 (that "0" is the zero function). Taking x=0, x=\pi/2, and x=\pi/4 gives this system.
    
\begin{array}{*{3}{rc}r}
c_1 &   &                &   &      &=  &0  \\
c_1  &+  &c_2             &   &      &=  &0  \\
c_1  &+  &(\sqrt{2}/2)c_2 &+  &c_3   &=  &0
\end{array}
    whose only solution is c_1=0, c_2=0, and c_3=0.
  3. By inspection, this set is independent. Any dependence \cos(x)=c\cdot x is not possible since the cosine function is not a multiple of the identity function (we are applying Corollary 1.17).
  4. By inspection, we spot that there is a dependence. Because (1+x)^2=x^2+2x+1, we get that c_1\cdot(1+x)^2+c_2\cdot(x^2+2x)=3 is satisfied by c_1=3 and c_2=-3.
  5. This set is dependent. The easiest way to see that is to recall the trigonometric relationship \cos^2(x)-\sin^2(x)=\cos(2x). (Remark. A person who doesn't recall this, and tries some x's, simply never gets a system leading to a unique solution, and never gets to conclude that the set is independent. Of course, this person might wonder if they simply never tried the right set of x's, but a few tries will lead most people to look instead for a dependence.)
  6. This set is dependent, because it contains the zero object in the vector space, the zero polynomial.
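The dependences in parts 1 and 5 rest on trigonometric identities, which can be spot-checked numerically. A brief Python sketch follows; it is an informal check rather than a proof, since sampling finitely many points can only support an identity, never establish one.

```python
import math

def part1_holds(x):
    # Part 1: 2 = (1/2)*(4 sin^2 x) + 2*(cos^2 x)
    return abs(0.5 * (4 * math.sin(x) ** 2) + 2 * math.cos(x) ** 2 - 2) < 1e-12

def part5_holds(x):
    # Part 5: cos(2x) = cos^2 x - sin^2 x
    return abs(math.cos(x) ** 2 - math.sin(x) ** 2 - math.cos(2 * x)) < 1e-12

print(all(part1_holds(x) and part5_holds(x) for x in (0.1, 0.7, 1.3)))  # True
```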
Problem 5
Does the equation  \sin^2(x)/\cos^2(x)=\tan^2(x)  show that this set of functions  \{\sin^2(x),\cos^2(x),\tan^2(x)\}  is a linearly dependent subset of the set of all real-valued functions with domain the interval  (-\pi/2..\pi/2)  of real numbers between  -\pi/2  and  \pi/2 ?
Answer
No, that equation is not a linear relationship. In fact this set is independent, as the system arising from taking  x  to be  0 ,  \pi/6 , and  \pi/4  shows.
Problem 6
Why does Lemma 1.4 say "distinct"?
Answer
To emphasize that the equation  1\cdot\vec{s}+(-1)\cdot\vec{s}=\vec{0} does not make the set dependent.

Problem 7
Show that the nonzero rows of an echelon form matrix form a linearly independent set.
Answer
We have already shown this: the Linear Combination Lemma and its corollary state that in an echelon form matrix, no nonzero row is a linear combination of the others.
Problem 8
  1. Show that if the set  \{\vec{u},\vec{v},\vec{w}\}  is linearly independent then so is the set  \{\vec{u},\vec{u}+\vec{v},\vec{u}+\vec{v}+\vec{w}\} .
  2. What is the relationship between the linear independence or dependence of the set  \{\vec{u},\vec{v},\vec{w}\}  and the independence or dependence of  \{\vec{u}-\vec{v},\vec{v}-\vec{w},\vec{w}-\vec{u}\} ?
Answer
  1. Assume that the set  \{\vec{u},\vec{v},\vec{w}\}  is linearly independent, so that any relationship d_0\vec{u}+d_1\vec{v}+d_2\vec{w}=\vec{0} leads to the conclusion that d_0=0, d_1=0, and d_2=0. Consider the relationship  c_1(\vec{u})+c_2(\vec{u}+\vec{v})+c_3(\vec{u}+\vec{v}+\vec{w}) =\vec{0} . Rewrite it to get  (c_1+c_2+c_3)\vec{u}+(c_2+c_3)\vec{v}+(c_3)\vec{w}=\vec{0} . Taking d_0 to be c_1+c_2+c_3, taking d_1 to be c_2+c_3, and taking d_2 to be c_3 we have this system.
    
\begin{array}{*{3}{rc}r}
c_1  &+  &c_2  &+  &c_3  &=  &0  \\
&   &c_2  &+  &c_3  &=  &0  \\
&   &     &   &c_3  &=  &0
\end{array}
    Conclusion: the c's are all zero, and so the set is linearly independent.
  2. The second set is dependent
    
1\cdot(\vec{u}-\vec{v})
+1\cdot(\vec{v}-\vec{w})
+1\cdot(\vec{w}-\vec{u})
=\vec{0}
    whether or not the first set is independent.
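The telescoping dependence in part 2 holds for any choice of vectors, which a quick Python check on random sample vectors illustrates (illustrative only; the algebraic cancellation above is the actual argument).

```python
import random

random.seed(0)
u, v, w = ([random.randint(-9, 9) for _ in range(3)] for _ in range(3))

def diff(a, b):
    return [ai - bi for ai, bi in zip(a, b)]

# (u-v) + (v-w) + (w-u) telescopes to the zero vector.
total = [p + q + r for p, q, r in zip(diff(u, v), diff(v, w), diff(w, u))]
print(total)  # [0, 0, 0]
```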
Problem 9
Example 1.10 shows that the empty set is linearly independent.
  1. When is a one-element set linearly independent?
  2. How about a set with two elements?
Answer
  1. A singleton set \{\vec{v}\} is linearly independent if and only if \vec{v}\neq\vec{0}. For the "if" direction, with \vec{v}\neq\vec{0}, we can apply Lemma 1.4 by considering the relationship  c\cdot\vec{v}=\vec{0}  and noting that the only solution is the trivial one: c=0. For the "only if" direction, just recall that Example 1.11 shows that \{\vec{0}\} is linearly dependent, and so if the set \{\vec{v}\} is linearly independent then \vec{v}\neq\vec{0}. (Remark. Another answer is to say that this is the special case of Lemma 1.16 where  S=\varnothing .)
  2. A set with two elements is linearly independent if and only if neither member is a multiple of the other (note that if one is the zero vector then it is a multiple of the other, so this case is covered). Equivalently: a two-element set is linearly dependent if and only if one element is a multiple of the other. The proof is easy. A set \{\vec{v}_1,\vec{v}_2\} is linearly dependent if and only if there is a relationship c_1\vec{v}_1+c_2\vec{v}_2=\vec{0} with either c_1\neq 0 or c_2\neq 0 (or both). That holds if and only if \vec{v}_1=(-c_2/c_1)\vec{v}_2 or \vec{v}_2=(-c_1/c_2)\vec{v}_1 (or both).
Problem 10
In any vector space  V , the empty set is linearly independent. What about all of  V ?
Answer
This set is linearly dependent because it contains the zero vector.
Problem 11
Show that if  \{\vec{x},\vec{y},\vec{z}\}  is linearly independent then so are all of its proper subsets:  \{\vec{x},\vec{y}\} ,  \{\vec{x},\vec{z}\} ,  \{\vec{y},\vec{z}\} ,  \{\vec{x}\} ,  \{\vec{y}\} ,  \{\vec{z}\} , and  \{\} . Is that "only if" also?
Answer
The "if" half is given by Lemma 1.14. The converse (the "only if" statement) does not hold. An example is to consider the vector space  \mathbb{R}^2  and these vectors.

\vec{x}=\begin{pmatrix} 1 \\ 0 \end{pmatrix},\quad
\vec{y}=\begin{pmatrix} 0 \\ 1 \end{pmatrix},\quad
\vec{z}=\begin{pmatrix} 1 \\ 1 \end{pmatrix}
Each proper subset is linearly independent, but the whole set is linearly dependent because \vec{z}=\vec{x}+\vec{y}.
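This counterexample can be verified mechanically: two vectors in \mathbb{R}^2 are independent exactly when their 2x2 determinant is nonzero, while the whole set satisfies \vec{z}=\vec{x}+\vec{y}. A short Python check (illustrative only):

```python
from itertools import combinations

x, y, z = (1, 0), (0, 1), (1, 1)

def det(u, v):
    """2x2 determinant; nonzero means the pair is linearly independent."""
    return u[0] * v[1] - u[1] * v[0]

# Every two-element subset is independent ...
print([det(u, v) != 0 for u, v in combinations((x, y, z), 2)])  # [True, True, True]
# ... yet the whole set is dependent: z = x + y.
print(tuple(a + b for a, b in zip(x, y)) == z)  # True
```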
Problem 12
  1. Show that this
    
S=\{\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix},\begin{pmatrix} -1 \\ 2 \\ 0 \end{pmatrix}\}
    is a linearly independent subset of  \mathbb{R}^3 .
  2. Show that
    
\begin{pmatrix} 3 \\ 2 \\ 0 \end{pmatrix}
    is in the span of S by finding  c_1  and  c_2  giving a linear relationship.
    
c_1\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}
+c_2\begin{pmatrix} -1 \\ 2 \\ 0 \end{pmatrix}
=\begin{pmatrix} 3 \\ 2 \\ 0 \end{pmatrix}
    Show that the pair  c_1,c_2  is unique.
  3. Assume that  S  is a subset of a vector space and that  \vec{v} is in  [S] , so that  \vec{v}  is a linear combination of vectors from  S . Prove that if  S  is linearly independent then a linear combination of vectors from  S  adding to  \vec{v}  is unique (that is, unique up to reordering and adding or taking away terms of the form  0\cdot\vec{s} ). Thus  S  as a spanning set is minimal in this strong sense: each vector in  [S]  is "hit" a minimum number of times— only once.
  4. Prove that it can happen when  S  is not linearly independent that distinct linear combinations sum to the same vector.
Answer
  1. The linear system arising from
    
c_1\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}
+c_2\begin{pmatrix} -1 \\ 2 \\ 0 \end{pmatrix}
=\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}
    has the unique solution  c_1=0  and  c_2=0 .
  2. The linear system arising from
    
c_1\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}
+c_2\begin{pmatrix} -1 \\ 2 \\ 0 \end{pmatrix}
=\begin{pmatrix} 3 \\ 2 \\ 0 \end{pmatrix}
    has the unique solution  c_1=8/3  and  c_2=-1/3 .
  3. Suppose that  S  is linearly independent. Suppose that we have both \vec{v}=c_1\vec{s}_1+\dots+c_n\vec{s}_n and \vec{v}=d_1\vec{t}_1+\dots+d_m\vec{t}_m (where the vectors are members of S). Now,
    
c_1\vec{s}_1+\dots+c_n\vec{s}_n
=\vec{v}
=d_1\vec{t}_1+\dots+d_m\vec{t}_m
    can be rewritten in this way.
    
c_1\vec{s}_1+\dots+c_n\vec{s}_n
-d_1\vec{t}_1-\dots-d_m\vec{t}_m
=\vec{0}
    Possibly some of the \vec{s}\,'s equal some of the \vec{t}\,'s; we can combine the associated coefficients (i.e., if \vec{s}_i=\vec{t}_j then \cdots+c_i\vec{s}_i+\dots-d_j\vec{t}_j-\cdots can be rewritten as \cdots+(c_i-d_j)\vec{s}_i+\cdots). That equation is a linear relationship among distinct (after the combining is done) members of the set S. We've assumed that S is linearly independent, so all of the coefficients are zero. If i is such that \vec{s}_i does not equal any \vec{t}_j then c_i is zero. If j is such that \vec{t}_j does not equal any \vec{s}_i then d_j is zero. In the final case, we have that c_i-d_j=0 and so c_i=d_j. Therefore, the original two sums are the same, except perhaps for some 0\cdot\vec{s}_i or 0\cdot\vec{t}_j terms that we can neglect.
  4. This set is not linearly independent:
    
S=\{\begin{pmatrix} 1 \\ 0 \end{pmatrix},\begin{pmatrix} 2 \\ 0 \end{pmatrix}\}\subset\mathbb{R}^2
    and these two linear combinations give the same result
    
\begin{pmatrix} 0 \\ 0 \end{pmatrix}=2\cdot\begin{pmatrix} 1 \\ 0 \end{pmatrix}-1\cdot\begin{pmatrix} 2 \\ 0 \end{pmatrix} =4\cdot\begin{pmatrix} 1 \\ 0 \end{pmatrix}-2\cdot\begin{pmatrix} 2 \\ 0 \end{pmatrix}
    Thus, a linearly dependent set might have two distinct linear combinations that sum to the same vector. In fact, this stronger statement holds: if a set is linearly dependent then it must have two distinct linear combinations summing to the same vector. Briefly, where  c_1\vec{s}_1+\dots+c_n\vec{s}_n=\vec{0}  is a nontrivial relationship, multiplying both sides by two gives a second relationship with the same sum; if the first relationship is nontrivial then the second is also, and the two are distinct.
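The two combinations exhibited above can be checked directly; here is a brief Python sketch (illustrative only).

```python
def combo(coeffs, vectors):
    """The linear combination sum_i c_i * v_i, computed componentwise."""
    return tuple(sum(c * v[i] for c, v in zip(coeffs, vectors))
                 for i in range(len(vectors[0])))

S = [(1, 0), (2, 0)]
print(combo([2, -1], S))   # (0, 0)
print(combo([4, -2], S))   # (0, 0)
print(combo([2, -1], S) == combo([4, -2], S))  # True: distinct combinations, same sum
```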
Problem 13
Prove that a polynomial gives rise to the zero function if and only if it is the zero polynomial. (Comment. This question is not a Linear Algebra matter, but we often use the result. A polynomial gives rise to a function in the obvious way: x\mapsto c_nx^n+\dots+c_1x+c_0.)
Answer
In this "if and only if" statement, the "if" half is clear: if the polynomial is the zero polynomial then the function that arises from the action of the polynomial must be the zero function x\mapsto 0. For "only if" we write p(x)=c_nx^n+\dots+c_0. Plugging in zero, p(0)=0, gives that c_0=0. Taking the derivative and plugging in zero, p^\prime(0)=0, gives that c_1=0. Similarly we get that each c_i is zero, and p is the zero polynomial.