20/09/2014

Linearly Dependent & Independent

Linearly Dependent:
A subset S of a vector space V is called linearly dependent if there exist finitely many distinct vectors v1, v2, ..., vn in S and scalars a1, a2, ..., an, not all zero, such that
 a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0.
Note that the zero on the right is the zero vector, not the number zero.
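For example, in R^2 the vectors v1 = (1, 2) and v2 = (2, 4) are linearly dependent: choosing the scalars a1 = 2 and a2 = -1, which are not all zero, gives
 2 v_1 + (-1) v_2 = (2, 4) - (2, 4) = (0, 0),
a non-trivial representation of the zero vector.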
For any vectors u1, u2, ..., un we have that
 0 u_1 + 0 u_2 + \cdots + 0 u_n = 0.
This is called the trivial representation of 0 as a linear combination of u1, u2, ..., un. It motivates a very simple characterization of both linear independence and linear dependence: for a set to be linearly dependent, there must exist a non-trivial representation of 0 as a linear combination of vectors in the set.

Linearly Independent:
A subset S of a vector space V is then said to be linearly independent if it is not linearly dependent. In other words, a set is linearly independent if the only representation of 0 as a linear combination of its vectors is the trivial one.
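For example, the standard basis vectors e1 = (1, 0) and e2 = (0, 1) of R^2 are linearly independent: the equation
 a_1 e_1 + a_2 e_2 = (a_1, a_2) = (0, 0)
forces a1 = a2 = 0, so the only representation of 0 is the trivial one.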
Note that in both definitions we also say that the vectors in the subset S are linearly dependent or linearly independent.
More generally, let V be a vector space over a field K, and let {v_i | i ∈ I} be a family of elements of V. The family is linearly dependent over K if there exists a family {a_j | j ∈ J} of elements of K, not all zero, such that
 \sum_{j \in J} a_j v_j = 0,
where the index set J is a nonempty, finite subset of I.
A set X of elements of V is linearly independent if the corresponding family {x}_{x ∈ X} is linearly independent.
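For a finite list of vectors in R^n, the condition can also be checked numerically: the vectors are linearly independent exactly when the matrix whose columns are those vectors has rank equal to the number of vectors. Below is a minimal sketch of that rank test using NumPy; the helper name is_linearly_independent is just an illustrative choice, and the numerical rank is only computed up to floating-point tolerance.

    import numpy as np

    def is_linearly_independent(vectors):
        # Stack the given vectors as the columns of a matrix.
        A = np.column_stack(vectors)
        # The columns are linearly independent exactly when the rank
        # of the matrix equals the number of vectors.
        return np.linalg.matrix_rank(A) == len(vectors)

    # (1, 2) and (2, 4) are linearly dependent; (1, 0) and (0, 1) are independent.
    print(is_linearly_independent([np.array([1, 2]), np.array([2, 4])]))  # False
    print(is_linearly_independent([np.array([1, 0]), np.array([0, 1])]))  # True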