In linear algebra, a set of elements of a vector space is **linearly independent** if none of the vectors in the set can be written as a linear combination of **finitely** many other vectors in the set. For instance, in three-dimensional Euclidean space **R**^{3}, the three vectors (1, 0, 0), (0, 1, 0) and (0, 0, 1) are linearly independent, while (2, −1, 1), (1, 0, 1) and (3, −1, 2) are not (since the third vector is the sum of the first two). Vectors which are not linearly independent are called **linearly dependent**.

## Definition
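The dependence claimed in the introduction can be checked directly; the following is a minimal sketch verifying that (3, −1, 2) is the component-wise sum of the other two vectors:

```python
# Check the dependence claimed above: (3, -1, 2) = (2, -1, 1) + (1, 0, 1).
v1 = (2, -1, 1)
v2 = (1, 0, 1)
v3 = (3, -1, 2)

# Component-wise sum of v1 and v2.
s = tuple(x + y for x, y in zip(v1, v2))
print(s == v3)  # True: v3 is a linear combination of v1 and v2,
                # so the three vectors are linearly dependent
```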
Let *V* be a vector space over a field *K*. If *v*_{1}, *v*_{2}, ..., *v*_{n} are elements of *V*, we say that they are *linearly dependent* over *K* if there exist elements *a*_{1}, *a*_{2}, ..., *a*_{n} in *K*, not all equal to zero, such that:

*a*_{1}*v*_{1} + *a*_{2}*v*_{2} + ... + *a*_{n}*v*_{n} = 0

or, more concisely:

∑_{i=1}^{n} *a*_{i}*v*_{i} = 0

(Note that the zero on the right is the zero element in *V*, not the zero element in *K*.) If there do not exist such field elements, then we say that *v*_{1}, *v*_{2}, ..., *v*_{n} are *linearly independent*. An infinite subset of *V* is said to be linearly independent if all its finite subsets are linearly independent.

To focus the definition on linear independence, we can say that the vectors *v*_{1}, *v*_{2}, ..., *v*_{n} are *linearly independent* if and only if the following condition is satisfied: whenever *a*_{1}, *a*_{2}, ..., *a*_{n} are elements of *K* such that

*a*_{1}*v*_{1} + *a*_{2}*v*_{2} + ... + *a*_{n}*v*_{n} = 0,

then *a*_{i} = 0 for *i* = 1, 2, ..., *n*.

The concept of linear independence is important because a set of vectors which is linearly independent and spans some vector space forms a basis for that vector space.
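The definition suggests a mechanical test: the vectors are independent exactly when the matrix having them as rows has full rank. The sketch below (the helper name `is_linearly_independent` is illustrative, not from the article) implements this with Gaussian elimination over the rationals for exact arithmetic:

```python
from fractions import Fraction

def is_linearly_independent(vectors):
    """Return True iff the only solution of a1*v1 + ... + an*vn = 0
    is a1 = ... = an = 0, i.e. the row matrix has rank n."""
    # Work over the rationals so pivot tests are exact.
    rows = [[Fraction(x) for x in v] for v in vectors]
    rank, ncols = 0, len(rows[0])
    for col in range(ncols):
        # Find a pivot row with a nonzero entry in this column.
        pivot = next((r for r in range(rank, len(rows)) if rows[r][col] != 0), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        # Eliminate this column from every other row.
        for r in range(len(rows)):
            if r != rank and rows[r][col] != 0:
                f = rows[r][col] / rows[rank][col]
                rows[r] = [a - f * b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return rank == len(rows)

print(is_linearly_independent([(1, 0, 0), (0, 1, 0), (0, 0, 1)]))   # True
print(is_linearly_independent([(2, -1, 1), (1, 0, 1), (3, -1, 2)]))  # False
```

Both sample calls reproduce the introduction's examples: the standard basis vectors are independent, while the second triple is dependent.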
## The projective space of linear dependences

A **linear dependence** among vectors *v*_{1}, ..., *v*_{n} is a vector (*a*_{1}, ..., *a*_{n}) with *n* scalar components, not all zero, such that

*a*_{1}*v*_{1} + ... + *a*_{n}*v*_{n} = 0.

If such a linear dependence exists, then the *n* vectors are linearly dependent. It makes sense to identify two linear dependences if one arises as a non-zero multiple of the other, because in this case the two describe the same linear relationship among the vectors. Under this identification, the set of all linear dependences among *v*_{1}, ..., *v*_{n} is a projective space.
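As a concrete illustration (the coefficients below are worked out from the introduction's example, not stated in the article): (1, 1, −1) is a linear dependence among (2, −1, 1), (1, 0, 1), (3, −1, 2), and any nonzero multiple of it describes the same relationship:

```python
# (1, 1, -1) is a linear dependence: 1*(2,-1,1) + 1*(1,0,1) - 1*(3,-1,2) = 0.
vectors = [(2, -1, 1), (1, 0, 1), (3, -1, 2)]

def combine(coeffs, vectors):
    # Form the linear combination sum(a_i * v_i), component by component.
    return tuple(sum(a * v[k] for a, v in zip(coeffs, vectors))
                 for k in range(len(vectors[0])))

print(combine((1, 1, -1), vectors))  # (0, 0, 0)
# A nonzero multiple, e.g. (2, 2, -2), yields the same relation,
# which is why the two are identified as one point of the projective space.
print(combine((2, 2, -2), vectors))  # (0, 0, 0)
```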
## Example I

The vectors (1, 1) and (−3, 2) in **R**^{2} are linearly independent.

Proof: Let *a*, *b* be two real numbers such that:

*a*(1, 1) + *b*(−3, 2) = (0, 0)

Then:

(*a* − 3*b*, *a* + 2*b*) = (0, 0)

and

- *a* − 3*b* = 0
- *a* + 2*b* = 0

Solving for *a* and *b* (subtracting the first equation from the second gives 5*b* = 0), we find that *a* = 0 and *b* = 0.
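The same conclusion can be reached by an equivalent criterion for two vectors in **R**^{2} (a determinant test, not the article's method): the system above has only the trivial solution exactly when the determinant of the coefficient matrix is nonzero.

```python
# Example I via the determinant test: (1, 1) and (-3, 2) are independent
# iff the determinant of the matrix with these vectors as columns is nonzero.
v1, v2 = (1, 1), (-3, 2)
det = v1[0] * v2[1] - v2[0] * v1[1]  # 1*2 - (-3)*1 = 5
print(det != 0)  # True: a*v1 + b*v2 = (0, 0) forces a = b = 0
```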
## Example II

Let *V* = **R**^{n} and consider the following elements in *V*:

e_{1} = (1, 0, 0, ..., 0)
e_{2} = (0, 1, 0, ..., 0)
...
e_{n} = (0, 0, 0, ..., 1)

Then e_{1}, e_{2}, ..., e_{n} are linearly independent.

Proof: Suppose that a_{1}, a_{2}, ..., a_{n} are elements of **R** such that

a_{1}e_{1} + a_{2}e_{2} + ... + a_{n}e_{n} = 0.

Since

a_{1}e_{1} + a_{2}e_{2} + ... + a_{n}e_{n} = (a_{1}, a_{2}, ..., a_{n}),

then a_{i} = 0 for all i in {1, ..., n}.
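The key step of the proof — that a combination of the standard basis vectors simply reproduces its coefficients — can be sketched for a small concrete *n* (here n = 4 and the coefficients are arbitrary sample values, not from the article):

```python
# Standard basis of R^n for n = 4, as in Example II.
n = 4
basis = [tuple(1 if j == i else 0 for j in range(n)) for i in range(n)]

coeffs = (7, -2, 0, 5)  # arbitrary sample coefficients
# The combination a1*e1 + ... + an*en, component by component.
combo = tuple(sum(a * e[k] for a, e in zip(coeffs, basis)) for k in range(n))
print(combo)  # (7, -2, 0, 5): the combination equals the coefficient tuple,
              # so it is the zero vector only when every a_i is zero
```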
## Example III: (calculus required)

Let *V* be the vector space of all functions of a real variable *t*. Then the functions e^{t} and e^{2t} in *V* are linearly independent.

Proof: Suppose *a* and *b* are two real numbers such that

**(1)** *a*e^{t} + *b*e^{2t} = 0

*for all values of t*. We need to show that *a* = 0 and *b* = 0. In order to do this, we differentiate both sides of **(1)** to get

**(2)** *a*e^{t} + 2*b*e^{2t} = 0

which also holds for all values of *t*. Subtracting the first relation from the second relation, we obtain:

*b*e^{2t} = 0

and, by plugging in *t* = 0, we get *b* = 0. From the first relation we then get:

*a*e^{t} = 0

and again for *t* = 0 we find *a* = 0.
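An alternative check (a sketch, not the article's differentiation argument): since relation **(1)** must hold for *all* *t*, evaluating it at just two sample points, *t* = 0 and *t* = 1, already gives a 2×2 system whose determinant is nonzero, forcing *a* = *b* = 0.

```python
from math import exp

# Evaluating a*e^t + b*e^(2t) = 0 at t = 0 and t = 1 gives the system
#   a + b = 0
#   e*a + e^2*b = 0
# with determinant e^2 - e != 0, so a = b = 0 is the only solution.
det = exp(0) * exp(2) - exp(1) * exp(0)
print(det != 0)  # True: the functions admit no nontrivial vanishing combination
```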