Linear dependence and independence of vectors. Basis of a system of vectors. Properties of linear dependence and independence


The concepts of linear dependence and independence of a system of vectors are very important when studying vector algebra, since the concepts of dimension and basis of space are based on them. In this article we will give definitions, consider the properties of linear dependence and independence, obtain an algorithm for studying a system of vectors for linear dependence, and analyze in detail the solutions of examples.


Definition of linear dependence and linear independence of a system of vectors.

Let's consider a set of p n-dimensional vectors, which we denote $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_p$. Let's make a linear combination of these vectors with arbitrary numbers $\alpha_1, \alpha_2, \dots, \alpha_p$ (real or complex): $\alpha_1 \mathbf{a}_1 + \alpha_2 \mathbf{a}_2 + \dots + \alpha_p \mathbf{a}_p$. Based on the definition of operations on n-dimensional vectors, as well as the properties of the operations of adding vectors and multiplying a vector by a number, this linear combination represents some n-dimensional vector $\mathbf{b}$, that is, $\alpha_1 \mathbf{a}_1 + \alpha_2 \mathbf{a}_2 + \dots + \alpha_p \mathbf{a}_p = \mathbf{b}$.

This is how we approached the definition of the linear dependence of a system of vectors.

Definition.

If the linear combination $\alpha_1 \mathbf{a}_1 + \alpha_2 \mathbf{a}_2 + \dots + \alpha_p \mathbf{a}_p$ can represent the zero vector when at least one of the numbers $\alpha_1, \alpha_2, \dots, \alpha_p$ is non-zero, then the system of vectors $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_p$ is called linearly dependent.

Definition.

If the linear combination $\alpha_1 \mathbf{a}_1 + \alpha_2 \mathbf{a}_2 + \dots + \alpha_p \mathbf{a}_p$ is the zero vector only when all the numbers $\alpha_1, \alpha_2, \dots, \alpha_p$ are equal to zero, then the system of vectors $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_p$ is called linearly independent.
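For example (a direct check of the definitions): the vectors $\mathbf{a}_1 = (1, 2)$ and $\mathbf{a}_2 = (2, 4)$ are linearly dependent, since $2 \mathbf{a}_1 - \mathbf{a}_2 = \mathbf{0}$ is a zero linear combination with non-zero coefficients; the vectors $\mathbf{e}_1 = (1, 0)$ and $\mathbf{e}_2 = (0, 1)$ are linearly independent, since $\alpha_1 \mathbf{e}_1 + \alpha_2 \mathbf{e}_2 = (\alpha_1, \alpha_2) = \mathbf{0}$ forces $\alpha_1 = \alpha_2 = 0$.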

Properties of linear dependence and independence.

Based on these definitions, we formulate and prove properties of linear dependence and linear independence of a system of vectors.

    If several vectors are added to a linearly dependent system of vectors, the resulting system will be linearly dependent.

    Proof.

    Since the system of vectors $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_p$ is linearly dependent, the equality $\alpha_1 \mathbf{a}_1 + \alpha_2 \mathbf{a}_2 + \dots + \alpha_p \mathbf{a}_p = \mathbf{0}$ is possible when at least one of the numbers $\alpha_1, \alpha_2, \dots, \alpha_p$ is non-zero. Let $\alpha_k \neq 0$.

    Let's add s more vectors $\mathbf{a}_{p+1}, \mathbf{a}_{p+2}, \dots, \mathbf{a}_{p+s}$ to the original system of vectors, and we obtain the system $\mathbf{a}_1, \dots, \mathbf{a}_p, \mathbf{a}_{p+1}, \dots, \mathbf{a}_{p+s}$. Since $\alpha_1 \mathbf{a}_1 + \dots + \alpha_p \mathbf{a}_p = \mathbf{0}$ and $0 \cdot \mathbf{a}_{p+1} + \dots + 0 \cdot \mathbf{a}_{p+s} = \mathbf{0}$, the linear combination of vectors of this system of the form

    $\alpha_1 \mathbf{a}_1 + \dots + \alpha_p \mathbf{a}_p + 0 \cdot \mathbf{a}_{p+1} + \dots + 0 \cdot \mathbf{a}_{p+s}$

    represents the zero vector, and $\alpha_k \neq 0$. Consequently, the resulting system of vectors is linearly dependent.

    If several vectors are excluded from a linearly independent system of vectors, then the resulting system will be linearly independent.

    Proof.

    Let us assume that the resulting system is linearly dependent. By adding all the discarded vectors to this system of vectors, we obtain the original system of vectors. By condition, it is linearly independent, but due to the previous property of linear dependence, it must be linearly dependent. We have arrived at a contradiction, therefore our assumption is incorrect.

    If a system of vectors has at least one zero vector, then such a system is linearly dependent.

    Proof.

    Let the vector $\mathbf{a}_k$ in this system of vectors be zero. Let us assume that the original system of vectors is linearly independent. Then the vector equality $\alpha_1 \mathbf{a}_1 + \dots + \alpha_p \mathbf{a}_p = \mathbf{0}$ is possible only when $\alpha_1 = \alpha_2 = \dots = \alpha_p = 0$. However, if we take any $\alpha_k$ different from zero and set all the other coefficients to zero, the equality will still be true, since $\alpha_k \cdot \mathbf{0} = \mathbf{0}$. Consequently, our assumption is incorrect, and the original system of vectors is linearly dependent.

    If a system of vectors is linearly dependent, then at least one of its vectors is linearly expressed in terms of the others. If a system of vectors is linearly independent, then none of the vectors can be expressed in terms of the others.

    Proof.

    First, let's prove the first statement.

    Let the system of vectors $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_p$ be linearly dependent; then there is at least one non-zero number $\alpha_k$ such that the equality $\alpha_1 \mathbf{a}_1 + \dots + \alpha_p \mathbf{a}_p = \mathbf{0}$ is true. This equality can be resolved with respect to $\mathbf{a}_k$, since $\alpha_k \neq 0$; in this case we have

    $\mathbf{a}_k = -\frac{\alpha_1}{\alpha_k} \mathbf{a}_1 - \dots - \frac{\alpha_{k-1}}{\alpha_k} \mathbf{a}_{k-1} - \frac{\alpha_{k+1}}{\alpha_k} \mathbf{a}_{k+1} - \dots - \frac{\alpha_p}{\alpha_k} \mathbf{a}_p.$

    Consequently, the vector $\mathbf{a}_k$ is linearly expressed through the remaining vectors of the system, which is what needed to be proved.

    Now let's prove the second statement.

    Since the system of vectors $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_p$ is linearly independent, the equality $\alpha_1 \mathbf{a}_1 + \dots + \alpha_p \mathbf{a}_p = \mathbf{0}$ is possible only for $\alpha_1 = \alpha_2 = \dots = \alpha_p = 0$.

    Let us assume that some vector of the system is expressed linearly in terms of the others. Let this vector be $\mathbf{a}_k$; then $\mathbf{a}_k = \beta_1 \mathbf{a}_1 + \dots + \beta_{k-1} \mathbf{a}_{k-1} + \beta_{k+1} \mathbf{a}_{k+1} + \dots + \beta_p \mathbf{a}_p$. This equality can be rewritten as $\beta_1 \mathbf{a}_1 + \dots + \beta_{k-1} \mathbf{a}_{k-1} + (-1) \cdot \mathbf{a}_k + \beta_{k+1} \mathbf{a}_{k+1} + \dots + \beta_p \mathbf{a}_p = \mathbf{0}$; on its left side there is a linear combination of the system's vectors, and the coefficient $-1$ in front of the vector $\mathbf{a}_k$ is different from zero, which indicates a linear dependence of the original system of vectors. So we came to a contradiction, which means the property is proven.

An important statement follows from the last two properties:
if a system of vectors contains vectors $\mathbf{a}$ and $\lambda \mathbf{a}$, where $\lambda$ is an arbitrary number, then it is linearly dependent.
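As a quick numerical illustration of the fourth property, here is a small Python sketch (using numpy; the vectors a1, a2, a3 below are made-up examples) that recovers the coefficients expressing one vector of a dependent system through the others:

```python
import numpy as np

# Made-up dependent system: a3 = 2*a1 - a2.
a1 = np.array([1.0, 0.0, 2.0])
a2 = np.array([0.0, 1.0, 1.0])
a3 = 2 * a1 - a2

# Solve a3 = x1*a1 + x2*a2 in the least-squares sense; for a truly
# dependent system the residual is numerically zero.
A = np.column_stack([a1, a2])
coeffs, residuals, rank, _ = np.linalg.lstsq(A, a3, rcond=None)
print(coeffs)  # [ 2. -1.], i.e. a3 = 2*a1 - 1*a2
```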

Study of a system of vectors for linear dependence.

Let's pose a problem: we need to establish a linear dependence or linear independence of a system of vectors.

The logical question is: “how to solve it?”

Something useful from a practical point of view can be learned from the definitions and properties of linear dependence and independence of a system of vectors discussed above. These definitions and properties allow us to establish linear dependence directly in a few cases: when the system contains the zero vector, when it contains a pair of vectors of the form $\mathbf{a}$ and $\lambda \mathbf{a}$, or, more generally, when one of the vectors is visibly a linear combination of the others.

What to do in other cases, which are the majority?

Let's figure this out.

Let us recall the formulation of the theorem on the rank of a matrix, which we presented in the article on the rank of a matrix.

Theorem.

Let r be the rank of a matrix A of order p by n, $r \leq \min(p, n)$. Let M be a basis minor of the matrix A. All rows (all columns) of the matrix A that do not participate in the formation of the basis minor M are linearly expressed through the rows (columns) of the matrix that generate the basis minor M.

Now let us explain the connection between the theorem on the rank of a matrix and the study of a system of vectors for linear dependence.

Let's compose a matrix A whose rows are the vectors of the system under study:

$A = \begin{pmatrix} a_{11} & a_{12} & \dots & a_{1n} \\ a_{21} & a_{22} & \dots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{p1} & a_{p2} & \dots & a_{pn} \end{pmatrix},$

where the i-th row consists of the coordinates of the vector $\mathbf{a}_i$.

What would linear independence of a system of vectors mean?

From the fourth property of linear independence of a system of vectors, we know that none of the vectors of the system can be expressed in terms of the others. In other words, no row of matrix A will be linearly expressed in terms of other rows, therefore, linear independence of the system of vectors will be equivalent to the condition Rank(A)=p.

What will the linear dependence of the system of vectors mean?

Everything is very simple: at least one row of the matrix A will be linearly expressed in terms of the others; therefore, linear dependence of the system of vectors is equivalent to the condition Rank(A) < p.

So, the problem of studying a system of vectors for linear dependence is reduced to the problem of finding the rank of a matrix composed of vectors of this system.

It should be noted that for p > n the system of vectors is always linearly dependent, since in this case Rank(A) ≤ min(p, n) = n < p.

Comment: when compiling matrix A, the vectors of the system can be taken not as rows, but as columns.
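A minimal Python sketch of this rank criterion (using numpy; the function name and the example vectors are ours, not from the original article):

```python
import numpy as np

def is_linearly_independent(vectors):
    """Stack the vectors as rows of a matrix A and compare Rank(A)
    with the number of vectors p: equality means independence."""
    A = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(A) == A.shape[0]

# Made-up examples:
print(is_linearly_independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # True
print(is_linearly_independent([[1, 2, 3], [2, 4, 6]]))             # False: row2 = 2*row1
```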

Algorithm for studying a system of vectors for linear dependence.

The algorithm follows directly from the previous paragraph: compose the matrix A whose rows are the vectors of the system under study, find Rank(A), and compare it with the number of vectors p; if Rank(A) = p, the system is linearly independent, and if Rank(A) < p, it is linearly dependent. Let's look at the algorithm using examples.

Examples of studying a system of vectors for linear dependence.

Example.

A system of vectors $\mathbf{a}, \mathbf{b}, \mathbf{c}$ is given, in which the vector $\mathbf{c}$ is zero. Examine it for linear dependence.

Solution.

Since the vector c is zero, the original system of vectors is linearly dependent due to the third property.

Answer:

The vector system is linearly dependent.

Example.

Examine a system of vectors for linear dependence.

Solution.

It is not difficult to notice that the coordinates of the vector c are equal to the corresponding coordinates of another vector of the system multiplied by 3, so the system contains a pair of vectors of the form $\mathbf{a}$ and $\lambda \mathbf{a}$. Therefore, by the statement above, the original system of vectors is linearly dependent.

Answer:

The vector system is linearly dependent.

Linear dependence and independence of vectors

Definitions of linearly dependent and independent vector systems

Definition 22

Let us have a system of n vectors $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n$ and a set of numbers $\alpha_1, \alpha_2, \dots, \alpha_n$. Then the vector

$\alpha_1 \mathbf{a}_1 + \alpha_2 \mathbf{a}_2 + \dots + \alpha_n \mathbf{a}_n \qquad (11)$

is called a linear combination of the given system of vectors with the given set of coefficients.

Definition 23

A system of vectors $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n$ is called linearly dependent if there is a set of coefficients $\alpha_1, \alpha_2, \dots, \alpha_n$, of which at least one is not equal to zero, such that the linear combination of the given system of vectors with this set of coefficients is equal to the zero vector:

$\alpha_1 \mathbf{a}_1 + \alpha_2 \mathbf{a}_2 + \dots + \alpha_n \mathbf{a}_n = \mathbf{0}. \qquad (12)$

If, say, $\alpha_1 \neq 0$, then

$\mathbf{a}_1 = -\frac{\alpha_2}{\alpha_1} \mathbf{a}_2 - \dots - \frac{\alpha_n}{\alpha_1} \mathbf{a}_n.$

Definition 24 (through the representation of one vector of the system as a linear combination of the others)

A system of vectors $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n$ is called linearly dependent if at least one of the vectors of this system can be represented as a linear combination of the remaining vectors of this system.

Statement 3

Definitions 23 and 24 are equivalent.

Definition 25 (via the zero linear combination)

A system of vectors $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n$ is called linearly independent if a zero linear combination of this system is possible only when all the coefficients $\alpha_1, \alpha_2, \dots, \alpha_n$ are equal to zero.

Definition 26 (via the impossibility of representing one vector of the system as a linear combination of the others)

A system of vectors $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n$ is called linearly independent if none of the vectors of this system can be represented as a linear combination of the other vectors of this system.

Properties of linearly dependent and independent vector systems

Theorem 2 (zero vector in the system of vectors)

If a system of vectors has a zero vector, then the system is linearly dependent.

Let, say, $\mathbf{a}_1 = \mathbf{0}$. Take $\alpha_1 = 1$ and $\alpha_2 = \dots = \alpha_n = 0$. Then

$1 \cdot \mathbf{0} + 0 \cdot \mathbf{a}_2 + \dots + 0 \cdot \mathbf{a}_n = \mathbf{0}.$

We get a zero linear combination with a non-zero coefficient; therefore, by the definition of a linearly dependent system of vectors through a zero linear combination (12), the system is linearly dependent.

Theorem 3 (dependent subsystem in a vector system)

If a system of vectors has a linearly dependent subsystem, then the entire system is linearly dependent.

Let $\mathbf{a}_1, \dots, \mathbf{a}_k$ be a linearly dependent subsystem of the system $\mathbf{a}_1, \dots, \mathbf{a}_n$, that is, $\alpha_1 \mathbf{a}_1 + \dots + \alpha_k \mathbf{a}_k = \mathbf{0}$, where at least one of the coefficients $\alpha_1, \dots, \alpha_k$ is not equal to zero. Supplementing this combination with zero coefficients, we get

$\alpha_1 \mathbf{a}_1 + \dots + \alpha_k \mathbf{a}_k + 0 \cdot \mathbf{a}_{k+1} + \dots + 0 \cdot \mathbf{a}_n = \mathbf{0}.$

This means, by Definition 23, the system is linearly dependent.

Theorem 4

Any subsystem of a linearly independent system is linearly independent.

By contradiction. Suppose the system is linearly independent yet has a linearly dependent subsystem. Then, according to Theorem 3, the entire system would also be linearly dependent. Contradiction. Consequently, a subsystem of a linearly independent system cannot be linearly dependent.

Geometric meaning of linear dependence and independence of a system of vectors

Theorem 5

Two vectors $\mathbf{a}$ and $\mathbf{b}$ are linearly dependent if and only if they are collinear: $\mathbf{a} \parallel \mathbf{b}$.

Necessity.

Let $\mathbf{a}$ and $\mathbf{b}$ be linearly dependent; then there are numbers $\alpha, \beta$, not both zero, such that $\alpha \mathbf{a} + \beta \mathbf{b} = \mathbf{0}$. Suppose, say, $\alpha \neq 0$. Then $\mathbf{a} = -\frac{\beta}{\alpha} \mathbf{b}$, i.e. $\mathbf{a} \parallel \mathbf{b}$.

Sufficiency.

Let $\mathbf{a} \parallel \mathbf{b}$; then $\mathbf{a} = \lambda \mathbf{b}$ for some number $\lambda$, so $1 \cdot \mathbf{a} - \lambda \mathbf{b} = \mathbf{0}$ is a non-trivial zero linear combination. Consequently, $\mathbf{a}$ and $\mathbf{b}$ are linearly dependent.

Corollary 5.1

The zero vector is collinear to any vector.

Corollary 5.2

In order for two vectors to be linearly independent, it is necessary and sufficient that they not be collinear.
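For 3-dimensional vectors, Theorem 5 can be checked numerically: two vectors are collinear exactly when their cross product is the zero vector. A small Python sketch (numpy; the sample vectors are made up):

```python
import numpy as np

def are_collinear(a, b, tol=1e-12):
    """For 3-dimensional vectors: a and b are linearly dependent
    (collinear) iff their cross product is the zero vector."""
    return np.linalg.norm(np.cross(a, b)) < tol

print(are_collinear([1, 2, 3], [2, 4, 6]))  # True:  b = 2a
print(are_collinear([1, 2, 3], [2, 4, 7]))  # False
```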

Theorem 6

In order for a system of three vectors to be linearly dependent, it is necessary and sufficient that these vectors be coplanar.

Necessity.

Let $\mathbf{a}, \mathbf{b}, \mathbf{c}$ be linearly dependent; therefore, one vector, say $\mathbf{c}$, can be represented as a linear combination of the other two:

$\mathbf{c} = \alpha \mathbf{a} + \beta \mathbf{b}, \qquad (13)$

where $\alpha$ and $\beta$ are some numbers. According to the parallelogram rule, $\mathbf{c}$ is the diagonal of the parallelogram with sides $\alpha \mathbf{a}$ and $\beta \mathbf{b}$; but a parallelogram is a flat figure, so $\mathbf{c}$ lies in the plane of $\mathbf{a}$ and $\mathbf{b}$, and the vectors $\mathbf{a}, \mathbf{b}, \mathbf{c}$ are coplanar.

Sufficiency.

Let $\mathbf{a}, \mathbf{b}, \mathbf{c}$ be coplanar. Apply the three vectors to a common point O, so that they lie in one plane. Drawing through the endpoint of $\mathbf{c}$ lines parallel to $\mathbf{a}$ and $\mathbf{b}$, we obtain, by the parallelogram rule, a decomposition $\mathbf{c} = \alpha \mathbf{a} + \beta \mathbf{b}$. Consequently, the vectors $\mathbf{a}, \mathbf{b}, \mathbf{c}$ are linearly dependent.

Corollary 6.1

The zero vector is coplanar to any pair of vectors.

Corollary 6.2

In order for the vectors $\mathbf{a}, \mathbf{b}, \mathbf{c}$ to be linearly independent, it is necessary and sufficient that they not be coplanar.

Corollary 6.3

Any vector of a plane can be represented as a linear combination of any two non-collinear vectors of the same plane.
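Theorem 6 also has a convenient numerical form: three vectors are coplanar exactly when their scalar triple product, i.e. the determinant of the matrix whose rows are the three vectors, vanishes. A Python sketch (numpy; the sample vectors are made up):

```python
import numpy as np

def are_coplanar(a, b, c, tol=1e-12):
    """Three 3-dimensional vectors are linearly dependent (coplanar)
    iff their scalar triple product det([a, b, c]) vanishes."""
    return abs(np.linalg.det(np.array([a, b, c], dtype=float))) < tol

print(are_coplanar([1, 0, 0], [0, 1, 0], [1, 1, 0]))  # True: all in the xy-plane
print(are_coplanar([1, 0, 0], [0, 1, 0], [0, 0, 1]))  # False
```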

Theorem 7

Any four vectors in space are linearly dependent.

Consider four vectors $\mathbf{a}, \mathbf{b}, \mathbf{c}, \mathbf{d}$ applied to a common point O. If some three of them are coplanar, these three are linearly dependent by Theorem 6, and then the whole system is linearly dependent by Theorem 3. So let $\mathbf{a}, \mathbf{b}, \mathbf{c}$ be non-coplanar, and let D be the endpoint of $\mathbf{d}$.

Draw through D three planes parallel, respectively, to the pairs of vectors $\mathbf{b}, \mathbf{c}$; $\mathbf{a}, \mathbf{c}$; $\mathbf{a}, \mathbf{b}$. Together with the planes spanned by these pairs at the point O, they bound a parallelepiped with diagonal OD. Going along its edges from O to D and using the parallelogram rule on the faces (each face is a parallelogram), we obtain

$\mathbf{d} = \alpha \mathbf{a} + \beta \mathbf{b} + \gamma \mathbf{c}$

for some numbers $\alpha, \beta, \gamma$. Thus $\mathbf{d}$ is represented as a linear combination of $\mathbf{a}, \mathbf{b}, \mathbf{c}$, and by Definition 24 the system of vectors is linearly dependent.

Corollary 7.1

The sum of three non-coplanar vectors in space is a vector that coincides with the diagonal of the parallelepiped built on these three vectors applied to a common origin; the origin of the sum vector coincides with the common origin of the three vectors.

Corollary 7.2

If we take three non-coplanar vectors in space, then any vector of this space can be decomposed into a linear combination of these three vectors.
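Corollary 7.2 is easy to verify numerically: with three non-coplanar vectors as the columns of a matrix B, the decomposition coefficients of any vector d solve the system B x = d. A Python sketch (numpy; the basis and the target vector are made-up examples):

```python
import numpy as np

# Made-up non-coplanar basis vectors and a target vector d.
a = np.array([1.0, 0.0, 0.0])
b = np.array([1.0, 1.0, 0.0])
c = np.array([1.0, 1.0, 1.0])
d = np.array([2.0, 3.0, 4.0])

# Columns of B are the basis vectors; solve B @ x = d.
B = np.column_stack([a, b, c])
x = np.linalg.solve(B, d)
print(x)                      # decomposition coefficients alpha, beta, gamma
print(np.allclose(B @ x, d))  # True: d = alpha*a + beta*b + gamma*c
```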

In other words, the linear dependence of a group of vectors means that there is a vector among them that can be represented as a linear combination of the other vectors of the group.

Suppose, say, that x = βy + ... + γz. Then x - βy - ... - γz = 0 is a non-trivial zero linear combination (the coefficient of x is 1). Therefore the vector x is linearly dependent on the vectors of this group.

Vectors x, y, ..., z are called linearly independent if it follows from the equality

αx + βy + ... + γz = 0 (0)

that α = β = ... = γ = 0.

That is, a group of vectors is linearly independent if no vector in it can be represented as a linear combination of the other vectors of the group.

Determination of linear dependence of vectors

Let m row vectors of order n be given:

e1 = (e11, e12, ..., e1n), e2 = (e21, e22, ..., e2n), ..., em = (em1, em2, ..., emn). (2)

Using Gaussian elimination, we reduce matrix (2) to upper triangular form (to keep track of the original rows, an index column can be appended on the right; its elements change only when rows are rearranged). After m elimination steps we get rows with indices i1, i2, ..., im, obtained by a possible permutation of the rows. Among the resulting rows we exclude those that correspond to the zero row vector. The remaining rows form linearly independent vectors. Note that when composing matrix (2), by changing the order of the row vectors, one can obtain a different group of linearly independent vectors; but the subspace spanned by both groups of vectors coincides.
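A rough Python implementation of this procedure (a sketch under the assumption that floating-point Gaussian elimination with partial pivoting is acceptable; the function name and the example rows are ours):

```python
import numpy as np

def independent_rows(rows, tol=1e-12):
    """Gaussian elimination with row swaps (partial pivoting); returns
    the original indices of a maximal linearly independent subset."""
    A = np.array(rows, dtype=float)
    idx = list(range(len(A)))            # track original row numbers
    r, kept = 0, []
    for col in range(A.shape[1]):
        if r == len(A):
            break
        pivot = max(range(r, len(A)), key=lambda i: abs(A[i, col]))
        if abs(A[pivot, col]) < tol:
            continue                     # no pivot in this column
        A[[r, pivot]] = A[[pivot, r]]    # swap rows
        idx[r], idx[pivot] = idx[pivot], idx[r]
        A[r + 1:] -= np.outer(A[r + 1:, col] / A[r, col], A[r])
        kept.append(idx[r])
        r += 1
    return sorted(kept)

print(independent_rows([[1, 2, 3], [2, 4, 6], [0, 1, 1]]))  # [1, 2]
```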

Definition. A linear combination of vectors a1, ..., an with coefficients x1, ..., xn is the vector

x1 a1 + ... + xn an.

Definition. A linear combination x1 a1 + ... + xn an is called trivial if all the coefficients x1, ..., xn are equal to zero.

Definition. A linear combination x1 a1 + ... + xn an is called non-trivial if at least one of the coefficients x1, ..., xn is not equal to zero.

Definition. The vectors a1, ..., an are called linearly independent if there is no non-trivial combination of these vectors equal to the zero vector.

That is, the vectors a1, ..., an are linearly independent if the equality x1 a1 + ... + xn an = 0 holds if and only if x1 = 0, ..., xn = 0.

Definition. The vectors a1, ..., an are called linearly dependent if there is a non-trivial combination of these vectors equal to the zero vector.

Properties of linearly dependent vectors:

    For 2- and 3-dimensional vectors: two linearly dependent vectors are collinear (and, conversely, two collinear vectors are linearly dependent).

    For 3-dimensional vectors: three linearly dependent vectors are coplanar (and, conversely, three coplanar vectors are linearly dependent).

    For n-dimensional vectors: any n + 1 vectors are always linearly dependent.

Examples of problems on linear dependence and linear independence of vectors:

Example 1. Check whether the vectors a = (3; 4; 5), b = (-3; 0; 5), c = (4; 4; 4), d = (3; 4; 0) are linearly independent.

Solution:

The vectors will be linearly dependent, since the number of vectors (four) is greater than their dimension (three).

Example 2. Check whether the vectors a = (1; 1; 1), b = (1; 2; 0), c = (0; -1; 1) are linearly independent.

Solution:

Let us find the values of the coefficients for which the linear combination of these vectors equals the zero vector:

x1 a + x2 b + x3 c = 0

This vector equation can be written as a system of linear equations

x1 + x2 = 0
x1 + 2x2 - x3 = 0
x1 + x3 = 0

Let's solve this system using the Gauss method:

1 1  0   0
1 2 -1   0
1 0  1   0

subtract the first row from the second; subtract the first row from the third:

1  1  0   0
0  1 -1   0
0 -1  1   0

subtract the second row from the first; add the second row to the third:

1 0  1   0
0 1 -1   0
0 0  0   0

The last row is zero, so the system has infinitely many solutions; that is, there is a non-zero set of numbers x1, x2, x3 for which the linear combination of the vectors a, b, c equals the zero vector. For example, taking x1 = 1, x2 = -1, x3 = -1:

a - b - c = 0

which means the vectors a, b, c are linearly dependent.

Answer: vectors a, b, c are linearly dependent.
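The same conclusion can be reproduced in Python (a sketch using numpy with the vectors of this example):

```python
import numpy as np

a = np.array([1, 1, 1])
b = np.array([1, 2, 0])
c = np.array([0, -1, 1])

M = np.array([a, b, c])
print(np.linalg.matrix_rank(M))  # 2 < 3, so the vectors are dependent
print(a - b - c)                 # [0 0 0], confirming a - b - c = 0
```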

Example 3. Check whether the vectors a = (1; 1; 1), b = (1; 2; 0), c = (0; -1; 2) are linearly independent.

Solution: Let us find the values of the coefficients at which the linear combination of these vectors will be equal to the zero vector.

x1 a + x2 b + x3 c = 0

This vector equation can be written as a system of linear equations

x1 + x2 = 0
x1 + 2x2 - x3 = 0
x1 + 2x3 = 0

Let's solve this system using the Gauss method

1 1  0   0
1 2 -1   0
1 0  2   0

subtract the first row from the second; subtract the first row from the third:

1  1  0   0
0  1 -1   0
0 -1  2   0

subtract the second row from the first; add the second row to the third:

1 0  1   0
0 1 -1   0
0 0  1   0

The resulting triangular system has no zero rows, so it has only the trivial solution x1 = x2 = x3 = 0. Consequently, the linear combination of the vectors equals the zero vector only when all the coefficients are zero, and the vectors a, b, c are linearly independent.

Answer: the vectors a, b, c are linearly independent.

Task 1. Find out whether the system of vectors is linearly independent. The system of vectors will be specified by the matrix of the system, the columns of which consist of the coordinates of the vectors.

.

Solution. Let a linear combination of the vectors be equal to zero. Writing this equality in coordinates, we obtain the following system of equations:

.

Such a system of equations is called triangular. It has only one solution, the trivial one. Therefore, the vectors are linearly independent.

Task 2. Find out whether the system of vectors is linearly independent.

.

Solution. The vectors are linearly independent (see Task 1). Let us prove that the remaining vector is a linear combination of them. Its expansion coefficients are determined from the system of equations

.

This system, like a triangular one, has a unique solution.

Therefore, the system of vectors is linearly dependent.

Comment. Matrices of the same type as in Task 1 are called triangular, and those as in Task 2 are called step-triangular. The question of the linear dependence of a system of vectors is easily solved if the matrix composed of the coordinates of these vectors is step-triangular. If the matrix does not have this special form, then, using elementary row transformations, which preserve the linear relations between the columns, it can be reduced to step-triangular form.

Elementary row transformations (ERT) of a matrix are the following operations on the matrix:

1) permutation of rows;

2) multiplication of a row by a non-zero number;

3) addition to a row of another row multiplied by an arbitrary number.
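For illustration, the three transformations are one-line matrix operations in Python (numpy; the matrix A is a made-up example):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])

A[[0, 1]] = A[[1, 0]]   # 1) swap rows 0 and 1
A[0] *= 2.0             # 2) multiply a row by a non-zero number
A[2] += -5.0 * A[0]     # 3) add a multiple of another row to a row

print(A)
```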

Task 3. Find a maximal linearly independent subsystem and calculate the rank of the system of vectors

.

Solution. Let us reduce the matrix of the system to step-triangular form using ERT. The rows of the matrix being transformed are denoted by their numbers, and next to the arrow we indicate the actions on the rows that must be performed to obtain the rows of the new matrix.


.

Obviously, the first two columns of the resulting matrix are linearly independent, the third column is their linear combination, and the fourth is not expressed through the first two. The vectors corresponding to the first, second and fourth columns are called basic. They form a maximal linearly independent subsystem of the system, and the rank of the system is three.



Basis, coordinates

Task 4. Find the basis, and the coordinates of vectors in this basis, of the set of geometric vectors whose coordinates satisfy the given condition.

Solution. The set is a plane passing through the origin. An arbitrary basis on a plane consists of two non-collinear vectors. The coordinates of the vectors in the selected basis are determined by solving the corresponding system of linear equations.

There is another way to solve this problem, when you can find the basis using the coordinates.

The space coordinates of a vector are not coordinates on the plane, since they are related by the given relation, that is, they are not independent. The two independent variables (they are called free) uniquely define a vector on the plane and, therefore, can be chosen as coordinates in it. Then the basis consists of the two vectors lying in the plane that correspond to the sets of free variables (1, 0) and (0, 1).
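As a concrete sketch of this construction in Python (the plane equation x1 + x2 + x3 = 0 below is a made-up stand-in for the task's unstated condition; x2 and x3 play the role of the free variables):

```python
import numpy as np

# Hypothetical plane through the origin: x1 + x2 + x3 = 0, i.e. x1 = -x2 - x3.
def point(x2, x3):
    return np.array([-x2 - x3, x2, x3])

e1 = point(1, 0)   # free variables (1, 0) -> basis vector (-1, 1, 0)
e2 = point(0, 1)   # free variables (0, 1) -> basis vector (-1, 0, 1)

v = point(2, 5)    # any vector of the plane...
print(np.allclose(v, 2 * e1 + 5 * e2))  # ...has coordinates (2, 5) in this basis
```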

Task 5. Find the basis and coordinates of the vectors in this basis on the set of all vectors in space whose odd coordinates are equal to each other.

Solution. Let us choose, as in the previous task, coordinates in the space.

Since the odd coordinates of a vector of the set are equal to each other, the free variables uniquely determine such a vector and can therefore serve as its coordinates. The corresponding basis consists of the vectors obtained by setting one free variable equal to one and the rest to zero.

Task 6. Find the basis, and the coordinates of vectors in this basis, of the set of all matrices of the given form, whose entries are arbitrary numbers.

Solution. Each matrix from the set is uniquely representable as a linear combination of fixed matrices with the arbitrary entries as coefficients. This relation is the expansion of a vector of the set with respect to the corresponding basis, and the coefficients of the expansion are its coordinates.

Task 7. Find the dimension and basis of the linear hull of a system of vectors

.

Solution. Using ERT, we transform the matrix composed of the coordinates of the system's vectors to step-triangular form.




.

Some columns of the last matrix are linearly independent, and the remaining columns are linearly expressed through them. Therefore, the vectors corresponding to the independent columns form a basis of the linear hull, and the dimension of the hull equals the number of these vectors.

Comment. A basis of the linear hull is not chosen uniquely; for example, other vectors of the system may also form a basis of it.
