System of linear equations


In mathematics, a system of linear equations is a collection of two or more linear equations involving the same variables.
For example,
\[
\begin{aligned}
3x + 2y - z &= 1 \\
2x - 2y + 4z &= -2 \\
-x + \tfrac{1}{2}y - z &= 0
\end{aligned}
\]
is a system of three equations in the three variables x, y, z. A solution to a linear system is an assignment of values to the variables such that all the equations are simultaneously satisfied. In the example above, a solution is given by the ordered triple
\[
(x, y, z) = (1, -2, -2),
\]
since it makes all three equations valid.
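As a quick numerical check, the short sketch below (using NumPy purely as an illustration; it is not part of the mathematical development) substitutes the triple into the three equations:

```python
import numpy as np

# Coefficients and right-hand side of the example system above.
A = np.array([[ 3.0,  2.0, -1.0],
              [ 2.0, -2.0,  4.0],
              [-1.0,  0.5, -1.0]])
b = np.array([1.0, -2.0, 0.0])

x = np.array([1.0, -2.0, -2.0])   # the proposed solution (x, y, z)

# A @ x reproduces b, so all three equations are satisfied.
print(np.allclose(A @ x, b))      # True
```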
Linear systems are a fundamental part of linear algebra, a subject used in most modern mathematics. Computational algorithms for finding the solutions are an important part of numerical linear algebra, and play a prominent role in engineering, physics, chemistry, computer science, and economics. A system of non-linear equations can often be approximated by a linear system, a helpful technique when making a mathematical model or computer simulation of a relatively complex system.
Very often, and in this article, the coefficients and solutions of the equations are constrained to be real or complex numbers, but the theory and algorithms apply to coefficients and solutions in any field. For other algebraic structures, other theories have been developed. For coefficients and solutions in an integral domain, such as the ring of integers, see Linear equation over a ring. For coefficients and solutions that are polynomials, see Gröbner basis. For finding the "best" integer solutions among many, see Integer linear programming. For an example of a more exotic structure to which linear algebra can be applied, see Tropical geometry.

Elementary examples

Trivial example

The system of one equation in one unknown
\[
2x = 4
\]
has the solution
\[
x = 2.
\]
However, most interesting linear systems have at least two equations.

Simple nontrivial example

The simplest kind of nontrivial linear system involves two equations and two variables:
\[
\begin{aligned}
2x + 3y &= 6 \\
4x + 9y &= 15.
\end{aligned}
\]
One method for solving such a system is as follows. First, solve the top equation for x in terms of y:
\[
x = 3 - \tfrac{3}{2}y.
\]
Now substitute this expression for x into the bottom equation:
\[
4\left(3 - \tfrac{3}{2}y\right) + 9y = 15.
\]
This results in a single equation involving only the variable y. Solving gives y = 1, and substituting this back into the equation for x yields x = 3/2. This method generalizes to systems with additional variables.
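The same procedure can be sketched informally in code; the helper name solve_2x2_by_substitution is chosen here only for illustration, and the first coefficient is assumed to be nonzero:

```python
def solve_2x2_by_substitution(a, b, e, c, d, f):
    """Solve  a*x + b*y = e  and  c*x + d*y = f  by substitution.

    Illustrative sketch: assumes a != 0 and a unique solution
    (i.e. a*d - b*c != 0).
    """
    # Solve the top equation for x in terms of y:  x = (e - b*y) / a.
    # Substitute into the bottom equation and solve the result for y.
    y = (f - c * e / a) / (d - c * b / a)
    # Back-substitute to recover x.
    x = (e - b * y) / a
    return x, y

# The example system above: 2x + 3y = 6 and 4x + 9y = 15.
print(solve_2x2_by_substitution(2, 3, 6, 4, 9, 15))   # (1.5, 1.0)
```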

General form

A general system of m linear equations with n unknowns can be written as
\[
\begin{aligned}
a_{11} x_1 + a_{12} x_2 + \cdots + a_{1n} x_n &= b_1 \\
a_{21} x_1 + a_{22} x_2 + \cdots + a_{2n} x_n &= b_2 \\
&\;\;\vdots \\
a_{m1} x_1 + a_{m2} x_2 + \cdots + a_{mn} x_n &= b_m,
\end{aligned}
\]
where \(x_1, x_2, \ldots, x_n\) are the unknowns, \(a_{11}, a_{12}, \ldots, a_{mn}\) are the coefficients of the system, and \(b_1, b_2, \ldots, b_m\) are the constant terms.
Often the coefficients and unknowns are real or complex numbers, but integers and rational numbers are also seen, as are polynomials and elements of an abstract algebraic structure.

Vector equation

One extremely helpful view is that each unknown is a weight for a column vector in a linear combination.
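Written out in the notation of the general form above, this view reads
\[
x_1 \begin{bmatrix} a_{11} \\ a_{21} \\ \vdots \\ a_{m1} \end{bmatrix} +
x_2 \begin{bmatrix} a_{12} \\ a_{22} \\ \vdots \\ a_{m2} \end{bmatrix} +
\cdots +
x_n \begin{bmatrix} a_{1n} \\ a_{2n} \\ \vdots \\ a_{mn} \end{bmatrix}
= \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{bmatrix}.
\]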
This allows all the language and theory of vector spaces to be brought to bear. For example, the collection of all possible linear combinations of the vectors on the left-hand side is called their span, and the equations have a solution just when the right-hand vector is within that span. If every vector within that span has exactly one expression as a linear combination of the given left-hand vectors, then any solution is unique. In any event, the span has a basis of linearly independent vectors that do guarantee exactly one expression; and the number of vectors in that basis cannot be larger than m or n, but it can be smaller. This is important because if the basis contains m independent vectors, a solution is guaranteed regardless of the right-hand side; otherwise it is not guaranteed.

Matrix equation

The vector equation is equivalent to a matrix equation of the form
\[
A\mathbf{x} = \mathbf{b},
\]
where A is an m×n matrix, x is a column vector with n entries, and b is a column vector with m entries.
The number of vectors in a basis for the span of the column vectors of A is now expressed as the rank of the matrix.
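As a small illustration (NumPy is used here only for demonstration), the introductory three-variable system can be put in the form A x = b, solved, and its rank computed:

```python
import numpy as np

# Matrix form A x = b of the introductory example system.
A = np.array([[ 3.0,  2.0, -1.0],
              [ 2.0, -2.0,  4.0],
              [-1.0,  0.5, -1.0]])
b = np.array([1.0, -2.0, 0.0])

print(np.linalg.solve(A, b))      # approximately [ 1. -2. -2.], unique since A is invertible
print(np.linalg.matrix_rank(A))   # 3: the column vectors of A are linearly independent
```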

Solution set

A solution of a linear system is an assignment of values to the variables \(x_1, x_2, \ldots, x_n\) such that each of the equations is satisfied. The set of all possible solutions is called the solution set.
A linear system may behave in any one of three possible ways:
  1. The system has infinitely many solutions.
  2. The system has a unique solution.
  3. The system has no solution.

Geometric interpretation

For a system involving two variables, each linear equation determines a line on the xy-plane. Because a solution to a linear system must satisfy all of the equations, the solution set is the intersection of these lines, and is hence either a line, a single point, or the empty set.
For three variables, each linear equation determines a plane in three-dimensional space, and the solution set is the intersection of these planes. Thus the solution set may be a plane, a line, a single point, or the empty set. For example, as three parallel planes do not have a common point, the solution set of their equations is empty; the solution set of the equations of three planes intersecting at a point is a single point; if three planes pass through two points, their equations have at least two common solutions; in fact, the solution set is infinite and consists of the whole line passing through these points.
For n variables, each linear equation determines a hyperplane in n-dimensional space. The solution set is the intersection of these hyperplanes, and is a flat, which may have any dimension lower than n.

General behavior

In general, the behavior of a linear system is determined by the relationship between the number of equations and the number of unknowns. Here, "in general" means that a different behavior may occur for specific values of the coefficients of the equations.
  • In general, a system with fewer equations than unknowns has infinitely many solutions, but it may have no solution. Such a system is known as an underdetermined system.
  • In general, a system with the same number of equations and unknowns has a single unique solution.
  • In general, a system with more equations than unknowns has no solution. Such a system is also known as an overdetermined system.
In the first case, the dimension of the solution set is, in general, equal to n − m, where n is the number of variables and m is the number of equations.
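As a small illustration of this count (NumPy again used only for demonstration), consider the hypothetical underdetermined system consisting of the single equation x + y + z = 1, so that m = 1 and n = 3:

```python
import numpy as np

# One equation in three unknowns:  x + y + z = 1  (an illustrative example).
A = np.array([[1.0, 1.0, 1.0]])

n = A.shape[1]                    # number of variables (3)
rank = np.linalg.matrix_rank(A)   # here equal to m = 1, since the single row is nonzero
print(n - rank)                   # 2: the solution set is a plane in three-dimensional space
```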
The following pictures illustrate this trichotomy in the case of two variables:
The first system has infinitely many solutions, namely all of the points on the blue line. The second system has a single unique solution, namely the intersection of the two lines. The third system has no solutions, since the three lines share no common point.
It must be kept in mind that the pictures above show only the most common case. It is possible for a system of two equations and two unknowns to have no solution, or for a system of three equations and two unknowns to be solvable.
A system of linear equations behaves differently from the general case if the equations are linearly dependent, or if it is inconsistent and has no more equations than unknowns.

Properties

Independence

The equations of a linear system are independent if none of the equations can be derived algebraically from the others. When the equations are independent, each equation contains new information about the variables, and removing any of the equations increases the size of the solution set. For linear equations, logical independence is the same as linear independence.
For example, the equations
\[
3x + 2y = 6 \quad\text{and}\quad 6x + 4y = 12
\]
are not independent; they are the same equation when scaled by a factor of two, and they would produce identical graphs. This is an example of equivalence in a system of linear equations.
For a more complicated example, the equations
\[
\begin{aligned}
x - 2y &= -1 \\
3x + 5y &= 8 \\
4x + 3y &= 7
\end{aligned}
\]
are not independent, because the third equation is the sum of the other two. Indeed, any one of these equations can be derived from the other two, and any one of the equations can be removed without affecting the solution set. The graphs of these equations are three lines that intersect at a single point.
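Using the three equations above, a short check (again with NumPy, purely as an illustration) confirms both claims: the third equation is the sum of the first two, and all three lines pass through the same point:

```python
import numpy as np

# Augmented rows [a, b, c] for the equations  a*x + b*y = c  above.
rows = np.array([[1.0, -2.0, -1.0],
                 [3.0,  5.0,  8.0],
                 [4.0,  3.0,  7.0]])

# The third equation is the sum of the other two ...
print(np.allclose(rows[0] + rows[1], rows[2]))        # True

# ... and all three equations are satisfied by the common point (1, 1).
point = np.array([1.0, 1.0])
print(np.allclose(rows[:, :2] @ point, rows[:, 2]))   # True
```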

Consistency

A linear system is inconsistent if it has no solution, and otherwise, it is said to be consistent. When the system is inconsistent, it is possible to derive a contradiction from the equations, which may always be rewritten as the statement 0 = 1.
For example, the equations
\[
3x + 2y = 6 \quad\text{and}\quad 3x + 2y = 12
\]
are inconsistent. In fact, by subtracting the first equation from the second one and multiplying both sides of the result by 1/6, we get 0 = 1. The graphs of these equations on the xy-plane are a pair of parallel lines.
It is possible for three linear equations to be inconsistent, even though any two of them are consistent together. For example, the equations
\[
\begin{aligned}
x + y &= 1 \\
2x + y &= 1 \\
3x + 2y &= 3
\end{aligned}
\]
are inconsistent. Adding the first two equations together gives 3x + 2y = 2, which can be subtracted from the third equation to yield 0 = 1. Any two of these equations have a common solution. The same phenomenon can occur for any number of equations.
In general, inconsistencies occur if the left-hand sides of the equations in a system are linearly dependent, and the constant terms do not satisfy the dependence relation. A system of equations whose left-hand sides are linearly independent is always consistent.
Putting it another way, according to the Rouché–Capelli theorem, any system of equations is inconsistent if the rank of the augmented matrix is greater than the rank of the coefficient matrix. If, on the other hand, the ranks of these two matrices are equal, the system must have at least one solution. The solution is unique if and only if the rank equals the number of variables. Otherwise the general solution has k free parameters, where k is the difference between the number of variables and the rank; hence in such a case there is an infinitude of solutions. The rank of a system of equations can never be higher than the number of variables plus one, which means that a system with any number of equations can always be reduced to an equivalent system whose number of independent equations is at most the number of variables plus one.
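A minimal sketch of this criterion in code (the helper name classify_system is hypothetical, and NumPy is assumed only for illustration):

```python
import numpy as np

def classify_system(A, b):
    """Classify the system A x = b by comparing ranks (Rouché–Capelli theorem)."""
    rank_A = np.linalg.matrix_rank(A)
    rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
    n = A.shape[1]                          # number of variables
    if rank_Ab > rank_A:
        return "inconsistent: no solution"
    if rank_A == n:
        return "consistent: unique solution"
    return f"consistent: {n - rank_A} free parameters"

# The pair of parallel lines from the inconsistency example above.
A = np.array([[3.0, 2.0],
              [3.0, 2.0]])
b = np.array([6.0, 12.0])
print(classify_system(A, b))                # inconsistent: no solution
```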