Constrained least squares
In constrained least squares one solves a linear least squares problem with an additional constraint on the solution.
This means that the unconstrained equation $\mathbf{X}\boldsymbol{\beta} = \mathbf{y}$ must be fit as closely as possible (in the least squares sense) while ensuring that some other property of $\boldsymbol{\beta}$ is maintained.
There are often special-purpose algorithms for solving such problems efficiently. Some examples of constraints are given below:
- Equality constrained least squares: the elements of $\boldsymbol{\beta}$ must exactly satisfy $\mathbf{L}\boldsymbol{\beta} = \mathbf{d}$.
- Stochastic constrained least squares: the elements of $\boldsymbol{\beta}$ must satisfy $\mathbf{L}\boldsymbol{\beta} = \mathbf{d} + \boldsymbol{\nu}$, where $\boldsymbol{\nu}$ is a vector of random variables such that $\operatorname{E}(\boldsymbol{\nu}) = \mathbf{0}$ and $\operatorname{E}(\boldsymbol{\nu}\boldsymbol{\nu}^{\rm T}) = \tau^2\mathbf{I}$. This effectively imposes a prior distribution for $\boldsymbol{\beta}$ and is therefore equivalent to Bayesian linear regression.
- Regularized least squares: the elements of $\boldsymbol{\beta}$ must satisfy a norm bound such as $\|\mathbf{L}\boldsymbol{\beta}\| \le \alpha$ (as in Tikhonov regularization).
- Non-negative least squares: the vector $\boldsymbol{\beta}$ must satisfy the vector inequality $\boldsymbol{\beta} \geq \mathbf{0}$ defined componentwise—that is, each component must be either positive or zero.
- Box-constrained least squares: the vector $\boldsymbol{\beta}$ must satisfy the vector inequalities $\mathbf{b}_\ell \leq \boldsymbol{\beta} \leq \mathbf{b}_u$, each of which is defined componentwise.
- Integer-constrained least squares: all elements of $\boldsymbol{\beta}$ must be integers.
- Phase-constrained least squares: all elements of $\boldsymbol{\beta}$ must be real numbers multiplied by the same complex number of unit modulus.
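As a concrete illustration of the equality-constrained case, the problem of minimizing $\|\mathbf{X}\boldsymbol{\beta} - \mathbf{y}\|^2$ subject to $\mathbf{L}\boldsymbol{\beta} = \mathbf{d}$ can be solved via its KKT linear system. The following sketch uses NumPy; the data, the constraint matrix, and the dimensions are made up for demonstration and are not from the text above.

```python
import numpy as np

# Illustrative sketch (data is made up): solve the equality-constrained
# problem   min ||X b - y||^2   subject to   L b = d
# via the KKT linear system
#   [2 X^T X   L^T] [b     ]   [2 X^T y]
#   [  L        0 ] [lambda] = [   d   ]
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
y = rng.standard_normal(20)
L = np.array([[1.0, 1.0, 1.0]])   # example constraint: coefficients sum to d
d = np.array([1.0])

p = X.shape[1]
m = L.shape[0]
KKT = np.block([[2 * X.T @ X, L.T],
                [L, np.zeros((m, m))]])
rhs = np.concatenate([2 * X.T @ y, d])
b = np.linalg.solve(KKT, rhs)[:p]   # drop the Lagrange multipliers

assert np.allclose(L @ b, d)        # the constraint holds exactly
```

The KKT system is small and dense here; for large problems, special-purpose algorithms such as those mentioned above avoid forming $\mathbf{X}^{\rm T}\mathbf{X}$ explicitly.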
If the constraint only applies to some of the variables, the mixed problem may be solved by partitioning as $\mathbf{X} = [\mathbf{X}_1 \; \mathbf{X}_2]$ and $\boldsymbol{\beta}^{\rm T} = [\boldsymbol{\beta}_1^{\rm T} \; \boldsymbol{\beta}_2^{\rm T}]$, where the subscripts 1 and 2 denote the unconstrained and constrained components. Substituting the least squares solution for the unconstrained part, $\hat{\boldsymbol{\beta}}_1 = \mathbf{X}_1^{+}(\mathbf{y} - \mathbf{X}_2\boldsymbol{\beta}_2)$, back into the original expression gives an equation that can be solved as a purely constrained problem in $\boldsymbol{\beta}_2$:
$$(\mathbf{I} - \mathbf{P})\mathbf{X}_2\boldsymbol{\beta}_2 = (\mathbf{I} - \mathbf{P})\mathbf{y},$$
where $\mathbf{P} = \mathbf{X}_1\mathbf{X}_1^{+}$ is a projection matrix. Following the constrained estimation of $\hat{\boldsymbol{\beta}}_2$, the vector $\hat{\boldsymbol{\beta}}_1$ is obtained from the expression above.
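The separable approach can be sketched in a few lines of NumPy/SciPy. Here the constrained component is taken to be non-negative (so the reduced problem is a non-negative least squares solve via `scipy.optimize.nnls`); the data and dimensions are invented for illustration.

```python
import numpy as np
from scipy.optimize import nnls  # assumes SciPy is available

# Made-up separable problem: beta1 is free, beta2 must be non-negative.
rng = np.random.default_rng(1)
X1 = rng.standard_normal((30, 2))   # columns with unconstrained coefficients
X2 = rng.standard_normal((30, 3))   # columns with non-negative coefficients
y = rng.standard_normal(30)

# P = X1 X1^+ projects onto the column space of X1
P = X1 @ np.linalg.pinv(X1)
I = np.eye(len(y))

# Reduced, purely constrained problem in beta2:
#   min || (I - P) X2 b2 - (I - P) y ||   subject to   b2 >= 0
b2, _ = nnls((I - P) @ X2, (I - P) @ y)

# Recover beta1 = X1^+ (y - X2 b2)
b1 = np.linalg.pinv(X1) @ (y - X2 @ b2)

assert b2.min() >= 0                                    # constraint satisfied
assert np.allclose(X1.T @ (X1 @ b1 + X2 @ b2 - y), 0)   # b1 optimal given b2
```

The final assertion checks the normal equations for the unconstrained block: given the constrained estimate $\hat{\boldsymbol{\beta}}_2$, the recovered $\hat{\boldsymbol{\beta}}_1$ makes the residual orthogonal to the columns of $\mathbf{X}_1$.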