Least-squares adjustment
Least-squares adjustment is a model for the solution of an overdetermined system of equations based on the principle of least squares of observation residuals. It is used extensively in the disciplines of surveying, geodesy, and photogrammetry, collectively known as the field of geomatics.
Formulation
There are three forms of least squares adjustment: parametric, conditional, and combined.
- In parametric adjustment, one can find an observation equation relating the observations explicitly to the parameters: $\hat{y} = f(\hat{x})$.
- In conditional adjustment, there exists a condition equation involving only the observations, with no parameters at all: $g(\hat{y}) = 0$.
- Finally, in a combined adjustment, both parameters and observations are involved implicitly in a mixed-model equation: $f(\hat{x}, \hat{y}) = 0$.
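As a concrete illustration of the three forms (a hypothetical leveling loop over three benchmarks with unknown heights $h_1, h_2, h_3$ and measured height differences $\Delta h_{ij}$; the example is an assumption, not from the original text):

```latex
% Parametric form: each observation written explicitly
% in terms of the parameters (the unknown heights)
\hat{\Delta h}_{12} = \hat{h}_2 - \hat{h}_1, \qquad
\hat{\Delta h}_{23} = \hat{h}_3 - \hat{h}_2, \qquad
\hat{\Delta h}_{31} = \hat{h}_1 - \hat{h}_3

% Conditional form: the loop-closure condition involves
% only the observations, with no parameters
\hat{\Delta h}_{12} + \hat{\Delta h}_{23} + \hat{\Delta h}_{31} = 0

% Combined form: e.g. fitting a straight line through points whose
% abscissas and ordinates are both measured; the slope a and
% intercept b are parameters, the coordinates are observations
a\,\hat{x}_i + b - \hat{y}_i = 0
```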
Solution
The equalities above hold only for the estimated parameters and observations, thus $f(\hat{x}, \hat{y}) = 0$. In contrast, measured observations $y$ and approximate parameters $x$ produce a nonzero misclosure:
$$ w = f(x, y) . $$
One can proceed to a Taylor series expansion of the equations, which results in the Jacobians or design matrices: the first one,
$$ A = \frac{\partial f(\hat{x}, \hat{y})}{\partial \hat{x}} \bigg|_{x,\, y} , $$
and the second one,
$$ B = \frac{\partial f(\hat{x}, \hat{y})}{\partial \hat{y}} \bigg|_{x,\, y} . $$
The linearized model then reads:
$$ A \hat{\delta x} + B \hat{r} + w = 0 , $$
where $\hat{\delta x} = \hat{x} - x$ are the estimated parameter corrections to the a priori values $x$, and $\hat{r} = \hat{y} - y$ are the post-fit observation residuals.
In the parametric adjustment, the second design matrix is a negative identity, $B = -I$, and the misclosure vector $w = f(x) - y$ can be interpreted as the pre-fit residuals, so the system simplifies to:
$$ A \hat{\delta x} + w = \hat{r} , $$
which is in the form of ordinary least squares.
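A minimal numerical sketch of the parametric case (the tiny leveling-style network and its numbers below are illustrative assumptions, not from the original text): the ordinary least-squares system can be solved either through the normal equations with a Cholesky factorization or through a QR factorization of the design matrix, and the two routes agree.

```python
import numpy as np

# Parametric adjustment of a tiny leveling network: three height
# differences observed between two unknown heights and a fixed datum.
# Design matrix A maps parameter corrections to observations (B = -I).
A = np.array([[ 1.0,  0.0],
              [-1.0,  1.0],
              [ 0.0, -1.0]])
# Misclosure vector w = f(x) - y (pre-fit residuals at the
# approximate parameter values x).
w = np.array([0.02, -0.01, 0.03])

# Ordinary least squares: minimize ||A dx + w||^2.
# Route 1: normal equations N dx = -A^T w, solved via Cholesky.
N = A.T @ A
L = np.linalg.cholesky(N)                 # N = L L^T
dx_chol = np.linalg.solve(L.T, np.linalg.solve(L, -A.T @ w))

# Route 2: QR factorization applied directly to the design matrix.
Q, R = np.linalg.qr(A)                    # A = Q R
dx_qr = np.linalg.solve(R, -Q.T @ w)

# Both routes give the same estimated parameter corrections.
assert np.allclose(dx_chol, dx_qr)

# Post-fit residuals r = A dx + w.
r = A @ dx_chol + w
print(dx_chol, r)
```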
In the conditional adjustment, the first design matrix is null, $A = 0$, and the model reduces to $B \hat{r} + w = 0$.
For the more general cases, Lagrange multipliers are introduced to relate the two Jacobian matrices and to transform the constrained least-squares problem into an unconstrained one. In any case, their manipulation leads to the estimated $\hat{x}$ and $\hat{y}$ vectors as well as the respective a posteriori covariance matrices of the parameters and observations.
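A minimal sketch of the combined case with unit weights (the matrices below are illustrative assumptions, not from the original text): stationarity of the Lagrangian for minimizing the squared residuals subject to the linearized model gives the residuals in terms of the multipliers, which reduces the problem to solvable normal equations.

```python
import numpy as np

# Combined adjustment sketch with unit weights: minimize r^T r
# subject to the linearized model A dx + B r + w = 0, using
# Lagrange multipliers k.
A = np.array([[1.0,  2.0],
              [3.0,  1.0],
              [1.0, -1.0]])              # Jacobian w.r.t. parameters
B = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0, 0.0]])     # Jacobian w.r.t. observations
w = np.array([0.05, -0.02, 0.01])        # misclosure vector

# Stationarity gives r = -B^T k and A^T k = 0; substituting into
# the constraint yields reduced normal equations for dx.
M = B @ B.T                              # "equivalent" weight of the misclosure
Mi = np.linalg.inv(M)
dx = -np.linalg.solve(A.T @ Mi @ A, A.T @ Mi @ w)   # parameter corrections
k = Mi @ (A @ dx + w)                    # Lagrange multipliers (correlates)
r = -B.T @ k                             # post-fit observation residuals

# The adjusted corrections and residuals satisfy the linearized
# model exactly.
assert np.allclose(A @ dx + B @ r + w, 0)
print(dx, r)
```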
Computation
Given the matrices and vectors above, the solution is found via standard least-squares methods, e.g., forming the normal matrix and applying Cholesky decomposition, applying QR factorization directly to the Jacobian matrix, or using iterative methods for very large systems.
Applications
- Leveling, traverse, and control networks
- Bundle adjustment
- Triangulation, trilateration, and triangulateration
- GPS/GNSS positioning
- Helmert transformation
Related concepts
- Parametric adjustment is similar to most forms of regression analysis and coincides with the Gauss–Markov model
- Combined adjustment, also known as the Gauss–Helmert model, is related to the errors-in-variables models and total least squares.
- The use of an a priori parameter covariance matrix is akin to Tikhonov regularization
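A minimal sketch of that last analogy (the near-singular design matrix and the choice of a priori covariance $C_x^{-1} = \alpha I$ below are illustrative assumptions): adding the inverse a priori parameter covariance to the normal matrix is algebraically the same as ridge (Tikhonov) regularization.

```python
import numpy as np

# An a priori parameter covariance Cx stabilizes the normal
# equations exactly like Tikhonov (ridge) regularization.
A = np.array([[1.0, 1.0],
              [1.0, 1.001]])       # nearly rank-deficient design matrix
w = np.array([1.0, 1.0])           # pre-fit residual (misclosure) vector

N = A.T @ A                        # plain normal matrix: ill-conditioned

# With Cx^{-1} = alpha * I, the augmented normal equations become
# (N + alpha I) dx = -A^T w, i.e. ridge regression with parameter alpha.
alpha = 1e-3
dx_reg = np.linalg.solve(N + alpha * np.eye(2), -A.T @ w)
print(dx_reg)
```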
Extensions