Wold's theorem
In statistics, Wold's decomposition or the Wold representation theorem, named after Herman Wold, says that every covariance-stationary time series can be written as the sum of two time series, one deterministic and one stochastic.
Formally,

    X_t = Σ_{j=0}^∞ b_j ε_{t−j} + η_t

where:
- X_t is the time series being considered,
- ε_t is an uncorrelated sequence (white noise) which is the innovation process of X_t,
- b_j are the moving average weights, and
- η_t is a deterministic time series, in the sense that it is completely determined as a linear combination of its past values.
The moving average coefficients b_j have these properties:
- Stable, that is, square summable: Σ_{j=0}^∞ |b_j|² < ∞
- Causal (there are no terms with j < 0, so X_t depends only on current and past innovations)
- Minimum phase
- Constant (the b_j do not depend on t)
- It is conventional to define b_0 = 1.
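These properties can be checked numerically for a concrete case. The sketch below is a minimal illustration, not part of the source: it uses the standard fact that a stationary AR(1) process X_t = φ·X_{t−1} + ε_t with |φ| < 1 has Wold coefficients b_j = φ^j (the value φ = 0.6 is an arbitrary choice).

```python
# Illustrative check of the Wold-coefficient properties for an AR(1) process
# X_t = phi * X_{t-1} + eps_t with |phi| < 1, whose Wold (MA-infinity)
# coefficients are b_j = phi**j. The value phi = 0.6 is an arbitrary example.
phi = 0.6
b = [phi**j for j in range(200)]  # truncated sequence of Wold coefficients

assert b[0] == 1.0  # the conventional normalization b_0 = 1

# Square summability: the partial sums of b_j^2 converge to 1 / (1 - phi^2).
energy = sum(c * c for c in b)
assert abs(energy - 1.0 / (1.0 - phi**2)) < 1e-12

# Causality and constancy hold by construction: only non-negative lags j
# appear, and the coefficients do not depend on t.
```

Here the tail of b_j decays geometrically, so truncating the infinite sum at 200 terms changes nothing at machine precision.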
The usefulness of the Wold theorem is that it allows the dynamic evolution of a variable to be approximated by a linear model. If the innovations ε_t are independent, then the linear model is the only possible representation relating the observed value of X_t to its past evolution. However, when ε_t is merely an uncorrelated but not independent sequence, then the linear model exists but it is not the only representation of the dynamic dependence of the series. In this latter case, the linear model may not be very useful, and there would be a nonlinear model relating the observed value of X_t to its past evolution. In practical time series analysis, however, only linear predictors are often considered, partly on the grounds of simplicity, in which case the Wold decomposition is directly relevant.
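The distinction between uncorrelated and independent innovations can be made concrete with a short simulation. The sketch below is an illustration with arbitrary ARCH(1)-style parameter values, not taken from the source: the sequence e_t is serially uncorrelated (so it qualifies as a Wold innovation), yet its squares are strongly autocorrelated, which is exactly the kind of structure a nonlinear model could exploit and a linear one cannot.

```python
import random

random.seed(0)

# Build an ARCH(1)-style sequence: uncorrelated, but not independent,
# because the conditional variance depends on the previous value.
# Parameter values (0.5, 0.4) are illustrative, not from the source.
n = 50_000
z = [random.gauss(0.0, 1.0) for _ in range(n)]
e = [0.0] * n
for t in range(1, n):
    sigma2 = 0.5 + 0.4 * e[t - 1] ** 2  # volatility clustering
    e[t] = (sigma2 ** 0.5) * z[t]

def lag1_corr(x):
    """Sample lag-1 autocorrelation of a sequence."""
    m = sum(x) / len(x)
    num = sum((x[t] - m) * (x[t - 1] - m) for t in range(1, len(x)))
    den = sum((v - m) ** 2 for v in x)
    return num / den

r_e = lag1_corr(e)                      # near zero: white noise to 2nd order
r_e2 = lag1_corr([v * v for v in e])    # clearly positive: dependence remains
```

A linear predictor sees only r_e and treats e_t as unforecastable, while the dependence visible in r_e2 could still be modeled nonlinearly.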
The Wold representation depends on an infinite number of parameters, although in practice they usually decay rapidly. The autoregressive model is an alternative that may have only a few coefficients where the corresponding moving average has many. These two models can be combined into an autoregressive-moving average (ARMA) model, or an autoregressive integrated moving average (ARIMA) model if non-stationarity is involved. An extension of the Wold theorem has been given that allows more generality for the moving average, accompanied by a sharper characterization of the innovation; this extension allows the possibility of models that are more faithful to physical or astrophysical processes, and in particular can sense "the arrow of time."
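As a concrete illustration of how a few ARMA parameters encode an infinite Wold expansion, the sketch below (with hypothetical parameter values, not from the source) expands an ARMA(1,1) model X_t = φ·X_{t−1} + ε_t + θ·ε_{t−1} into its MA(∞) form using the recursion obtained by matching coefficients of ε_{t−j}: b_0 = 1, b_1 = φ + θ, and b_j = φ·b_{j−1} for j ≥ 2.

```python
def arma11_wold(phi, theta, n):
    """Return the first n Wold (MA-infinity) coefficients of an ARMA(1,1)
    model X_t = phi*X_{t-1} + eps_t + theta*eps_{t-1}.

    Matching coefficients of eps_{t-j} on both sides gives:
      b_0 = 1, b_1 = phi + theta, b_j = phi * b_{j-1} for j >= 2.
    """
    b = [1.0, phi + theta]
    while len(b) < n:
        b.append(phi * b[-1])
    return b[:n]

# Two ARMA parameters generate a whole geometrically decaying tail of b_j.
# phi = 0.7, theta = 0.3 are arbitrary example values.
b = arma11_wold(phi=0.7, theta=0.3, n=20)
```

This is the sense in which a parsimonious ARMA model can stand in for the full, infinite-parameter Wold representation.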