Big O in probability notation
The order in probability notation is used in probability theory and statistical theory in direct parallel to the big O notation that is standard in mathematics. Where the big O notation deals with the convergence of sequences of ordinary numbers, the order in probability notation deals with convergence of sequences of random variables, where convergence is in the sense of convergence in probability.
Definitions
Small ''o'': convergence in probability
For a sequence of random variables $X_n$ and a corresponding sequence of constants $a_n$, the notation

$$X_n = o_p(a_n)$$

means that the sequence of values $X_n/a_n$ converges to zero in probability as $n$ approaches an appropriate limit. Equivalently, $X_n = o_p(a_n)$ can be written as $X_n/a_n = o_p(1)$, i.e.

$$\lim_{n\to\infty} \Pr\!\left(\left|\frac{X_n}{a_n}\right| \ge \varepsilon\right) = 0$$

for every positive $\varepsilon$.
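As an informal illustration (not part of the original article), the definition can be checked by Monte Carlo: taking $X_n$ to be the sample mean of $n$ Uniform(0,1) draws minus its expectation $1/2$, and $a_n = 1$, the exceedance probability $\Pr(|X_n| \ge \varepsilon)$ should shrink toward zero as $n$ grows. The function name and parameters below are illustrative choices, not standard library APIs.

```python
import random

def prob_exceeds(n, eps, trials=2000, seed=0):
    """Monte Carlo estimate of P(|sample mean of n Uniform(0,1) draws - 0.5| >= eps)."""
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        xbar = sum(rng.random() for _ in range(n)) / n
        if abs(xbar - 0.5) >= eps:
            count += 1
    return count / trials

# As n grows the exceedance probability falls toward 0, illustrating that
# X_n = (sample mean - 0.5) = o_p(1): the probability of straying from 0
# by at least eps vanishes in the limit.
for n in (10, 100, 1000):
    print(n, prob_exceeds(n, eps=0.05))
```

For fixed $\varepsilon = 0.05$ the printed probabilities decay with $n$, which is exactly the limit statement in the definition above.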
Big ''O'': stochastic boundedness
The notation

$$X_n = O_p(a_n)$$

means that the sequence of values $X_n/a_n$ is stochastically bounded. That is, for any $\varepsilon > 0$, there exists a finite $M > 0$ and a finite $N > 0$ such that

$$\Pr\!\left(\left|\frac{X_n}{a_n}\right| > M\right) < \varepsilon \quad \text{for all } n > N.$$
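A sketch of stochastic boundedness, under assumptions of my own choosing: by the central limit theorem, $\sqrt{n}\,(\bar{X}_n - 1/2)$ for Uniform(0,1) draws has a limiting normal distribution with standard deviation $1/\sqrt{12} \approx 0.29$, so it is $O_p(1)$: a single fixed bound $M$ keeps the tail probability small for every $n$, even though the sequence does not converge to zero in probability.

```python
import random

def tail_prob(n, M, trials=2000, seed=1):
    """Monte Carlo estimate of P(|sqrt(n) * (sample mean - 0.5)| > M) for
    n Uniform(0,1) draws; sqrt(n)*(mean - 0.5) has sd 1/sqrt(12) ~ 0.29."""
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        xbar = sum(rng.random() for _ in range(n)) / n
        if (n ** 0.5) * abs(xbar - 0.5) > M:
            count += 1
    return count / trials

# With the single fixed bound M = 1 (about 3.5 limiting standard deviations),
# the tail probability stays small for every n: the scaled sequence is
# stochastically bounded, i.e. O_p(1).
for n in (10, 100, 1000):
    print(n, tail_prob(n, M=1.0))
```

Note the contrast with the small-$o$ demonstration: here the probabilities do not shrink to zero with $n$; they merely stay below a tolerance for one fixed $M$.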
Comparison of the two definitions
The difference between the definitions is subtle. If one uses the definition of the limit, one gets:
- Big $O_p(1)$: for every $\varepsilon > 0$ there exist finite $\delta_\varepsilon$ and $N_\varepsilon$ such that $\Pr(|X_n| \ge \delta_\varepsilon) \le \varepsilon$ for all $n > N_\varepsilon$.
- Small $o_p(1)$: for every $\varepsilon > 0$ and every $\delta > 0$ there exists $N_{\varepsilon,\delta}$ such that $\Pr(|X_n| \ge \delta) \le \varepsilon$ for all $n > N_{\varepsilon,\delta}$.
The difference lies in the role of $\delta$: for stochastic boundedness it suffices that some (possibly large) $\delta_\varepsilon$ makes the tail probability small, while for convergence in probability the tail probability must become small for every $\delta$, however small. This shows that if a sequence is $o_p(1)$, then it is $O_p(1)$, i.e. convergence in probability implies stochastic boundedness. But the reverse does not hold.
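A simple counterexample to the reverse direction, added here as an illustration: take $X_n = Z_n$ with the $Z_n$ i.i.d. standard normal. The sequence is $O_p(1)$, since a fixed bound such as $M = 3$ captures almost all the probability mass for every $n$, but it is not $o_p(1)$, since $\Pr(|X_n| \ge 0.5) \approx 0.62$ for every $n$ and never shrinks. The helper below is a hypothetical name for this check.

```python
import random

def exceed_prob(eps, trials=5000, seed=2):
    """Estimate P(|Z| >= eps) for Z ~ N(0,1); for an i.i.d. sequence
    X_n = Z_n this probability is the same for every n."""
    rng = random.Random(seed)
    return sum(abs(rng.gauss(0.0, 1.0)) >= eps for _ in range(trials)) / trials

# O_p(1): P(|X_n| >= 3) is tiny, so M = 3 works as a uniform bound.
# Not o_p(1): P(|X_n| >= 0.5) stays near 0.62 for every n, so the
# sequence does not converge to zero in probability.
print(exceed_prob(3.0))
print(exceed_prob(0.5))
```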
Example
If $(X_n)$ is a stochastic sequence such that each element has finite variance, then

$$X_n - E(X_n) = O_p\!\left(\sqrt{\operatorname{var}(X_n)}\right).$$

If, moreover, $a_n^{-2}\operatorname{var}(X_n)$ is a null sequence for a sequence $(a_n)$ of real numbers, then $(X_n - E(X_n))/a_n$ converges to zero in probability by Chebyshev's inequality, so

$$X_n - E(X_n) = o_p(a_n).$$