Big O in probability notation


The order in probability notation is used in probability theory and statistical theory in direct parallel to the big O notation that is standard in mathematics. Where the big O notation deals with the convergence of sequences or sets of ordinary numbers, the order in probability notation deals with convergence of sets of random variables, where convergence is in the sense of convergence in probability.

Definitions

Small o: convergence in probability

For a set of random variables $X_n$ and corresponding set of constants $a_n$, the notation

$$X_n = o_p(a_n)$$

means that the set of values $X_n/a_n$ converges to zero in probability as $n$ approaches an appropriate limit. Equivalently, $X_n = o_p(a_n)$ can be written as $X_n/a_n = o_p(1)$, i.e.

$$\lim_{n \to \infty} P\left(\left|\frac{X_n}{a_n}\right| \geq \varepsilon\right) = 0$$

for every positive $\varepsilon$.
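
To make the definition concrete, here is a minimal Monte Carlo sketch (the construction is an illustrative assumption, not part of the definition): take $X_n$ to be the sum of $n$ i.i.d. $N(0,1)$ variables and $a_n = n$. By the weak law of large numbers, $X_n/a_n \to 0$ in probability, i.e. $X_n = o_p(n)$, so the estimated probability $P(|X_n/a_n| \geq \varepsilon)$ should shrink toward zero as $n$ grows.

```python
# Illustrative sketch (assumed setup, not from the article): X_n is the
# sum of n i.i.d. N(0, 1) draws and a_n = n, so X_n = o_p(n).
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1                                        # the fixed positive epsilon
for n in [10, 100, 1000]:
    # 4000 replications; the sum of n N(0, 1) draws is exactly N(0, n)
    X_n = rng.normal(0.0, np.sqrt(n), size=4000)
    # Monte Carlo estimate of P(|X_n / a_n| >= eps); it decays toward 0
    print(n, np.mean(np.abs(X_n / n) >= eps))
```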

Big O: stochastic boundedness

The notation

$$X_n = O_p(a_n)$$

means that the set of values $X_n/a_n$ is stochastically bounded. That is, for any $\varepsilon > 0$, there exists a finite $M > 0$ and a finite $N > 0$ such that

$$P\left(\left|\frac{X_n}{a_n}\right| > M\right) < \varepsilon \quad \text{for all } n > N.$$
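
Stochastic boundedness requires no moments at all, which the following sketch illustrates (the standard Cauchy choice is an assumption for illustration): an i.i.d. standard Cauchy sequence has no finite mean or variance, yet it is $O_p(1)$, because a single bound $M$ with $P(|X_n| > M) < \varepsilon$ works uniformly in $n$.

```python
# Illustrative sketch (assumed setup): X_n i.i.d. standard Cauchy, which
# has no finite mean or variance but is stochastically bounded, O_p(1).
import numpy as np

rng = np.random.default_rng(0)
eps = 0.05
# Exact Cauchy tail quantile: P(|X| > M) = 1 - (2/pi) * arctan(M) = eps
M = np.tan(np.pi / 2 * (1 - eps))
for n in [1, 10, 100, 1000]:
    X_n = rng.standard_cauchy(100_000)       # same distribution for every n
    # The empirical tail probability stays near eps for one fixed M,
    # uniformly over n, which is exactly the big-O_p condition.
    print(n, round(float(M), 2), np.mean(np.abs(X_n) > M))
```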

Comparison of the two definitions

The difference between the definitions is subtle. If one uses the definition of the limit, one gets:
  • Big $O_p(1)$: for every $\varepsilon > 0$ there exist $N_\varepsilon$ and $\delta_\varepsilon$ such that $P(|X_n| \geq \delta_\varepsilon) \leq \varepsilon$ for all $n > N_\varepsilon$.
  • Small $o_p(1)$: for every $\varepsilon > 0$ and every $\delta > 0$ there exists $N_{\varepsilon,\delta}$ such that $P(|X_n| \geq \delta) \leq \varepsilon$ for all $n > N_{\varepsilon,\delta}$.
The difference lies in the $\delta$: for stochastic boundedness, it suffices that there exists one (arbitrarily large) $\delta_\varepsilon$ satisfying the inequality, and $\delta_\varepsilon$ is allowed to depend on $\varepsilon$. For convergence, on the other hand, the statement has to hold not just for one $\delta$ but for every $\delta > 0$. In a sense, this means that the sequence must be bounded, with a bound that gets smaller as the sample size increases.
This shows that if a sequence is $o_p(1)$, then it is $O_p(1)$, i.e. convergence in probability implies stochastic boundedness. But the reverse does not hold.
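
The implication can be verified directly from the two displayed conditions. The following is a short proof sketch (my wording, using only the definitions above), together with a standard counterexample for the converse:

```latex
% Proof sketch (added for illustration; follows the definitions above).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Suppose $X_n = o_p(1)$ and fix $\varepsilon > 0$. Applying the small-$o$
condition with $\delta = 1$ yields an $N_{\varepsilon,1}$ such that
\[
  P\bigl(|X_n| \geq 1\bigr) \leq \varepsilon
  \qquad \text{for all } n > N_{\varepsilon,1},
\]
so the big-$O$ condition holds with the single choice
$\delta_\varepsilon = 1$; hence $X_n = O_p(1)$. For the converse, let
$X_n = Z$ for all $n$, where $Z \sim N(0,1)$. The sequence is
stochastically bounded, but $P(|X_n| \geq \delta)$ is a fixed positive
constant for every $\delta > 0$, so $X_n \neq o_p(1)$.
\end{document}
```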

Example

If $(X_n)$ is a stochastic sequence such that each element has finite variance, then

$$X_n - E(X_n) = O_p\left(\sqrt{\operatorname{var}(X_n)}\right).$$

If, moreover, $a_n^{-2}\operatorname{var}(X_n) = \operatorname{var}(a_n^{-1}X_n)$ is a null sequence for a sequence $(a_n)$ of real numbers, then $a_n^{-1}(X_n - E(X_n))$ converges to zero in probability by Chebyshev's inequality, so

$$X_n - E(X_n) = o_p(a_n).$$
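
A numerical sketch of this example (the distributional choice is an assumption for illustration): take $X_n \sim \operatorname{Gamma}(n, 1)$, i.e. a sum of $n$ i.i.d. $\operatorname{Exp}(1)$ variables, so $E(X_n) = \operatorname{var}(X_n) = n$. With $a_n = n$, the quantity $a_n^{-2}\operatorname{var}(X_n) = 1/n$ is a null sequence, and Chebyshev's inequality bounds $P(|a_n^{-1}(X_n - E(X_n))| \geq \varepsilon)$ by $1/(n\varepsilon^2)$.

```python
# Numerical check (assumed setup): X_n ~ Gamma(n, 1), i.e. a sum of n
# i.i.d. Exp(1) draws, so E(X_n) = var(X_n) = n; the scaling is a_n = n.
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1
for n in [10, 100, 1000, 10000]:
    X_n = rng.gamma(n, 1.0, size=5000)       # 5000 replications of X_n
    ratio = (X_n - n) / n                    # a_n^{-1} (X_n - E(X_n))
    bound = min(1.0, 1.0 / (n * eps**2))     # Chebyshev: var(X_n)/(a_n eps)^2
    # Empirical P(|ratio| >= eps) vs. the Chebyshev bound; both -> 0
    print(n, np.mean(np.abs(ratio) >= eps), bound)
```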