Continuity in probability
In probability theory, a stochastic process is said to be continuous in probability or stochastically continuous if its values converge in probability whenever the corresponding points of the index set converge.
Definition
Let $X = (X_t)_{t \in T}$ be a stochastic process taking values in a metric space $(S, d)$, where $T$ is the index set. The process $X$ is continuous in probability at $t \in T$ when $X_s$ converges in probability to $X_t$ whenever $s$ converges to $t$; that is, for every $\varepsilon > 0$,
$$\lim_{s \to t} \mathbb{P}\bigl(d(X_s, X_t) \geq \varepsilon\bigr) = 0.$$
The process is continuous in probability when this holds at every $t \in T$.
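The definition can be illustrated numerically. The sketch below (a hypothetical Monte Carlo check, not part of the original article) uses standard Brownian motion, whose increment $B_s - B_t$ is normally distributed with variance $|s - t|$, and estimates $\mathbb{P}(|B_s - B_t| > \varepsilon)$ for $s$ approaching $t$; the estimates shrink toward zero, as stochastic continuity requires.

```python
import numpy as np

rng = np.random.default_rng(0)

def prob_exceeds(t, s, eps, n=100_000):
    """Monte Carlo estimate of P(|B_s - B_t| > eps) for Brownian motion,
    using that the increment B_s - B_t is N(0, |s - t|)."""
    increments = rng.normal(0.0, np.sqrt(abs(s - t)), size=n)
    return float(np.mean(np.abs(increments) > eps))

t, eps = 1.0, 0.1
for s in [1.5, 1.1, 1.01, 1.001]:
    # The probability of a deviation larger than eps decays as s -> t.
    print(f"s = {s}: P(|B_s - B_t| > {eps}) ~ {prob_exceeds(t, s, eps):.4f}")
```

Note that Brownian motion is continuous in probability for the simple reason that its sample paths are continuous; the converse fails in general, e.g. a Poisson process is stochastically continuous even though its paths jump.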