Lévy's convergence theorem
In probability theory, Lévy's convergence theorem (sometimes also called Lévy's dominated convergence theorem) states that for a sequence of random variables $(X_n)_{n \ge 1}$ where
- $X_n \to X$ almost surely, and
- $|X_n| < Y$, where $Y$ is some random variable with $\mathbb{E}[Y] < \infty$,

it follows that $X_n \to X$ in $L^1$, that is,

\[
\mathbb{E}\bigl[\,|X_n - X|\,\bigr] \to 0 \quad \text{as } n \to \infty ,
\]

and in particular $\mathbb{E}[X_n] \to \mathbb{E}[X]$.
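A standard example illustrates why some integrability condition on the sequence is needed. On the unit interval $(0,1)$ with Lebesgue measure, consider

\[
X_n = n\,\mathbf{1}_{(0,1/n)} .
\]

Then $X_n \to 0$ almost surely, yet $\mathbb{E}[X_n] = 1$ for every $n$, so there is no $L^1$-convergence. No integrable dominating variable exists here, since any $Y$ with $Y \ge X_n$ for all $n$ satisfies $Y(\omega) \ge 1/\omega - 1$ for $\omega \in (0,1)$, and hence $\mathbb{E}[Y] = \infty$.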
Essentially, it is a sufficient condition for almost sure convergence to imply $L^1$-convergence. The domination condition $|X_n| < Y$ can be relaxed: it suffices that the sequence $(X_n)$ be uniformly integrable.
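In this setting, uniform integrability of $(X_n)$ means

\[
\lim_{K \to \infty} \sup_{n} \mathbb{E}\bigl[\,|X_n|\,\mathbf{1}_{\{|X_n| > K\}}\,\bigr] = 0 ,
\]

which is implied by the domination condition, because $|X_n|\,\mathbf{1}_{\{|X_n| > K\}} \le Y\,\mathbf{1}_{\{Y > K\}}$ and $\mathbb{E}\bigl[\,Y\,\mathbf{1}_{\{Y > K\}}\,\bigr] \to 0$ as $K \to \infty$ for integrable $Y$.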
The theorem is simply a special case of Lebesgue's dominated convergence theorem in measure theory.
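To see this, note that $X_n \to X$ almost surely and $|X_n| < Y$ imply $|X| \le Y$ almost surely, so

\[
|X_n - X| \le |X_n| + |X| \le 2Y \quad \text{almost surely.}
\]

Applying the dominated convergence theorem to $|X_n - X| \to 0$, with the integrable dominating function $2Y$, yields $\mathbb{E}\bigl[\,|X_n - X|\,\bigr] \to 0$; the convergence of the expectations then follows from $\bigl|\mathbb{E}[X_n] - \mathbb{E}[X]\bigr| \le \mathbb{E}\bigl[\,|X_n - X|\,\bigr]$.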
References
- A. N. Shiryaev (1995). Probability, 2nd Edition. Springer-Verlag, New York. pp. 187–188. ISBN 978-0387945491.

