Khintchine inequality
In mathematics, the Khintchine inequality, named after Aleksandr Khinchin and spelled in multiple ways in the Roman alphabet, is a theorem from probability, and is also frequently used in analysis. Heuristically, it says that if we pick \(N\) complex numbers \(x_1, \dots, x_N\) and add them together, each multiplied by a random sign \(\pm 1\), then the expected magnitude of the sum is comparable to \(\sqrt{|x_1|^2 + \cdots + |x_N|^2}\).
Statement of theorem
Let \(\{\varepsilon_n\}_{n=1}^N\) be i.i.d. random variables with \(P(\varepsilon_n = \pm 1) = \tfrac{1}{2}\) for every \(n\), i.e., a Rademacher sequence. Let \(0 < p < \infty\) and let \(x_1, \dots, x_N \in \mathbb{C}\). Then

\[ A_p \Big( \sum_{n=1}^N |x_n|^2 \Big)^{1/2} \le \Big( \mathbb{E} \Big| \sum_{n=1}^N \varepsilon_n x_n \Big|^p \Big)^{1/p} \le B_p \Big( \sum_{n=1}^N |x_n|^2 \Big)^{1/2} \]

for some constants \(A_p, B_p > 0\) depending only on \(p\) (see Expected value for notation).
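The inequality is easy to check numerically. The following sketch (an illustration, not part of the article; the coefficients and the choice p = 1 are arbitrary) estimates the p-th moment of a random-sign sum by Monte Carlo and compares it with the l^2 norm of the coefficients; Khintchine says the ratio stays between A_p and B_p.

```python
import math
import random

random.seed(0)

# Arbitrary real coefficients x_1, ..., x_N (an illustrative choice).
x = [1.0, 2.0, 0.5, 3.0]
l2 = math.sqrt(sum(t * t for t in x))  # (sum |x_n|^2)^{1/2}

p = 1.0
trials = 200_000
acc = 0.0
for _ in range(trials):
    # Rademacher signs: each eps_n is +1 or -1 with probability 1/2.
    s = sum(random.choice((-1.0, 1.0)) * t for t in x)
    acc += abs(s) ** p
moment = (acc / trials) ** (1.0 / p)  # (E |sum eps_n x_n|^p)^{1/p}

# Khintchine: A_p * l2 <= moment <= B_p * l2, so moment / l2 is bounded
# above and below by constants independent of the coefficients.
print(f"l2 norm     = {l2:.4f}")
print(f"p-th moment = {moment:.4f}")
print(f"ratio       = {moment / l2:.4f}")
```

For p = 1 the ratio is at most 1 (by Jensen's inequality, since the second moment of the sum is exactly the squared l^2 norm) and at least 1/sqrt(2), the sharp constant A_1.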
Uses in analysis
The uses of this inequality are not limited to applications in probability theory. One example of its use in analysis is the following: if we let \(T\) be a linear operator between two \(L^p\) spaces \(L^p(X,\mu)\) and \(L^p(Y,\nu)\), \(1 \le p < \infty\), with bounded norm \(\|T\| < \infty\), then one can use Khinchine's inequality to show that

\[ \Big\| \Big( \sum_{n=1}^N |Tf_n|^2 \Big)^{1/2} \Big\|_{L^p(Y,\nu)} \le C_p \Big\| \Big( \sum_{n=1}^N |f_n|^2 \Big)^{1/2} \Big\|_{L^p(X,\mu)} \]

for some constant \(C_p > 0\) depending only on \(p\) and \(\|T\|\).
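A sketch of the standard argument (added here for illustration): apply the lower Khintchine bound pointwise in \(y\) with coefficients \(Tf_n(y)\), swap the expectation and the integral by Fubini, use the boundedness and linearity of \(T\), and finish with the upper Khintchine bound pointwise in \(x\):

```latex
\begin{aligned}
\Big\| \Big( \sum_{n} |Tf_n|^2 \Big)^{1/2} \Big\|_{L^p(Y,\nu)}^p
  &= \int_Y \Big( \sum_n |Tf_n|^2 \Big)^{p/2} \, d\nu
   \;\le\; A_p^{-p} \int_Y \mathbb{E} \Big| \sum_n \varepsilon_n \, Tf_n \Big|^p \, d\nu
     && \text{(Khintchine, lower bound)} \\
  &= A_p^{-p} \, \mathbb{E} \, \Big\| T \Big( \sum_n \varepsilon_n f_n \Big) \Big\|_{L^p(Y,\nu)}^p
     && \text{(Fubini, linearity of } T \text{)} \\
  &\le A_p^{-p} \, \|T\|^p \, \mathbb{E} \, \Big\| \sum_n \varepsilon_n f_n \Big\|_{L^p(X,\mu)}^p
     && \text{(boundedness of } T \text{)} \\
  &\le A_p^{-p} B_p^p \, \|T\|^p \, \Big\| \Big( \sum_n |f_n|^2 \Big)^{1/2} \Big\|_{L^p(X,\mu)}^p .
     && \text{(Khintchine, upper bound)}
\end{aligned}
```

Taking \(p\)-th roots gives the stated inequality with \(C_p = (B_p / A_p)\,\|T\|\).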
References
- Thomas H. Wolff, "Lectures on Harmonic Analysis". American Mathematical Society, University Lecture Series vol. 29, 2003. ISBN 0-8218-3449-5



