McDiarmid's inequality
McDiarmid's inequality, named after Colin McDiarmid, is a result in probability theory that gives an upper bound on the probability that a function of multiple independent random variables deviates from its expected value. It is a generalization of Hoeffding's inequality.
Let \(X_1, X_2, \dots, X_n\) be independent random variables taking values in a set \(A\), and assume that \(f \colon A^n \to \mathbb{R}\) is a function satisfying

\[
\sup_{x_1, \dots, x_n, \hat{x}_i \in A} \left| f(x_1, \dots, x_n) - f(x_1, \dots, x_{i-1}, \hat{x}_i, x_{i+1}, \dots, x_n) \right| \le c_i \qquad \text{for } 1 \le i \le n.
\]

(In other words, replacing the i-th coordinate \(x_i\) by some other value changes the value of \(f\) by at most \(c_i\).) Then for any \(\varepsilon > 0\),
\[
\Pr \left\{ f(X_1, X_2, \dots, X_n) - E[f(X_1, X_2, \dots, X_n)] \ge \varepsilon \right\}
\le
\exp \left( - \frac{2 \varepsilon^2}{\sum_{i=1}^n c_i^2} \right) .
\]
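As a concrete illustration, take \(f\) to be the sample mean of \(n\) independent Bernoulli variables: changing any single coordinate moves \(f\) by at most \(c_i = 1/n\), so \(\sum_i c_i^2 = 1/n\) and the bound becomes \(\exp(-2 n \varepsilon^2)\). The following is a minimal simulation sketch of this case (the values of n, p, eps, and trials are arbitrary choices for illustration, not part of the theorem):

```python
import numpy as np

# Empirical check of McDiarmid's bound for the sample mean of
# n independent Bernoulli(p) variables. Changing one coordinate
# moves the mean by at most c_i = 1/n, so sum(c_i^2) = 1/n and the
# bound is exp(-2 * n * eps**2).
rng = np.random.default_rng(0)
n, p, eps, trials = 100, 0.5, 0.1, 100_000  # illustrative parameters

# Each row is one realization of (X_1, ..., X_n); f is the row mean.
samples = rng.binomial(1, p, size=(trials, n)).mean(axis=1)

empirical = np.mean(samples - p >= eps)  # estimate of Pr{f - E[f] >= eps}
bound = np.exp(-2 * n * eps**2)          # McDiarmid's upper bound

print(f"empirical tail probability: {empirical:.5f}")
print(f"McDiarmid upper bound:      {bound:.5f}")
```

With these parameters the empirical tail probability comes out well below the bound \(\exp(-2) \approx 0.135\), as the inequality guarantees.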
Hoeffding's inequality is obtained as a special case by applying McDiarmid's inequality to the function

\[
f(x_1, x_2, \dots, x_n) = \sum_{i=1}^n x_i .
\]
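Concretely (a standard derivation, sketched here for completeness): if each \(X_i\) takes values in an interval \([a_i, b_i]\), then for this choice of \(f\) the bounded-differences constants are \(c_i = b_i - a_i\), and McDiarmid's inequality yields

\[
\Pr \left\{ \sum_{i=1}^n X_i - E\left[ \sum_{i=1}^n X_i \right] \ge \varepsilon \right\}
\le
\exp \left( - \frac{2 \varepsilon^2}{\sum_{i=1}^n (b_i - a_i)^2} \right),
\]

which is exactly Hoeffding's inequality.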

