User:Hippasus/Sandbox

In probability theory, the probability distribution of the sum of two independent random variables is the convolution of their individual distributions. Many distributions have well-known convolutions. The following is a list of these convolutions. Each statement is of the form

\sum_{i=1}^n X_i \sim Y

where X_1, X_2, \ldots, X_n are independent random variables; they are identically distributed in the statements where the parameters do not depend on i.
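
For example, if X and Y are independent discrete random variables taking values in the non-negative integers, the convolution is

P(X+Y=k)=\sum_{j=0}^{k} P(X=j)P(Y=k-j),

so for two independent Bernoulli(p) variables the sum takes the value 1 with probability 2p(1-p), which is exactly the Binomial(2,p) probability.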

Discrete Distributions

  • \sum_{i=1}^n \mathrm{Bernoulli}(p) \sim \mathrm{Binomial}(n,p)
  • \sum_{i=1}^n \mathrm{Binomial}(n_i,p) \sim \mathrm{Binomial}(\sum_{i=1}^n n_i,p)
  • \sum_{i=1}^n \mathrm{NegativeBinomial}(n_i,p) \sim \mathrm{NegativeBinomial}(\sum_{i=1}^n n_i,p)
  • \sum_{i=1}^n \mathrm{Geometric}(p) \sim \mathrm{NegativeBinomial}(n,p)
  • \sum_{i=1}^n \mathrm{Poisson}(\lambda_i) \sim \mathrm{Poisson}(\sum_{i=1}^n \lambda_i)
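
As a quick numerical check of the Poisson identity in the list above, one can simulate sums of independent Poisson variables and compare the empirical frequencies with the Poisson(\sum_{i=1}^n \lambda_i) probability mass function. The following is a minimal sketch (not part of the standard presentation); it assumes NumPy and SciPy are available, and the rates and sample size are arbitrary illustrative choices.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    rates = [0.5, 1.2, 2.3]        # illustrative values of lambda_i
    n_samples = 100_000

    # Draw each X_i ~ Poisson(lambda_i) independently and add them up.
    total = sum(rng.poisson(lam, size=n_samples) for lam in rates)

    # Compare the empirical distribution of the sum with Poisson(sum of rates).
    combined_rate = sum(rates)
    for k in range(8):
        empirical = np.mean(total == k)
        theoretical = stats.poisson.pmf(k, combined_rate)
        print(f"P(S={k}): empirical {empirical:.4f}, theoretical {theoretical:.4f}")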

Continuous Distributions

  • \sum_{i=1}^n \mathrm{Normal}(\mu_i,\sigma_i^2) \sim \mathrm{Normal}(\sum_{i=1}^n \mu_i, \sum_{i=1}^n \sigma_i^2)
  • \sum_{i=1}^n \mathrm{Gamma}(\alpha_i,\beta) \sim \mathrm{Gamma}(\sum_{i=1}^n \alpha_i,\beta)
  • \sum_{i=1}^n \mathrm{Exponential}(\theta) \sim \mathrm{Gamma}(n,\theta)
  • \sum_{i=1}^n \chi^2(r_i) \sim \chi^2(\sum_{i=1}^n r_i)
  • \sum_{i=1}^r X_i^2 \sim \chi^2_r \qquad \mathrm{where} \quad X_i \sim N(0,1)
  • \sum_{i=1}^n(X_i - \bar X)^2 \sim \sigma^2 \chi^2_{n-1} \qquad \mathrm{where} \quad X_i \sim N(\mu,\sigma^2) \quad \mathrm{and} \quad \bar X = \frac{1}{n} \sum_{i=1}^n X_i.
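
The exponential–gamma identity in the list above can be checked the same way. The following minimal sketch assumes NumPy and SciPy are available and takes \theta to be the scale (mean) parameter of the exponential distribution, matching the gamma scale parameter used here (conventions vary between rate and scale); the parameter values are arbitrary illustrative choices.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n, theta = 5, 2.0              # illustrative number of terms and scale
    n_samples = 100_000

    # Sum of n independent Exponential(theta) variables, with theta as scale (mean).
    sums = rng.exponential(scale=theta, size=(n_samples, n)).sum(axis=1)

    # Kolmogorov–Smirnov comparison against Gamma(shape=n, scale=theta).
    ks_stat, p_value = stats.kstest(sums, stats.gamma(a=n, scale=theta).cdf)
    print(f"KS statistic {ks_stat:.4f}, p-value {p_value:.3f}")

A large p-value here is consistent with the simulated sums following the stated gamma distribution; it is a sanity check rather than a proof.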

Example Proof

There are various ways to prove the above relations. A straightforward technique is to use the moment generating function, which uniquely determines a distribution.
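
Recall that the moment generating function of a random variable X is

M_X(t)=E(e^{tX}),

provided the expectation exists for all t in some neighborhood of zero; two random variables whose moment generating functions agree on such a neighborhood have the same distribution.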

Proof that \sum_{i=1}^n \mathrm{Bernoulli}(p) \sim \mathrm{Binomial}(n,p)

Let

X_i \sim \mathrm{Bernoulli}(p), \qquad 0<p<1, \quad 1 \le i \le n,
Y=\sum_{i=1}^n X_i,
Z \sim \mathrm{Binomial}(n,p).

The moment generating functions of the X_i and of Z are

M_{X_i}(t)=1-p+pe^t \qquad M_Z(t)=(1-p+pe^t)^n

where t is within some neighborhood of zero.

M_Y(t)=E(e^{t\sum_{i=1}^n X_i})=E(\prod_{i=1}^n e^{tX_i})=\prod_{i=1}^n E(e^{tX_i})
=\prod_{i=1}^n (1-p+pe^t)=(1-p+pe^t)^n=M_Z(t)

The expectation of the product equals the product of the expectations because the X_i are independent. Since Y and Z have the same moment generating function, they must have the same distribution.
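
The conclusion can also be illustrated numerically by estimating M_Y(t)=E(e^{tY}) from simulated Bernoulli sums and comparing it with (1-p+pe^t)^n. The following is a minimal sketch assuming NumPy is available; n, p and the values of t are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 10, 0.3                 # illustrative parameters
    n_samples = 200_000

    # Y = sum of n independent Bernoulli(p) variables, simulated n_samples times.
    y = rng.binomial(1, p, size=(n_samples, n)).sum(axis=1)

    # Compare the empirical MGF E(e^{tY}) with the Binomial(n, p) MGF (1 - p + p e^t)^n.
    for t in (-0.5, 0.0, 0.5, 1.0):
        empirical = np.mean(np.exp(t * y))
        theoretical = (1 - p + p * np.exp(t)) ** n
        print(f"t={t:+.1f}: empirical {empirical:.4f}, theoretical {theoretical:.4f}")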

See Also

References

  • Craig, Allen T.; Hogg, Robert V.; McKean, Joseph W. (2005). Introduction to Mathematical Statistics (6th ed.). Pearson Prentice Hall. ISBN 0-13-008507-3.