Marginal distribution
From Wikipedia, the free encyclopedia
In probability theory, given two jointly distributed random variables X and Y, the marginal distribution of X is simply the probability distribution of X ignoring information about Y, typically calculated by summing or integrating the joint probability distribution over Y.
For discrete random variables, the marginal probability mass function can be written as Pr(X = x). This is

    \Pr(X = x) = \sum_{y} \Pr(X = x, Y = y) = \sum_{y} \Pr(X = x \mid Y = y) \Pr(Y = y),

where Pr(X = x, Y = y) is the joint distribution of X and Y, while Pr(X = x | Y = y) is the conditional distribution of X given Y.
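As a concrete illustration, the following minimal Python sketch computes the marginal probability mass function of X by summing a joint PMF over y. The two-valued variables and the joint probabilities are illustrative assumptions, not values taken from this article.

    # Minimal sketch: marginalizing a discrete joint PMF over Y.
    # The joint probabilities below are illustrative, not from the article.
    joint_pmf = {
        (0, 0): 0.10, (0, 1): 0.30,
        (1, 0): 0.25, (1, 1): 0.35,
    }

    def marginal_x(joint, x):
        # Pr(X = x) = sum over y of Pr(X = x, Y = y)
        return sum(p for (xi, y), p in joint.items() if xi == x)

    print(marginal_x(joint_pmf, 0))  # 0.10 + 0.30 = 0.40
    print(marginal_x(joint_pmf, 1))  # 0.25 + 0.35 = 0.60

Note that the marginal probabilities sum to 1, as they must for a valid probability distribution.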
Similarly, for continuous random variables, the marginal probability density function can be written as pX(x). This is

    p_X(x) = \int_{y} p_{X,Y}(x, y) \, \mathrm{d}y = \int_{y} p_{X \mid Y}(x \mid y) \, p_Y(y) \, \mathrm{d}y,

where pX,Y(x, y) gives the joint distribution of X and Y, while pX|Y(x | y) gives the conditional distribution of X given Y.
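In the continuous case the integral over y can be approximated numerically. The sketch below assumes a specific joint density chosen purely for illustration (two independent standard normal variables, not something specified in the article) and integrates it over y with the trapezoidal rule; the result can be checked against the known standard normal density of X.

    # Minimal sketch: marginal density of X by numerically integrating a joint
    # density over y. The joint density (independent standard normals) is an
    # illustrative assumption, not taken from the article.
    import math

    def joint_pdf(x, y):
        # p_{X,Y}(x, y) for two independent standard normal variables
        return math.exp(-(x * x + y * y) / 2.0) / (2.0 * math.pi)

    def marginal_pdf_x(x, y_min=-8.0, y_max=8.0, n=4000):
        # p_X(x) ~= integral of p_{X,Y}(x, y) dy, via the trapezoidal rule
        h = (y_max - y_min) / n
        total = 0.5 * (joint_pdf(x, y_min) + joint_pdf(x, y_max))
        total += sum(joint_pdf(x, y_min + i * h) for i in range(1, n))
        return total * h

    # Compare with the exact standard normal density at x = 0.5
    print(marginal_pdf_x(0.5))
    print(math.exp(-0.5 * 0.5 / 2.0) / math.sqrt(2.0 * math.pi))

The two printed values agree closely, which is expected here because the marginal of a bivariate density with independent components is just the corresponding one-dimensional density.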



