Probability mass function
In probability theory, a probability mass function (abbreviated pmf) is a function that gives the probability that a discrete random variable is exactly equal to some value. A pmf differs from a probability density function (abbreviated pdf) in that a pdf is defined only for continuous random variables and its values are not themselves probabilities. Instead, the integral of a pdf over a range of possible values (a, b] gives the probability that the random variable falls within that range.
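To make the contrast concrete, here is a minimal sketch in Python using SciPy (the binomial and standard normal distributions below are illustrative choices, not part of this article): a discrete probability is read directly from the pmf, while a continuous probability is obtained by integrating the pdf over an interval.

```python
from scipy import stats
from scipy.integrate import quad

# Discrete case: Binomial(n=10, p=0.5). P(X = 4) is the pmf value itself.
p_discrete = stats.binom.pmf(4, 10, 0.5)
print(p_discrete)  # about 0.205

# Continuous case: standard normal. The pdf value at a point is not a probability;
# the probability of falling in (a, b] is the integral of the pdf over that interval.
a, b = 0.0, 1.0
p_continuous, _ = quad(stats.norm.pdf, a, b)
print(p_continuous)                           # about 0.341
print(stats.norm.cdf(b) - stats.norm.cdf(a))  # same value obtained via the cdf
```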
Mathematical description
Suppose that X is a discrete random variable, taking values in some countable sample space S ⊆ R. Then the probability mass function fX(x) for X is given by

fX(x) = Pr(X = x) for all x ∈ R.
Note that this explicitly defines fX(x) for all real numbers, including all values in R that X could never take; indeed, it assigns such values a probability of zero.
The discontinuity of probability mass functions reflects the fact that the cumulative distribution function of a discrete random variable is also discontinuous. Where the cumulative distribution function is differentiable (i.e., at points x ∈ R\S), its derivative is zero, just as the probability mass function is zero at all such points.
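As a concrete illustration of a pmf defined on all of R, here is a short Python sketch (the fair six-sided die is a hypothetical example, not taken from this article):

```python
from fractions import Fraction

def die_pmf(x):
    """pmf of a fair six-sided die: 1/6 on S = {1, ..., 6}, zero elsewhere on R."""
    return Fraction(1, 6) if x in {1, 2, 3, 4, 5, 6} else Fraction(0)

# The pmf is defined for every real number, not just for values the die can take.
assert die_pmf(3) == Fraction(1, 6)
assert die_pmf(2.5) == 0                           # impossible values get probability zero
assert sum(die_pmf(x) for x in range(1, 7)) == 1   # probabilities over S sum to 1
```

Using exact fractions rather than floats here avoids rounding error when checking that the probabilities sum to 1.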
Example
Suppose that X is the outcome of a single toss of a fair coin, assigning 0 to tails and 1 to heads. The probability that X = x is 0.5 for each x in the state space {0, 1} (X is a Bernoulli random variable), and hence the probability mass function is
fX(x) = 0.5 if x ∈ {0, 1},
fX(x) = 0 if x ∈ R \ {0, 1}.
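The same pmf can be written as a tiny Python function; this is a minimal sketch of the coin-toss example above (the name coin_pmf is ours, introduced only for illustration):

```python
def coin_pmf(x):
    """pmf of the fair coin toss X: 0.5 on {0, 1}, zero for every other real x."""
    return 0.5 if x in (0, 1) else 0.0

assert coin_pmf(0) == coin_pmf(1) == 0.5
assert coin_pmf(0.25) == 0.0             # values outside {0, 1} have probability zero
assert coin_pmf(0) + coin_pmf(1) == 1.0  # the pmf sums to 1 over the state space
```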