Entropy rate
The entropy rate of a stochastic process is, informally, the time density of the average information in the process. For a stochastic process with a countable index, the entropy rate H(X) is the limit of the joint entropy of n members of the process X_k, divided by n, as n tends to infinity:

    H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n)

when the limit exists. An alternative, related quantity is the limiting conditional entropy of the most recent member of the process given all of the preceding ones:

    H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \ldots, X_1)

For strongly stationary stochastic processes, H(X) = H'(X).
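For instance, a minimal Python sketch under an assumed example process (a binary sequence in which each symbol repeats the previous one with probability q, i.e. a two-state Markov chain started from a uniform first symbol) can evaluate H(X_1, ..., X_n)/n exactly by enumeration and watch it approach its limit:

    import itertools
    import math

    # Assumed example process: each symbol repeats the previous one
    # with probability q; the first symbol is uniform on {0, 1}.
    q = 0.9

    def joint_entropy(n):
        """Entropy in bits of (X_1, ..., X_n), enumerating all 2^n outcomes."""
        h = 0.0
        for seq in itertools.product([0, 1], repeat=n):
            prob = 0.5  # uniform initial symbol
            for a, b in zip(seq, seq[1:]):
                prob *= q if a == b else 1 - q
            h -= prob * math.log2(prob)
        return h

    for n in (1, 2, 4, 8, 16):
        print(n, joint_entropy(n) / n)  # decreases toward the entropy rate

    # For this process the limit is the per-step conditional entropy
    # H(X_2 | X_1) = -(q log2 q + (1 - q) log2 (1 - q)) ≈ 0.469 bits.
    print(-(q * math.log2(q) + (1 - q) * math.log2(1 - q)))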
Entropy Rates for Markov Chains
Since a Markov chain that is irreducible, aperiodic, and positive recurrent has a unique stationary distribution, the entropy rate of the stochastic process it defines is independent of the initial distribution.
For example, for such a Markov chain Y_k defined on a countable number of states, given the transition matrix P_{ij}, H(Y) is given by:

    H(Y) = -\sum_{i,j} \mu_i P_{ij} \log P_{ij}

where μ_i is the stationary distribution of the chain.
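As a concrete illustration, here is a minimal Python sketch (the 3-state transition matrix P is an assumed example) that finds μ as the left eigenvector of P for eigenvalue 1 and then evaluates the sum above, in bits:

    import numpy as np

    # Assumed example: a 3-state transition matrix (rows sum to 1).
    P = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.2, 0.6]])

    # The stationary distribution mu is the left eigenvector of P with
    # eigenvalue 1, normalized so its entries sum to 1.
    evals, evecs = np.linalg.eig(P.T)
    mu = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    mu /= mu.sum()

    # H(Y) = -sum_{i,j} mu_i P_ij log P_ij, with 0 log 0 taken as 0.
    terms = np.where(P > 0, P * np.log2(np.where(P > 0, P, 1.0)), 0.0)
    entropy_rate = -np.sum(mu[:, None] * terms)
    print("entropy rate:", entropy_rate, "bits per step")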
A simple consequence of the definition is that an i.i.d. stochastic process has an entropy rate equal to the entropy of any individual member of the process.
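This follows from the chain of equalities below: independence makes the joint entropy additive, and identical distribution makes every term equal to H(X_1).

    H(X) = \lim_{n \to \infty} \frac{H(X_1, \ldots, X_n)}{n}
         = \lim_{n \to \infty} \frac{H(X_1) + \cdots + H(X_n)}{n}
         = \lim_{n \to \infty} \frac{n \, H(X_1)}{n}
         = H(X_1)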
External links
- Systems Analysis, Modelling and Prediction (SAMP), University of Oxford: MATLAB code for estimating information-theoretic quantities for stochastic processes.



