Entropy (general concept)
In many branches of science, entropy refers to a measure of the disorder of a system. Entropy is particularly notable because it has a broad, common definition shared across physics, mathematics, and information science. Although the general notion of entropy is shared, its precise definition has to be adapted to each particular field of study. However, no matter what the field, the definition of the entropy S of a system always takes the basic form:

S = -k \sum_i P_i \ln P_i

where P_i is a probability and k is a constant. The construction and interpretation of the probabilities vary from field to field. A list of these adaptations is presented below:
In thermodynamics:
- Entropy (classical thermodynamics), the macroscopic approach to thermodynamic entropy
- Entropy (statistical thermodynamics), the microscopic approach to thermodynamic entropy
- Gibbs entropy, statistical entropy of a thermodynamic system
- Boltzmann entropy, an approximation to Gibbs entropy
- Tsallis entropy, a generalization of Boltzmann–Gibbs entropy
- von Neumann entropy, entropy of a quantum-mechanical system
In information and computer science:
- Information entropy (Shannon entropy), a measure of the amount of information contained in a message
- Entropy encoding, data compression strategies
- Rényi entropy, a family of entropy measures generalizing Shannon entropy, used as diversity indices and to define fractal dimensions
- Entropy (computing), an indicator of the amount of random bits available to seed cryptographic systems
In mathematics:
- Kolmogorov–Sinai entropy, the rate of information generation by a measure-preserving dynamical system
- Topological entropy, a measure of exponential growth in the number of distinguishable orbits of a dynamical system
In biology:
- Entropy (ecology), a measure of biodiversity
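
As a concrete illustration of the shared formula above, the following minimal Python sketch (not part of the article text) evaluates S = -k Σ_i P_i ln P_i for a discrete probability distribution; the function name and the default choice of k = 1 with the natural logarithm are assumptions made here for illustration.

```python
import math

def entropy(probabilities, k=1.0):
    """Evaluate S = -k * sum(P_i * ln P_i) for a discrete distribution.

    Terms with P_i == 0 are skipped, following the convention that
    p * ln(p) -> 0 as p -> 0.
    """
    return -k * sum(p * math.log(p) for p in probabilities if p > 0)

# With k = 1 and the natural logarithm the result is in nats;
# a fair coin has the maximum entropy for two outcomes, ln 2 ≈ 0.693.
print(entropy([0.5, 0.5]))   # ≈ 0.6931
# A certain outcome carries no uncertainty.
print(entropy([1.0, 0.0]))   # 0.0
```

Choosing log base 2 instead of the natural logarithm gives Shannon entropy in bits, while setting k to the Boltzmann constant yields the Gibbs entropy of statistical thermodynamics.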