Entropy (general concept)

In many branches of science, entropy is a measure of the disorder of a system. Entropy is particularly notable because it has a broad, common definition shared across physics, mathematics, and information science. Although the general notion is the same, its precise definition must be adapted to each particular field of study. Whatever the field, however, the definition of the entropy S of a system always takes the basic form:

S = -k \sum_i P_i \ln P_i

where P_i is a probability satisfying 0 \le P_i \le 1 and k is a constant. The construction and interpretation of the probabilities vary from field to field, as the adaptations listed below show.
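
As a concrete illustration of the basic form, the sum can be evaluated directly for any finite probability distribution. The following Python sketch assumes k = 1 and natural logarithms, so the result is in nats; the function name entropy is illustrative rather than drawn from any particular library:

    import math

    def entropy(probabilities, k=1.0):
        """Evaluate S = -k * sum(P_i * ln P_i) over a finite distribution."""
        # Terms with P_i == 0 are skipped: P ln P -> 0 as P -> 0,
        # so impossible outcomes contribute nothing to the disorder.
        return -k * sum(p * math.log(p) for p in probabilities if p > 0)

    print(entropy([0.5, 0.5]))  # fair coin: ln 2 ≈ 0.6931, the maximum for two outcomes
    print(entropy([0.9, 0.1]))  # biased coin: ≈ 0.3251, less disorder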

In thermodynamics:

  • Entropy (classical thermodynamics), a measure of the energy of a thermodynamic system that is unavailable for doing work
  • Entropy (statistical thermodynamics), a measure of the number of microscopic configurations consistent with a system's macroscopic state
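
The statistical form connects directly to the general definition: when all W microstates of a system are equally probable, P_i = 1/W for each i, and the basic formula reduces to Boltzmann's expression, with k the Boltzmann constant:

S = -k \sum_{i=1}^{W} \frac{1}{W} \ln\frac{1}{W} = k \ln W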

In information and computer science:

  • Entropy (information theory), also known as Shannon entropy, a measure of the average information content of a message
  • Entropy (computing), an indicator of the number of random bits available to seed cryptography systems
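
As a minimal illustration of the computing sense, assuming a Linux system, the kernel's running estimate of the entropy available for seeding /dev/random can be read from /proc/sys/kernel/random/entropy_avail:

    # Sketch assuming Linux: the kernel reports its estimate of the
    # entropy (in bits) currently available in its randomness pool.
    with open("/proc/sys/kernel/random/entropy_avail") as f:
        print("Available entropy:", f.read().strip(), "bits")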

In mathematics:

  • Topological entropy, a number measuring the complexity of a dynamical system
  • Kolmogorov–Sinai entropy (measure-theoretic entropy), the rate of information production of a measure-preserving dynamical system

In biology:

  • Entropy and life, on the relationship between thermodynamic entropy and the organization of living systems