What is entropy?

There is no simple relation between entropy and order, and the popular informal account that entropy measures disorder is (like much of popular science) highly inaccurate.

For example, consider English text. The entropy is a characteristic property of the written language, not of a single string. Therefore, no matter how ordered or disordered a string of N characters of English text looks, it has (if the word can be applied to a single string at all) the same entropy, namely that of a random string of N characters of English text.
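A minimal Python sketch of this point (the letter frequencies below are rough assumed values, not measured English statistics): the entropy is computed from the distribution alone, so it comes out the same for every string of N characters drawn from that source.

    import math

    # Assumed (illustrative) letter frequencies for an English-like source;
    # the remaining probability mass is lumped into one extra symbol.
    freq = {'e': 0.127, 't': 0.091, 'a': 0.082, 'o': 0.075, 'i': 0.070,
            'n': 0.067, 's': 0.063, 'h': 0.061, 'r': 0.060}
    probs = list(freq.values()) + [1.0 - sum(freq.values())]

    # Per-character entropy of the distribution, in bits.
    H_char = -sum(p * math.log2(p) for p in probs)

    # Every string of N characters drawn from this source has the same entropy,
    # regardless of how "ordered" the particular string happens to look.
    N = 1000
    print(f"entropy of an N-character string: {N * H_char:.1f} bits")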

Similarly, the entropy of the ensemble of ''casting a die N times'' is N log 6, since each of the 6^N equally likely sequences has probability 6^(-N). Orderliness, on the other hand, would be ascribed to particular sequences, for example those sorted by number of points, not to the ensemble. To quantify orderliness, one needs a concept different from entropy, namely Kolmogorov complexity.
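A short Python sketch of the die example (a sketch only, assuming a fair die): the entropy belongs to the ensemble of all 6^N sequences, while any particular sequence, however orderly it looks, is just one realization with probability 6^(-N).

    import math
    import random

    N = 10

    # Entropy of the ensemble "cast a fair die N times": all 6**N sequences
    # are equally likely, so H = N * log 6 (in nats).
    H_ensemble = N * math.log(6)

    # A particular, orderly-looking realization (sorted by number of points).
    # It has no entropy of its own, only a probability, here 6**(-N).
    orderly = sorted(random.choices(range(1, 7), k=N))

    print(f"H = {H_ensemble:.3f} nats, P(any single sequence) = {6.0**(-N):.3e}")
    print("one orderly realization:", orderly)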

In general, entropy is a characteristic property of a probability distribution, not of any of its realizations.

In information theory, entropy is a measure of lack of information. It is the expected number of binary (yes/no) decisions needed to pin down a particular realization of something, given the probability distribution of all possible realizations. See Entropy and missing information.
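A minimal Python sketch of this reading of entropy (base-2 logarithms, so the unit is the bit, i.e. one yes/no decision):

    import math

    def shannon_entropy(probs):
        """Expected number of optimal yes/no decisions (bits) needed to pin
        down one realization drawn from the given probability distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Uniform distribution over 8 outcomes: exactly 3 binary decisions suffice.
    print(shannon_entropy([1/8] * 8))                    # 3.0

    # A biased distribution requires fewer decisions on average.
    print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))    # 1.75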

In quantum statistical mechanics (and hence in thermodynamics, which is derived from statistical mechanics), entropy is the expected number of binary decisions needed to pin down a particular energy eigenstate, given the energy distribution of the mixed equilibrium state under consideration. This has nothing to do with whether the given state is actually known or not.
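As a concrete (and purely illustrative) sketch, the von Neumann entropy S = -k_B Tr(rho ln rho) can be computed from the eigenvalues of the density matrix; the two-level state below is an assumed example chosen for simplicity, with k_B = 1.

    import numpy as np

    def von_neumann_entropy(rho, k_B=1.0):
        """S = -k_B Tr(rho ln rho), evaluated via the eigenvalues of rho."""
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]        # 0 * ln 0 = 0 by convention
        return -k_B * float(np.sum(evals * np.log(evals)))

    # Mixed equilibrium state of a two-level system: populations p and 1-p
    # of the two energy eigenstates (assumed values, for illustration only).
    p = 0.9
    rho_mixed = np.diag([p, 1.0 - p])
    print(von_neumann_entropy(rho_mixed))   # ~0.325: some decisions still needed

    # A pure state pins down the energy eigenstate completely: entropy 0.
    print(von_neumann_entropy(np.diag([1.0, 0.0])))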


Arnold Neumaier (Arnold.Neumaier@univie.ac.at)
A theoretical physics FAQ