| Entropy | |
|---|---|
| Image | Hiroki Sayama · CC BY-SA 4.0 |
| Name | Entropy |
| Field | Physics, Chemistry, Information Theory |
| Introduced | 1865 |
| Units | joule per kelvin (J·K⁻¹) |
# Entropy
Entropy is a central concept in thermodynamics, statistical mechanics, and information theory, quantifying disorder, multiplicity, and information content across physical and abstract systems. The concept originated in 19th-century studies of heat engines and was developed through Rudolf Clausius's thermodynamic formulations, Ludwig Boltzmann's statistical work, and Claude Shannon's theory of communication. It connects James Clerk Maxwell's kinetic ideas and Josiah Willard Gibbs's ensemble theory to modern applications spanning computation in the tradition of Alan Turing, Richard Feynman's path-integral methods, and John von Neumann's mathematical physics.
The term was coined by Rudolf Clausius in 1865 during his analysis of the Carnot cycle, building on earlier work on heat and mechanical work by Sadi Carnot, Émile Clapeyron, and Gaspard-Gustave de Coriolis. The statistical interpretation was advanced by Ludwig Boltzmann and formalized in ensemble nomenclature by Josiah Willard Gibbs, while debates over James Clerk Maxwell's thought experiment, Maxwell's demon, spurred further clarification. Twentieth-century elaborations came from Erwin Schrödinger in biological contexts, from Leo Szilard in linking information to thermodynamics, and from Claude Shannon in translating entropy into communication theory, a line continued by Norbert Wiener and Alan Turing in cybernetics and computation.
Entropy admits multiple equivalent formulations across the frameworks developed by Rudolf Clausius, Ludwig Boltzmann, Josiah Willard Gibbs, and Claude Shannon. Clausius introduced a macroscopic differential definition in terms of reversible heat and temperature; Boltzmann provided a microscopic counting expression linking entropy to the number of microstates through Boltzmann's constant, which connects to Gibbs's ensemble averages in microcanonical, canonical, and grand canonical treatments. Shannon defined an informational entropy for probability distributions over message ensembles, a parallel to Gibbs's formalism so close that John von Neumann reportedly advised Shannon to call his quantity "entropy". These formulations interrelate in thermodynamic limits examined by Max Planck and in quantum generalizations by John von Neumann via the von Neumann entropy of density operators.
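In conventional notation, the principal definitions can be summarized side by side (a standard compilation, with $k_B$ Boltzmann's constant, $W$ the number of accessible microstates, $p_i$ probabilities, and $\rho$ a density operator):

```latex
\begin{align}
  dS      &= \frac{\delta Q_{\mathrm{rev}}}{T}       && \text{(Clausius, macroscopic)} \\
  S       &= k_B \ln W                               && \text{(Boltzmann, microstate count)} \\
  S       &= -k_B \sum_i p_i \ln p_i                 && \text{(Gibbs, ensemble probabilities)} \\
  H(X)    &= -\sum_i p_i \log_2 p_i                  && \text{(Shannon, in bits)} \\
  S(\rho) &= -\operatorname{Tr}(\rho \ln \rho)       && \text{(von Neumann, quantum states)}
\end{align}
```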
In classical thermodynamics, entropy appears in the second law of thermodynamics as a state function governing irreversibility in processes such as the Carnot cycle, whose heat-engine efficiency was analyzed by Nicolas Léonard Sadi Carnot and his successors. Boltzmann's statistical mechanics connects entropy to microstate counting, underpinning the equilibrium ensembles articulated by Josiah Willard Gibbs and the kinetic approaches shaped by Boltzmann's H-theorem. Quantum statistical mechanics generalizes these ideas using operator methods developed in the context of Paul Dirac's quantum theory and John von Neumann's mathematical framework, with applications to low-temperature phenomena studied by Lev Landau and Enrico Fermi.
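These statements are commonly condensed into the Clausius inequality and the Carnot efficiency bound (standard forms, where $T_h$ and $T_c$ are the hot and cold reservoir temperatures):

```latex
\begin{align}
  \oint \frac{\delta Q}{T} &\le 0
     && \text{(Clausius inequality; equality for reversible cycles)} \\
  \Delta S_{\mathrm{isolated}} &\ge 0
     && \text{(second law for an isolated system)} \\
  \eta_{\mathrm{Carnot}} &= 1 - \frac{T_c}{T_h}
     && \text{(maximum heat-engine efficiency)}
\end{align}
```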
Shannon introduced an entropy measure for discrete probability distributions that quantifies the expected information, or surprise, carried by a message source, developed during his work on communication systems at Bell Labs. This measure underlies the source coding theorems and channel capacity results formalized with contributions from Richard Hamming, Andrew Viterbi, and David Slepian. Connections to statistical physics were highlighted by Leo Szilard and later by Rolf Landauer, who related logical irreversibility to physical dissipation, a conceptual bridge invoked by Charles Bennett in work on reversible computing and by Peter Shor in quantum information contexts. Quantum analogues developed by Alexander Holevo and John Preskill connect von Neumann entropy to the entanglement measures used in Michael Nielsen's and Isaac Chuang's quantum computation studies.
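A minimal sketch (illustrative only; the distributions are arbitrary examples, and the constants are the standard SI values) computes Shannon entropy and Landauer's minimum dissipation per erased bit:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum p_i log(p_i) in the given base.
    Zero-probability outcomes contribute nothing (0 log 0 := 0)."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln 2.
k_B = 1.380649e-23  # Boltzmann's constant, J/K (exact in the 2019 SI)
T = 300.0           # room temperature, kelvin
print(k_B * T * math.log(2))  # ~2.87e-21 J per erased bit
```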
Entropy exhibits key mathematical properties such as concavity, subadditivity, and strong subadditivity, the last proved for quantum entropy by Elliott Lieb and Mary Beth Ruskai. Inequalities such as Gibbs' inequality, the nonnegativity of the Kullback–Leibler divergence introduced by Solomon Kullback and Richard Leibler, and Pinsker-type bounds appear in work connected to Andrey Kolmogorov's probability foundations and to the Kolmogorov complexity results of Andrey Kolmogorov and Ray Solomonoff. The asymptotic equipartition property central to Shannon's theory aligns with large-deviation results from Srinivasa Varadhan and Hugo Touchette, while entropy rates and ergodic theorems link to George D. Birkhoff's ergodic theorem and subsequent measure-theoretic developments.
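These inequalities are easy to probe numerically; the following sketch (illustrative, using randomly generated distributions) checks Gibbs' inequality and Pinsker's bound, with the KL divergence in nats and total variation distance as half the L1 difference:

```python
import math
import random

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats.
    Assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance: half the L1 distance between p and q."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def random_dist(n):
    """A random probability distribution on n outcomes."""
    w = [random.random() for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

random.seed(0)
for _ in range(1000):
    p, q = random_dist(5), random_dist(5)
    d = kl_divergence(p, q)
    assert d >= -1e-12                                 # Gibbs' inequality: D >= 0
    assert d >= 2 * total_variation(p, q) ** 2 - 1e-12 # Pinsker: D >= 2 * TV^2
print("Gibbs' inequality and Pinsker's bound hold on all samples")
```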
Entropy concepts permeate chemistry, governing reaction spontaneity and equilibrium in the thermodynamics of Gilbert N. Lewis and Jacobus Henricus van 't Hoff; biology, through Erwin Schrödinger's work on life and the molecular thermodynamics studied by Linus Pauling and Peter Debye; and cosmology, through black-hole thermodynamics and the Bekenstein–Hawking entropy formula of Jacob Bekenstein and Stephen Hawking. In computer science, entropy guides compression algorithms such as those of Abraham Lempel and Jacob Ziv, regularization strategies in machine learning associated with Vladimir Vapnik and Yann LeCun, and network science through the entropy rates studied by Duncan J. Watts and Albert-László Barabási. Economics and the social sciences use entropy-inspired methods in models developed by Herbert Simon and in statistical investigations influenced by John Maynard Keynes, while engineering employs entropy in signal processing, control theory, and materials science, with research at institutions such as Bell Labs and MIT.
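Two of the formulas named above, in their standard forms ($A$ is the black hole's horizon area; the remaining symbols are the usual physical constants):

```latex
\begin{align}
  \Delta G &= \Delta H - T\,\Delta S
     && \text{(reaction spontaneity when } \Delta G < 0\text{)} \\
  S_{\mathrm{BH}} &= \frac{k_B c^3 A}{4 \hbar G}
     && \text{(Bekenstein--Hawking black-hole entropy)}
\end{align}
```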
Entropy is measured in units of energy per temperature, commonly joules per kelvin (J·K⁻¹), with Boltzmann's constant providing the microscopic scaling; its value was first extracted by Max Planck from the blackbody radiation spectrum measured in the tradition of Wilhelm Wien and Gustav Kirchhoff. Calorimetry, pioneered by Joseph Black and by Pierre-Simon Laplace, determines entropy changes in chemical reactions catalogued in thermochemical tables refined by Sidney W. Benson and Henry Eyring. Quantum-state tomography and spectroscopy techniques developed in laboratories led by researchers such as Serge Haroche and Anton Zeilinger give experimental access to von Neumann entropy and entanglement measures, while experimental tests of fluctuation theorems build on the work of Gavin Crooks and Christopher Jarzynski.
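As a worked illustration (assuming a temperature-independent heat capacity, a common textbook approximation), the entropy change inferred from calorimetric data follows from integrating $dS = C_p\,dT/T$:

```python
import math

def entropy_change_heating(c_p, t1, t2):
    """Entropy change on heating at constant pressure with C_p
    held constant: Delta S = C_p * ln(T2 / T1), in J/K."""
    return c_p * math.log(t2 / t1)

# Example: 1 mol of liquid water (C_p ~ 75.3 J/(mol*K))
# heated from 298.15 K to 373.15 K.
print(entropy_change_heating(75.3, 298.15, 373.15))  # ~16.9 J/K
```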