| Leggett–Garg inequalities | |
|---|---|
| Name | Leggett–Garg inequalities |
| Field | Quantum foundations |
| Introduced | 1985 |
| Introduced by | Anthony J. Leggett; Anupam Garg |
| Related | Bell inequalities; Kochen–Specker theorem; quantum decoherence; macrorealism |
Leggett–Garg inequalities are theoretical constraints proposed in 1985 to test whether macroscopic systems obey classical realism over time. They pit classical assumptions about temporal correlations against the predictions of quantum mechanics and have driven experimental programs at University of Oxford, Massachusetts Institute of Technology, Harvard University, University of Cambridge and University of California, Berkeley, as well as at national laboratories such as National Institute of Standards and Technology, Los Alamos National Laboratory, Rutherford Appleton Laboratory and Bell Labs. The inequalities connect to foundational debates involving figures such as John Bell, Niels Bohr, Erwin Schrödinger, David Bohm, Anton Zeilinger, Alain Aspect and Steven Weinberg, and bodies such as the Royal Society, American Physical Society and Max Planck Society.
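In its most commonly quoted three-time form, sketched below (the time labels t_1 < t_2 < t_3 and the dichotomic observable Q are standard textbook notation rather than notation introduced in this article), the constraint bounds a combination of two-time correlators:

```latex
% Three-time Leggett–Garg inequality (standard textbook form).
% C_{ij} = \langle Q(t_i)\, Q(t_j) \rangle is the two-time correlator of a
% dichotomic observable Q(t) = \pm 1 measured at times t_1 < t_2 < t_3.
K_3 \;=\; C_{12} + C_{23} - C_{13},
\qquad \text{macrorealism and noninvasive measurability} \;\Rightarrow\; -3 \le K_3 \le 1 .
```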
Leggett and Garg introduced their inequalities in response to conceptual disputes exemplified by thought experiments associated with Erwin Schrödinger, Albert Einstein, Boris Podolsky and David Bohm, and by the experimental programs of Alain Aspect and John Clauser. The framework targets the notion of macrorealism articulated by Anthony J. Leggett and debated at institutions such as Imperial College London, California Institute of Technology, Stanford University and CERN. It formalizes assumptions analogous to those underlying John Bell's results and the Bell inequalities, while invoking notions of measurement discussed by Paul Dirac, Werner Heisenberg, Max Born and at the Institute for Advanced Study. Philosophical interlocutors include Karl Popper, Thomas Kuhn and Hilary Putnam, together with institutions such as the London School of Economics and Princeton University. The Leggett–Garg approach aims to distinguish the classical stochastic descriptions used in technologies at National Aeronautics and Space Administration, European Space Agency and IBM from genuine quantum temporal coherence.
The inequalities follow from the assumptions of macrorealism per se and noninvasive measurability, building on probabilistic methods developed in works by Andrey Kolmogorov, Richard Feynman, Paul Dirac and John von Neumann. Leggett and Garg derived bounds on the temporal correlations of a dichotomic observable measured at different times, paralleling the mathematical structure of the Clauser–Horne–Shimony–Holt (CHSH) inequality of John Clauser, Michael Horne, Abner Shimony and Richard Holt. The derivation employs correlators similar to those in studies by Eugene Wigner, David Mermin, Simon Kochen and Ernst Specker, and uses techniques found in the literature of Roy Glauber, Frank Wilczek and Murray Gell-Mann. Mathematically, the bounds relate to convexity results by Hermann Weyl and to operator inequalities studied by John von Neumann and Alfred Tarski, applied in quantum models developed at Bell Labs and Los Alamos National Laboratory.
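To illustrate how the quantum prediction exceeds the macrorealist bound, the minimal sketch below assumes a spin-1/2 precessing under H = (ω/2)σ_x with Q = σ_z measured at three equally spaced times, for which the standard result C_ij = cos ω(t_j − t_i) holds; the frequency and time grid are illustrative choices, not values from the original analysis.

```python
import numpy as np

# Two-time correlator for Q = sigma_z on a spin-1/2 precessing under
# H = (omega/2) * sigma_x: C(tau) = cos(omega * tau) -- the standard
# symmetrized correlator, independent of the initial state.
def correlator(omega, tau):
    return np.cos(omega * tau)

def k3(omega, tau):
    """Leggett-Garg parameter K3 = C12 + C23 - C13 for equally spaced times."""
    return 2.0 * correlator(omega, tau) - correlator(omega, 2.0 * tau)

omega = 1.0                              # precession frequency (arbitrary units)
taus = np.linspace(1e-3, np.pi, 2000)    # scan the spacing between measurements
k_values = k3(omega, taus)

best = taus[np.argmax(k_values)]
print(f"max K3 = {k_values.max():.4f} at omega*tau = {omega * best:.4f}")
print("macrorealist bound: K3 <= 1; qubit maximum: 1.5 at omega*tau = pi/3")
```

The printed maximum of about 1.5 exceeds the macrorealist bound of 1, which is the kind of violation the experiments described below aim to certify.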
Experimental tests span several platforms: superconducting qubits at IBM, Google, D-Wave Systems and Yale University; nuclear magnetic resonance at Massachusetts General Hospital and University of Illinois Urbana-Champaign; quantum optics at University of Vienna, University of Oxford and University of Cambridge; and nanomechanical resonators developed at NIST, Caltech and Max Planck Institute for Quantum Optics. Implementations exploit control techniques from groups led by John Martinis, Oskar Painter, Rainer Blatt, Anton Zeilinger and Immanuel Bloch, using weak measurement protocols refined in labs at University of Queensland and University of Science and Technology of China. Notable experimental milestones involve collaborations between institutes such as Harvard University and MIT, and measurements performed at facilities such as Lawrence Berkeley National Laboratory and Argonne National Laboratory.
Observed violations in superconducting circuits, photonic systems and spin ensembles have prompted interpretational debates among researchers affiliated with University of Oxford, Princeton University, Perimeter Institute for Theoretical Physics and Institute for Quantum Optics and Quantum Information. Competing readings invoke decoherence frameworks developed by Wojciech Zurek and dynamical collapse proposals associated with Ghirardi–Rimini–Weber (GRW) and thinkers connected to University of Trieste and SISSA. Alternative accounts refer to contextuality discussions informed by the work of Kochen and Specker, and to nonclassicality measures introduced by groups at University of Waterloo and Centre for Quantum Technologies. Debates also reference philosophical analyses by Tim Maudlin, Adrian Kent and David Albert, and panels convened at venues such as the Perimeter Institute and the Foundational Questions Institute.
Extensions of the original inequalities cover higher-order and multi-time temporal correlations resembling the multipartite bounds studied in the context of the Mermin inequalities and the CHSH inequality by researchers at University of Illinois, University of Geneva and École Normale Supérieure. Connections to the Kochen–Specker theorem, Bell inequalities, the Tsirelson bound and contextuality measures have been developed in collaborations involving IBM Research, Google Quantum AI, Microsoft Research and universities such as University of Cambridge and ETH Zurich. Generalized temporal steering and resource theories of nonclassicality draw on work by Nicolas Gisin, Vlatko Vedral, Frank Verstraete and groups at University of Amsterdam and University of Vienna.
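One commonly cited multi-time generalization is sketched below; the n-term string, its macrorealist bound and the qubit maximum n cos(π/n) are standard results from the broader literature rather than claims specific to the collaborations named above.

```latex
% n-time Leggett–Garg string (n >= 3), with C_{i,i+1} = \langle Q(t_i) Q(t_{i+1}) \rangle:
K_n \;=\; C_{12} + C_{23} + \cdots + C_{n-1,n} - C_{1,n},
\qquad \text{macrorealism} \;\Rightarrow\; K_n \le n - 2,
\qquad \text{qubit maximum: } n\cos\!\frac{\pi}{n}.
```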
Probing temporal quantum correlations informs the design and certification of quantum computing, sensing and communication devices by Google Quantum AI, IBM Quantum, Rigetti Computing and national labs including NIST and Sandia National Laboratories. Insights influence error-mitigation strategies in architectures developed at Microsoft Quantum, Honeywell, IonQ and PsiQuantum, and inform standards discussed at organizations such as the International Organization for Standardization and consortia hosted by the National Institutes of Health and the European Commission. Potential applications intersect with technologies pursued at Lockheed Martin and Raytheon Technologies, and at startups incubated through Y Combinator and accelerator programs linked to Silicon Valley and Cambridge Science Park.