| many-worlds interpretation | |
|---|---|
| Name | Many-worlds interpretation |
| Introduced | 1957 |
| Proponent | Hugh Everett III |
| Notable works | "Relative State" formulation |
| Influences | Erwin Schrödinger, Max Born, John von Neumann, Niels Bohr |
| Related | Everettian quantum mechanics, decoherence theory |
The many-worlds interpretation is a realist account of quantum mechanics proposing that the universal wave function evolves deterministically under the Schrödinger equation and that apparently stochastic measurement outcomes correspond to branching into multiple, non-interacting worlds. It rejects physical wave-function collapse and situates measurement outcomes within a superposition that includes observers, systems, and environments. Proponents argue this resolves the paradoxes associated with the measurement problem, while critics raise issues about probability and ontology.
The interpretation originated as an alternative to the Copenhagen interpretation and offers a unitary, no-collapse picture closely tied to work by Hugh Everett III, John Wheeler, and later popularizers like Bryce DeWitt. It situates quantum phenomena such as superposition, entanglement, and quantum decoherence in a framework where branches correspond to effectively classical histories, drawing on ideas developed by Wojciech Zurek, Max Tegmark, and David Deutsch. Its ontology appeals to philosophers and physicists interested in the roles of determinism and realism in foundational questions pursued at laboratories like CERN and Los Alamos National Laboratory and at institutions such as Princeton University and the University of Cambridge.
The 1957 thesis by Hugh Everett III proposed the "relative state" formulation as a response to formal issues highlighted by John von Neumann in his mathematical treatment and to debates with Niels Bohr from the Solvay Conference era. Early reception involved correspondence with figures including Pascual Jordan and skepticism from proponents of the Copenhagen interpretation such as Werner Heisenberg and Wolfgang Pauli. The view saw later revival through advocacy by Bryce DeWitt and the development of decoherence theory by H. Dieter Zeh and Wojciech Zurek, with institutional discourse occurring at conferences like those held at Los Alamos National Laboratory and the Perimeter Institute.
The formal core relies on the linear, unitary evolution given by the Schrödinger equation and on the tensor product structure for composite systems emphasized by John von Neumann. Measurement processes are modeled as entangling interactions that produce branching of the universal wave function into orthogonal components, a process analyzed using reduced density operators and trace operations familiar from work by Max Born and Paul Dirac. Decoherence, as formalized by Wojciech Zurek and mathematically treated in open quantum systems literature by researchers at institutions like MIT and Harvard University, explains preferred bases and effective classicality without invoking collapse. Probability in this framework is often approached via decision-theoretic arguments developed by David Deutsch and David Wallace, or via branch-counting and measure arguments connected to the Born rule originally stated by Max Born.
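The measurement model described above can be illustrated with a minimal numpy sketch of a von Neumann premeasurement: a qubit in superposition entangles with a two-state apparatus, and tracing out the apparatus leaves a diagonal reduced density matrix whose entries are the Born weights. The specific amplitudes and the CNOT-style interaction are illustrative choices, not drawn from any particular source.

```python
import numpy as np

# System qubit |psi> = a|0> + b|1>, with illustrative Born weights 0.3 and 0.7.
a, b = np.sqrt(0.3), np.sqrt(0.7)
psi_system = np.array([a, b])

# Apparatus starts in a "ready" state |0>.
ready = np.array([1.0, 0.0])
initial = np.kron(psi_system, ready)  # |psi> (x) |ready>

# Entangling interaction: a CNOT copies the system basis into the apparatus,
# producing the branched state a|00> + b|11>.
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
final = cnot @ initial

# Reduced density matrix of the system: form |final><final| and trace out
# the apparatus degree of freedom.
rho = np.outer(final, final).reshape(2, 2, 2, 2)
rho_system = np.trace(rho, axis1=1, axis2=3)

print(np.round(rho_system, 3))
# The off-diagonal terms vanish (the branches no longer interfere) and the
# diagonal entries reproduce the Born weights |a|^2 = 0.3 and |b|^2 = 0.7.
```

In this toy model the environment-induced suppression of interference between branches, which decoherence theory develops rigorously, is captured by the zero off-diagonal elements of `rho_system`; no collapse postulate enters, only unitary evolution and a partial trace.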
If accepted, the interpretation carries broad implications for cosmology and quantum gravity work at places like CERN, Caltech, and the Perimeter Institute, influencing thinking about the cosmic microwave background, inflationary multiverse scenarios discussed by Alan Guth and Andrei Linde, and proposals in quantum cosmology from researchers like James Hartle and Stephen Hawking. Philosophers such as David Lewis and Hilary Putnam have debated metaphysical ramifications including modality and personal identity, while ethicists consider decision-theoretic consequences relevant to thought experiments discussed by Nick Bostrom and Eliezer Yudkowsky. Technological areas such as quantum computing, developed by teams at IBM, Google, and the University of California, Berkeley, draw on the unitary evolution central to this view.
Major criticisms come from proponents of collapse models like the Ghirardi–Rimini–Weber theory developed by GianCarlo Ghirardi and Alberto Rimini, and from advocates of hidden-variable theories such as de Broglie–Bohm theory, associated with Louis de Broglie and David Bohm. Philosophical objections about probability and ontology have been advanced by figures including Abner Shimony and Roger Penrose, who also proposed gravity-related collapse mechanisms. Alternatives include the Copenhagen interpretation, the consistent-histories approach of Robert Griffiths, objective-collapse proposals discussed at institutions like the Perimeter Institute and the University of Oxford, and relational interpretations championed by Carlo Rovelli.
Because the interpretation reproduces standard quantum statistics via unitary evolution, it predicts the same empirical outcomes as standard quantum mechanics in laboratory tests performed at facilities like Bell Labs, MIT, and the Max Planck Institute for Quantum Optics. Proposed tests of alternative collapse models—such as spontaneous-localization experiments pursued at INFN and tabletop interferometry at the University of Vienna—can, in principle, discriminate between no-collapse and collapse frameworks. Advances in macroscopic superposition experiments by groups at Harvard, the University of Oxford, and the University of Vienna continue to probe regimes relevant to debates involving Wojciech Zurek's decoherence program, while cosmological observations from the Planck spacecraft and experiments at CERN inform related foundational questions.
Category:Interpretations of quantum mechanics