| Path integral formulation | |
|---|---|
| Name | Path integral formulation |
| Field | Theoretical physics |
| Introduced | 1948 |
| Introduced by | Richard Feynman |
| Related | Quantum mechanics, Quantum field theory, Statistical mechanics |
The path integral formulation presents quantum amplitudes as sums over histories, recasting dynamics in terms of integrals over trajectories rather than operators. Conceived by Richard Feynman and influenced by Paul Dirac and earlier ideas in Erwin Schrödinger's wave mechanics and Norbert Wiener's stochastic processes, it connects to Statistical mechanics and underpins developments in Quantum electrodynamics, Quantum chromodynamics, and modern Condensed matter physics.
The formulation emerged when Richard Feynman, seeking an alternative to the Heisenberg picture and the Schrödinger equation, developed the idea during his doctoral work at Princeton University under John Wheeler, building on Paul Dirac's 1933 observation that the short-time quantum amplitude is governed by exp(iS/ħ); the full account appeared in 1948. It reframes transition amplitudes as a functional integral over all paths, not just the classical one, offering intuitive links to the Principle of least action, the Hamiltonian formalism, and the Lagrangian formalism. Parallel work by Sin-Itiro Tomonaga and Julian Schwinger, shown equivalent to Feynman's approach by Freeman Dyson, established Quantum electrodynamics; functional-integral methods later proved central to the proof by Gerard 't Hooft and Martinus Veltman that gauge theories are renormalizable.
The core expression represents the propagator as an integral over all trajectories weighted by exp(iS/ħ), where S is the classical action, the time integral of the Lagrangian as in Lagrangian mechanics. Rigorous treatment uses measure theory and techniques from Functional analysis, invoking constructs related to the Wiener measure and the theory of distributions. The formulation connects to the Stationary phase approximation, the Saddle-point method, and semiclassical expansions associated with Gutzwiller trace formula techniques used in Quantum chaos. Regularization and renormalization require methods developed in the context of the Renormalization group by figures like Kenneth Wilson and Leo Kadanoff. Mathematical foundations benefited from work by Edward Nelson, Klaus Hepp, and Barry Simon on constructive approaches, and from advances in Algebraic topology and Differential geometry when applying path integrals to systems with nontrivial configuration spaces such as Gauge theory moduli spaces.
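In symbols, the core expression is the formal functional integral for the propagator,

```latex
K(x_b, t_b;\, x_a, t_a)
  \;=\; \int_{x(t_a)=x_a}^{x(t_b)=x_b} \mathcal{D}[x(t)]\; e^{\,iS[x]/\hbar},
\qquad
S[x] \;=\; \int_{t_a}^{t_b} L\big(x(t), \dot{x}(t), t\big)\, dt .
```

In the limit ħ → 0, the stationary phase approximation localizes the integral on paths with δS = 0, recovering the classical equations of motion from the Principle of least action.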
In nonrelativistic settings, path integrals compute transition amplitudes for systems like the harmonic oscillator and the hydrogen atom, complementing operator methods used by Werner Heisenberg and Paul Dirac. Semiclassical approximations recover results from the WKB approximation and clarify interference phenomena in setups related to the Aharonov–Bohm effect and the Double-slit experiment. Time-dependent perturbation theory and scattering amplitudes in simple potentials employ expansions reminiscent of techniques used by Enrico Fermi and Lev Landau. The approach also informs treatments of spin using coherent-state path integrals developed in work related to Wolfgang Pauli's spin theory and to spin systems studied in Solid state physics by researchers such as Philip Anderson.
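As a minimal numerical sketch (not part of the original text), the time-sliced definition of the path integral can be evaluated in imaginary time by composing a short-time kernel on a spatial grid; summing over all intermediate grid points at each slice is the discretized sum over paths. The example below uses the harmonic oscillator in units m = ħ = ω = 1, where the exact ground-state energy is 1/2; the grid size, time step, and total imaginary time are illustrative choices.

```python
import numpy as np

# Imaginary-time, time-sliced path integral for the harmonic oscillator
# (units m = hbar = omega = 1; exact ground-state energy is 0.5).
x = np.linspace(-5.0, 5.0, 201)        # spatial grid (illustrative choice)
dx = x[1] - x[0]
eps = 0.05                             # imaginary-time step
n_steps = 200                          # total beta = n_steps * eps = 10

V = 0.5 * x**2                         # harmonic potential
# Short-time Euclidean kernel from the discretized action (Trotter split):
# K_eps(x, x') ~ exp(-(x - x')^2 / (2 eps) - eps (V(x) + V(x')) / 2)
K = np.sqrt(1.0 / (2.0 * np.pi * eps)) * np.exp(
    -(x[:, None] - x[None, :])**2 / (2.0 * eps)
    - 0.5 * eps * (V[:, None] + V[None, :])
)
A = K * dx                             # include the integration measure

# Composing n_steps slices integrates over all intermediate positions,
# i.e. sums over all discretized paths.
M = np.linalg.matrix_power(A, n_steps)
Z = np.trace(M)                        # partition function Z(beta)
E0 = -np.log(Z) / (n_steps * eps)      # Z ~ exp(-beta * E0) for large beta
print(f"estimated ground-state energy: {E0:.4f}")  # close to 0.5
```

The matrix power implements the convolution of short-time kernels; discretization errors are O(eps²) from the Trotter split plus grid effects, both small for these parameter choices.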
Functional integrals generalize to fields, yielding generating functionals for correlation functions central to Quantum field theory and to the perturbative expansion organized by Feynman diagrams. Applications include Quantum electrodynamics, Electroweak theory, and Quantum chromodynamics, which were shaped by contributions from Gerard 't Hooft, Frank Wilczek, David Gross, and Steven Weinberg. Nonperturbative techniques include instanton calculus, introduced by Alexander Belavin, Alexander Polyakov, and collaborators and developed for gauge theories by Gerard 't Hooft, as well as soliton and monopole studies linked to the Bogomolny bound and to Paul Dirac's magnetic monopole notion. Path integrals on curved spacetimes bridge to General relativity research by figures like Stephen Hawking and Roger Penrose and to modern approaches in String theory developed by Edward Witten, Juan Maldacena, and Michael Green.
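For a scalar field φ, the generating functional takes the standard form

```latex
Z[J] \;=\; \int \mathcal{D}\phi\,
  \exp\!\left(\frac{i}{\hbar}\int d^4x\,
    \Big[\mathcal{L}(\phi, \partial_\mu \phi) + J(x)\,\phi(x)\Big]\right),
```

and time-ordered correlation functions follow by functional differentiation with respect to the source,

```latex
\langle 0|\, T\, \phi(x_1)\cdots\phi(x_n)\, |0\rangle
  \;=\; \left.\frac{(-i\hbar)^n}{Z[0]}\,
  \frac{\delta^n Z[J]}{\delta J(x_1)\cdots \delta J(x_n)}\right|_{J=0},
```

which is the starting point for the perturbative expansion in Feynman diagrams.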
Practical evaluation uses perturbation theory with Feynman rules, diagrammatic resummations, and renormalization techniques from the work of Julian Schwinger and Sin-Itiro Tomonaga. Lattice discretizations following Kenneth Wilson enable numerical Monte Carlo simulations applied to Lattice gauge theory problems in Quantum chromodynamics and to condensed matter models studied by Mohit Randeria and Nandini Trivedi. Semiclassical methods exploit instanton and soliton saddle points, as in work by Sidney Coleman, while stochastic quantization and worldline methods draw on Parisi–Wu stochastic quantization and on the Faddeev–Popov procedure for handling gauge redundancies introduced by Ludvig Faddeev and Victor Popov. Computational practice also builds on numerical linear algebra techniques associated with Gene H. Golub.
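As an illustrative toy version of the lattice Monte Carlo approach (a sketch, not drawn from any specific lattice code), the one-dimensional Euclidean harmonic oscillator can be sampled with the Metropolis algorithm; in units m = ħ = ω = 1 the expectation ⟨x²⟩ approaches the ground-state value 1/2 at large β, up to small lattice-spacing and statistical corrections. The lattice spacing, chain length, and sweep counts below are illustrative choices.

```python
import numpy as np

# Metropolis sampling of the discretized Euclidean path integral for the
# 1D harmonic oscillator (units m = hbar = omega = 1); the continuum
# ground-state value of <x^2> is 0.5.
rng = np.random.default_rng(12345)
a = 0.2                 # lattice spacing (illustrative)
n_sites = 32            # beta = a * n_sites = 6.4
delta = 0.5             # Metropolis proposal width
x = np.zeros(n_sites)   # cold start

def delta_action(x, i, x_new):
    """Change in the discretized Euclidean action when site i -> x_new."""
    x_left = x[(i - 1) % n_sites]       # periodic boundary conditions
    x_right = x[(i + 1) % n_sites]
    def s(xi):
        # kinetic (nearest-neighbor) + potential pieces touching site i
        return ((x_right - xi)**2 + (xi - x_left)**2) / (2.0 * a) \
               + 0.5 * a * xi**2
    return s(x_new) - s(x[i])

samples = []
for sweep in range(10000):
    for i in range(n_sites):
        x_new = x[i] + rng.uniform(-delta, delta)
        # accept with probability min(1, exp(-Delta S))
        if rng.random() < np.exp(-delta_action(x, i, x_new)):
            x[i] = x_new
    if sweep >= 1000 and sweep % 5 == 0:  # discard burn-in, thin samples
        samples.append(np.mean(x**2))

x2 = float(np.mean(samples))
print(f"<x^2> estimate: {x2:.3f}")  # close to 0.5
```

The same structure, with fields on links and a gauge-invariant action, underlies Lattice gauge theory simulations; only the degrees of freedom and the action change.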
The sum-over-histories perspective raises foundational questions intersecting with debates involving Niels Bohr, Albert Einstein, and John Bell about locality, realism, and measurement, including implications for the EPR paradox and Bell inequalities explored experimentally by teams such as those of Alain Aspect. The formalism offers pathways to decoherence analyses advanced by Wojciech Zurek and to quantum cosmology proposals like the Hartle–Hawking "no-boundary" model influenced by James Hartle and Stephen Hawking. Interpretations range from operational pragmatism used in particle physics communities at institutions like CERN and SLAC National Accelerator Laboratory to research programs connecting to the many-worlds view associated with Hugh Everett III and to relational perspectives considered by Carlo Rovelli.