
Parisi–Wu stochastic quantization

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Name: Parisi–Wu stochastic quantization
Field: Theoretical physics
Introduced: 1981
Introduced by: Giorgio Parisi, Yong-Shi Wu

Parisi–Wu stochastic quantization is a method that formulates quantum field theories by introducing a fictitious stochastic time and evolving classical fields with Langevin-type dynamics toward a stationary measure corresponding to the quantum path integral. The approach, introduced by Giorgio Parisi and Yong-Shi Wu, links ideas from Quantum field theory, Statistical mechanics, Stochastic processes, and functional integration.

Introduction

Parisi–Wu stochastic quantization reinterprets the Feynman path integral in terms of a stochastic evolution akin to the Langevin equation used in studies of Brownian motion, drawing on techniques associated with Erwin Schrödinger, Paul Langevin, Norbert Wiener, Andrey Kolmogorov, and Kiyoshi Itô. The scheme appealed to theorists working on the Renormalization group, the Monte Carlo method, Lattice gauge theory, and Markov chain Monte Carlo because it suggested alternative algorithms for sampling quantum measures, and it inspired connections to work by Kenneth Wilson, Michael Fisher, and Jean Zinn-Justin.

Mathematical formulation

The core formulation introduces a fictitious time parameter tau and evolves a field phi(x,tau) with a stochastic differential equation driven by Gaussian white noise, paralleling constructions of Kiyoshi Itô and Paul Lévy and relating to formal manipulations in Functional integration and the Gaussian measures studied by Norbert Wiener and Israel Gelfand. The Langevin equation contains a deterministic drift term given by the variation of the classical action S[phi] and a noise term whose correlations are fixed to reproduce the Feynman measure in the equilibrium limit, invoking mathematical tools developed by Eugene Wigner, Marcel Riesz, and Laurent Schwartz. Correlation functions are obtained as long-time limits of stochastic averages and can be shown to coincide, order by order, with the perturbative expansions familiar from the work of Richard Feynman, Julian Schwinger, and Sin-Itiro Tomonaga and with developments in Perturbation theory associated with Gerard 't Hooft and Martinus Veltman.
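A minimal sketch of the defining equations, in the Euclidean conventions standard for this construction (the notation here is assumed for illustration, not quoted from the original paper): the field evolves in the fictitious time tau according to

\[
\frac{\partial \phi(x,\tau)}{\partial \tau}
  = -\left.\frac{\delta S[\phi]}{\delta \phi(x)}\right|_{\phi = \phi(\cdot,\tau)}
  + \eta(x,\tau),
\qquad
\langle \eta(x,\tau)\,\eta(x',\tau') \rangle
  = 2\,\delta^{(d)}(x - x')\,\delta(\tau - \tau'),
\]

and equal-time stochastic averages converge, as tau goes to infinity, to Euclidean path-integral expectation values:

\[
\lim_{\tau\to\infty}
  \langle \phi(x_1,\tau)\cdots\phi(x_n,\tau) \rangle_{\eta}
  = \frac{\int \mathcal{D}\phi\; \phi(x_1)\cdots\phi(x_n)\, e^{-S[\phi]}}
         {\int \mathcal{D}\phi\; e^{-S[\phi]}}.
\]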

Applications in quantum field theory

Stochastic quantization has been applied to scalar field theories such as the phi^4 models historically studied by Kenneth Wilson and Leo Kadanoff, and to fermionic fields via approaches that build on constructions by Paul Dirac, Enrico Fermi, and Richard Feynman, with lattice implementations inspired by John von Neumann. It has been used to study nonperturbative aspects of Quantum chromodynamics, with connections to the Lattice gauge theory program of Kenneth Wilson and Michael Creutz and to algorithms in Numerical analysis and Computational physics descended from the Metropolis algorithm of Nicholas Metropolis, A. W. Rosenbluth, and collaborators. In topological field theories and models related to Supersymmetry, practitioners drew parallels with constructions by Alexander Polyakov and Nathan Seiberg and with Edward Witten's work on topological invariants.
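As a concrete illustration of how such simulations work in practice, the following minimal Python sketch evolves a one-dimensional lattice phi^4 field with an Euler–Maruyama discretization of the Langevin equation; the lattice size, couplings, step size, and measurement schedule are illustrative assumptions, not parameters drawn from any particular study.

import numpy as np

rng = np.random.default_rng(0)

N, m2, lam = 64, 1.0, 0.5               # lattice sites, mass^2, quartic coupling (illustrative)
dtau, n_steps, n_burn = 1e-3, 200_000, 50_000

phi = np.zeros(N)
samples = []

def drift(phi):
    # -dS/dphi for S = sum_x [ (phi_{x+1} - phi_x)^2 / 2 + m2 phi_x^2 / 2 + lam phi_x^4 / 4 ]
    lap = np.roll(phi, 1) - 2.0 * phi + np.roll(phi, -1)   # periodic lattice Laplacian
    return lap - m2 * phi - lam * phi**3

for step in range(n_steps):
    # Euler-Maruyama step: dphi = drift * dtau + sqrt(2 dtau) * Gaussian noise
    phi += drift(phi) * dtau + np.sqrt(2.0 * dtau) * rng.standard_normal(N)
    if step >= n_burn and step % 100 == 0:
        samples.append(np.mean(phi**2))                    # measure <phi^2> after equilibration

print("stochastic-time estimate of <phi^2> per site:", np.mean(samples))

Long-time averages of observables measured along the trajectory then estimate the corresponding Euclidean expectation values, in place of Metropolis-style accept/reject sampling.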

Renormalization and convergence

Studies of renormalization within stochastic quantization tie into the broader history of Renormalization group methods developed by Kenneth Wilson, Leo Kadanoff, and Michael Fisher, while analyses of convergence toward equilibrium exploit probabilistic estimates rooted in the work of Andrey Kolmogorov, Kiyoshi Itô, and Paul Lévy. Perturbative renormalization in the stochastic framework reproduces the counterterms known from the BPHZ renormalization program associated with Wolfhart Zimmermann and the renormalization conditions examined by Gerard 't Hooft and Martinus Veltman, while nonperturbative control has been sought via constructive methods pioneered by Konrad Osterwalder, Robert Schrader, James Glimm, and Arthur Jaffe.
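Such convergence arguments are usually phrased via the Fokker–Planck equation associated with the Langevin dynamics; schematically,

\[
\frac{\partial P[\phi,\tau]}{\partial \tau}
  = \int d^{d}x\;
    \frac{\delta}{\delta \phi(x)}
    \left( \frac{\delta}{\delta \phi(x)} + \frac{\delta S[\phi]}{\delta \phi(x)} \right)
    P[\phi,\tau],
\]

whose stationary solution is P[phi] proportional to e^{-S[phi]}; for actions bounded below with suitable growth, the distribution of the stochastic process relaxes to this equilibrium measure, which is the formal content of the long-time limit.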

Stochastic quantization of gauge theories

Gauge theories require special handling of gauge symmetry and constraints, prompting implementations that echo the gauge-fixing techniques of Ludwig Faddeev and Victor Popov and the BRST methods introduced by Carlo Becchi, Alain Rouet, Raymond Stora, and Igor Tyutin. For nonabelian theories such as Quantum chromodynamics, the stochastic approach has been adapted to preserve gauge invariance via stochastic gauge fixing, with links to the Faddeev–Popov procedure and comparisons with continuum quantization work by Gerard 't Hooft and lattice practitioners including Michael Creutz. Issues of Gribov copies and global gauge ambiguities relate to investigations by Vladimir Gribov and Isadore Singer and to subsequent studies of the topology of gauge orbits by Simon Donaldson and Edward Witten.
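Schematically, stochastic gauge fixing in the form introduced by Daniel Zwanziger (conventions for signs and normalizations vary across the literature) adds to the Langevin drift a term tangent to the gauge orbit:

\[
\frac{\partial A_{\mu}^{a}(x,\tau)}{\partial \tau}
  = -\frac{\delta S}{\delta A_{\mu}^{a}(x)}
  + \frac{1}{\alpha}\, D_{\mu}^{ab}\, \partial_{\nu} A_{\nu}^{b}
  + \eta_{\mu}^{a}(x,\tau),
\]

where D is the covariant derivative and alpha a gauge parameter. Because the extra drift is an infinitesimal gauge transformation, expectation values of gauge-invariant observables are unaffected, while the otherwise free diffusion of the gauge degrees of freedom along the orbit is damped.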

Extensions and variants

Extensions include complex Langevin methods aimed at the sign problems encountered in finite-density Quantum chromodynamics and in real-time dynamics, techniques connected to research by Gert Aarts, Kurt Langfeld, and Daniel Sexty and to developments in Stochastic differential equations associated with Kiyoshi Itô, with numerical schemes inspired by John von Neumann and Alan Turing. Multiscale and colored-noise generalizations draw on stochastic analysis linked to Paul Lévy, while supersymmetric stochastic quantization has been explored in contexts influenced by Edward Witten, Nathan Seiberg, and Alexander Polyakov. Hybrid methods combining stochastic quantization with Monte Carlo and tensor-network ideas reference work by Guifre Vidal, John Preskill, and Steven White.
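As a toy illustration of the complex Langevin idea, the following Python sketch complexifies a single variable for the quadratic "action" S(z) = a z^2 / 2 with complex a, a Gaussian case where the exact answer <z^2> = 1/a is known and convergence is under control; the model and all parameters are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)

a = 1.0 + 1.0j                       # complex coefficient: the weight e^{-S} is not a probability
dtau, n_steps, n_burn = 1e-3, 400_000, 40_000

z = 0.0 + 0.0j                       # the real variable is complexified: z = x + i y
acc = []
for step in range(n_steps):
    eta = rng.standard_normal()      # real Gaussian noise with correlation 2*delta
    z += -a * z * dtau + np.sqrt(2.0 * dtau) * eta   # complexified Langevin step, drift -dS/dz = -a z
    if step >= n_burn:
        acc.append(z * z)

print("complex Langevin <z^2>:", np.mean(acc))
print("exact value 1/a       :", 1.0 / a)

For genuinely interacting complex actions the method can converge to incorrect limits, which is why correctness criteria, such as those studied by Aarts and collaborators, accompany its use.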

Historical development and impact

Introduced by Giorgio Parisi and Yong-Shi Wu in 1981, stochastic quantization influenced subsequent work in Lattice gauge theory, numerical simulation methods championed by Michael Creutz and Kenneth Wilson, and theoretical studies connecting stochastic processes with field theory involving figures such as Edward Witten, Konrad Osterwalder, Raymond Stora, and Gerard 't Hooft. Its cross-disciplinary nature fostered dialogue among communities at institutions including CERN, Princeton University, the Institute for Advanced Study, and Sapienza University of Rome, and it remains a conceptual and computational tool in ongoing research on nonperturbative phenomena, sign-problem mitigation, and algorithmic innovation pursued by researchers at MIT, Harvard University, the University of Cambridge, and Stanford University.

Category:Quantum field theory