| Lattice quantum chromodynamics | |
|---|---|
| *Image: Joel Holdsworth (Joelholdsworth), public domain* | |
| Name | Lattice quantum chromodynamics |
| Field | Quantum chromodynamics, Theoretical physics |
| Developed | 1970s–present |
| Researchers | Kenneth G. Wilson, Michael Creutz, Martin Lüscher, Peter Weisz, Gordon Baym |
**Lattice quantum chromodynamics** (lattice QCD) is a non-perturbative, discretized formulation of quantum chromodynamics (QCD) used to compute strong-interaction phenomena from first principles. By regularizing the quantum field theory gauge-invariantly on a discrete spacetime lattice, it enables quantitative connections between hadron properties and the underlying quark and gluon dynamics. Key developments were led by researchers associated with Cornell University, CERN, Brookhaven National Laboratory, Fermilab, and RIKEN collaborations.
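Schematically, with lattice spacing a and bare coupling g, the gauge field enters through link variables and observables are computed from a Euclidean path integral; the following is a standard textbook form, not tied to any one collaboration's conventions:

```latex
% Gauge links and the Euclidean lattice path integral, lattice spacing a
U_\mu(x) = \exp\!\big(i a g A_\mu(x)\big) \in SU(3), \qquad
\langle \mathcal{O} \rangle
  = \frac{1}{Z} \int \prod_{x,\mu} dU_\mu(x) \prod_x d\bar\psi(x)\, d\psi(x)\;
    \mathcal{O}\; e^{-S[U,\bar\psi,\psi]}
```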
Lattice quantum chromodynamics was pioneered by Kenneth G. Wilson to address confinement in Murray Gell-Mann's quark model and the non-perturbative regime of QCD; early numerical work was carried out by Michael Creutz, and later formal advances were made by Martin Lüscher and Peter Weisz. Lattice results connect to experimental programs at the CERN LHC, Jefferson Lab, and Fermilab, including studies of parton distribution functions, and to astrophysical constraints relevant to neutron-star modeling at the Max Planck Institute for Astrophysics. Lattice methods underpin precision determinations of CKM matrix elements tested by Belle II, BaBar, and LHCb, and guide searches for physics beyond the Standard Model at ATLAS, CMS, and the Muon g−2 experiment.
The formulation places quark fields on lattice sites and represents gluon fields as link variables on the lattice links, following the Wilson action prescription; alternative discretizations include staggered, Wilson (with the clover-term improvement), domain-wall, and overlap fermions, developed to address chiral symmetry and the Nielsen–Ninomiya theorem. Gauge-fixing choices relate to procedures from Faddeev–Popov derivations, while renormalization is handled via Symanzik improvement and perturbative matching to schemes such as MS-bar. Topological issues engage instanton physics and theta-vacuum studies linked to axion searches; anomalies connect to the Atiyah–Singer index theorem and to the axial currents relevant to Adler–Bell–Jackiw anomaly analyses.
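For orientation, a common form of the Wilson gauge action, built from the elementary plaquette (normalization conventions vary between references):

```latex
% Plaquette and Wilson gauge action; beta = 6/g^2 for SU(3)
P_{\mu\nu}(x) = U_\mu(x)\, U_\nu(x+\hat\mu)\, U_\mu^\dagger(x+\hat\nu)\, U_\nu^\dagger(x),
\qquad
S_W[U] = \beta \sum_{x} \sum_{\mu<\nu}
  \left[ 1 - \tfrac{1}{3}\,\mathrm{Re}\,\mathrm{Tr}\, P_{\mu\nu}(x) \right]
```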
Monte Carlo importance sampling, via the Metropolis–Hastings algorithm and Hybrid Monte Carlo developed in part by collaborations at Brookhaven National Laboratory and the RIKEN BNL Research Center, is central; linear solvers such as the conjugate gradient method, BiCGStab, multigrid methods, and deflation techniques have been advanced together with teams at IBM Research, Intel, NVIDIA, and Fujitsu. Smearing schemes such as APE and stout smearing reduce ultraviolet noise, and spectral techniques based on the Lanczos algorithm extract hadron spectra. Error estimation relies on bootstrap and jackknife methods, also used at SLAC National Accelerator Laboratory and DESY; algorithmic acceleration employs GPU clusters at the National Energy Research Scientific Computing Center and the Oak Ridge Leadership Computing Facility.
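As an illustration of the Metropolis step, the sketch below updates a two-dimensional compact U(1) toy theory rather than full SU(3) lattice QCD; all parameters and names are illustrative and not taken from any production code:

```python
import numpy as np

# Metropolis sketch for 2D compact U(1) gauge theory (a toy stand-in for
# SU(3)): links are angles theta[mu, x, y]; each plaquette contributes
# beta * (1 - cos(theta_P)) to the action. Parameters are illustrative.
rng = np.random.default_rng(0)
L, beta, eps = 8, 2.0, 0.5          # lattice size, coupling, proposal width
theta = np.zeros((2, L, L))         # link angles, mu = 0 (x-dir) or 1 (y-dir)

def plaquette(th, x, y):
    """Plaquette angle at (x, y): theta_x(x,y) + theta_y(x+1,y) - theta_x(x,y+1) - theta_y(x,y)."""
    return (th[0, x, y] + th[1, (x + 1) % L, y]
            - th[0, x, (y + 1) % L] - th[1, x, y])

def local_action(th, mu, x, y):
    """Sum of the actions of the two plaquettes containing link (mu, x, y)."""
    if mu == 0:   # an x-link enters the plaquettes at (x, y) and (x, y-1)
        plaqs = [plaquette(th, x, y), plaquette(th, x, (y - 1) % L)]
    else:         # a y-link enters the plaquettes at (x, y) and (x-1, y)
        plaqs = [plaquette(th, x, y), plaquette(th, (x - 1) % L, y)]
    return beta * sum(1.0 - np.cos(p) for p in plaqs)

def sweep(th):
    """One Metropolis sweep: shift each link, accept with prob min(1, exp(-dS))."""
    for mu in range(2):
        for x in range(L):
            for y in range(L):
                old = th[mu, x, y]
                s_old = local_action(th, mu, x, y)
                th[mu, x, y] = old + eps * rng.uniform(-1, 1)
                d_s = local_action(th, mu, x, y) - s_old
                if rng.random() >= np.exp(-d_s):
                    th[mu, x, y] = old    # reject: restore the old link

for _ in range(200):
    sweep(theta)
avg = np.mean([np.cos(plaquette(theta, x, y)) for x in range(L) for y in range(L)])
print(f"<cos theta_P> ~ {avg:.3f}")       # approaches 1 as beta grows
```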
Lattice computations yield hadron masses in agreement with Particle Data Group averages and determine quantities such as the pion decay constant f_π, the kaon bag parameter B_K, and nucleon structure observables underpinning analyses by J-PARC, COMPASS, and HERMES. Precision inputs for flavor physics constrain CKM matrix elements alongside results from Belle, BaBar, and LHCb and inform global fits by groups such as CKMfitter and UTfit. Thermodynamic studies map the QCD phase diagram relevant to RHIC and ALICE investigations of the quark–gluon plasma and to critical-point searches in Beam Energy Scan programs. Astrophysical inputs inform equations of state used by NICER and LIGO in neutron-star merger interpretations; electroweak matrix elements aid dark-matter and axion phenomenology reviewed by the Particle Data Group.
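As an example, the pion decay constant is defined through an axial-current matrix element; conventions differ by a factor of √2 between references, so the numerical value quoted depends on the normalization chosen:

```latex
% Pion decay constant from the axial current A_mu = \bar{d}\gamma_\mu\gamma_5 u
\langle 0 \,|\, A_\mu(0) \,|\, \pi^+(p) \rangle = i f_\pi\, p_\mu
```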
Systematic uncertainties include finite-lattice-spacing effects, addressed by Symanzik improvement and continuum extrapolations as used by the MILC, RBC-UKQCD, and HPQCD collaborations. Finite-volume corrections are treated with Lüscher's formalism, which relates finite-volume energy levels to scattering phase shifts for resonances. Chiral extrapolations rely on chiral perturbation theory, as employed by groups at the Institute for Nuclear Theory and the Pisa theory group, while signal-to-noise degradation limits nucleon and multi-baryon observables studied by the NPLQCD and HAL QCD collaborations. Algorithmic critical slowing down and topological freezing at fine lattice spacings motivate topology-fixing strategies explored at EPFL and the University of Edinburgh.
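A minimal sketch of a continuum extrapolation, assuming leading O(a²) discretization errors; the lattice spacings, central values, and errors below are invented placeholders:

```python
import numpy as np

# Continuum extrapolation sketch: fit O(a) = O_cont + c * a^2, the leading
# ansatz for a Symanzik-improved action, to results at several spacings.
a = np.array([0.12, 0.09, 0.06])        # lattice spacings in fm (hypothetical)
obs = np.array([0.712, 0.698, 0.691])   # observable at each spacing (hypothetical)
err = np.array([0.004, 0.003, 0.003])   # statistical errors (hypothetical)

# Weighted least squares in the basis {1, a^2}
A = np.column_stack([np.ones_like(a), a**2])
W = np.diag(1.0 / err**2)
cov = np.linalg.inv(A.T @ W @ A)        # parameter covariance matrix
coef = cov @ (A.T @ W @ obs)            # [O_cont, c]
print(f"continuum value = {coef[0]:.4f} +/- {np.sqrt(cov[0, 0]):.4f}")
```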
Large-scale computations run on leadership-class systems at Oak Ridge National Laboratory and Argonne National Laboratory, at NSCC, on Fugaku at RIKEN, and on cloud platforms from Google and Amazon Web Services for opportunistic cycles. Community software includes Chroma, CPS, QUDA, tmLQCD, and the MILC code, along with frameworks developed within the USQCD and SciDAC programs. Data management and workflow tools parallel efforts such as CERN Open Data and the DIRAC distributed computing framework, with interoperability fostered through standards of the International Lattice Data Grid.
Category:Lattice field theory