LLMpedia: The first transparent, open encyclopedia generated by LLMs

Lattice gauge theory

Generated by DeepSeek V3.2
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Field: Quantum field theory, Particle physics, Condensed matter physics
Related: Quantum chromodynamics, Monte Carlo method, Wilson loop

Lattice gauge theory is a non-perturbative framework for studying quantum field theory, particularly Yang–Mills theory, by discretizing spacetime onto a finite lattice. It provides a mathematically well-defined regularization scheme, allowing first-principles calculations in strongly interacting theories such as quantum chromodynamics. The approach was pioneered in the 1970s by Kenneth G. Wilson, who introduced the Wilson action. This formulation has become the primary tool for numerical computation of hadron masses and the properties of the quark–gluon plasma from the underlying Standard Model.

Introduction

The development of this framework was driven by the need to understand the non-perturbative regime of quantum chromodynamics, where the strength of the strong interaction invalidates standard Feynman-diagram expansions. By replacing continuous spacetime with a discrete grid of points, the path-integral formulation of a gauge theory becomes a finite-dimensional statistical system. This discretization, formalized by Kenneth G. Wilson, tames the ultraviolet divergences inherent in continuum quantum field theory. The approach connects fundamental particle physics to techniques in statistical mechanics, enabling the study of confinement and chiral symmetry breaking through numerical simulation.

Formulation

The core idea replaces the continuum gauge field with group elements, known as link variables, assigned to the edges between neighboring lattice sites. The dynamics are governed by a discretized action, most commonly the Wilson action, which is constructed from the smallest closed loops on the lattice, called plaquettes. This formulation exactly preserves local gauge invariance at the discrete level. Fermionic fields, representing quarks, are placed on the lattice sites, but their treatment introduces technical challenges like fermion doubling. The continuum limit is recovered by taking the lattice spacing to zero while tuning parameters according to the renormalization group.
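The construction above can be illustrated in the simplest compact setting: a U(1) gauge theory on a two-dimensional periodic lattice, where each link variable is a phase e^{iθ} and the Wilson action reduces to a sum of cosines of plaquette angles. The following is a minimal sketch, not taken from any production code; the lattice size, coupling, and all names are arbitrary choices for the example.

```python
import numpy as np

# Illustrative sketch: compact U(1) gauge theory on a 2D periodic lattice.
# Each link carries a phase theta[mu, x, y]; the group element is
# U = exp(i * theta). Lattice size and coupling are arbitrary choices.
L = 8          # lattice extent in each direction
beta = 2.0     # inverse coupling
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, size=(2, L, L))  # mu=0: x-links, mu=1: y-links

def plaquette_angle(th, x, y):
    """Oriented sum of link angles around the elementary plaquette at (x, y),
    the U(1) analogue of U_mu(x) U_nu(x+mu) U_mu(x+nu)^-1 U_nu(x)^-1."""
    return (th[0, x, y]
            + th[1, (x + 1) % L, y]
            - th[0, x, (y + 1) % L]
            - th[1, x, y])

def wilson_action(th):
    """Wilson action S = beta * sum over plaquettes of (1 - cos(theta_p)).
    Each plaquette angle is unchanged by local gauge rotations of the site
    phases, so S is exactly gauge invariant at finite lattice spacing."""
    return beta * sum(1.0 - np.cos(plaquette_angle(th, x, y))
                      for x in range(L) for y in range(L))
```

In the classical vacuum (all link angles zero) every plaquette is trivial and the action vanishes, while a random configuration gives an action of order beta times the number of plaquettes.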

Computational methods

Numerical evaluation of the discretized path integral is performed almost exclusively with Markov chain Monte Carlo methods, such as the Metropolis–Hastings algorithm or the Hybrid Monte Carlo algorithm. These algorithms generate an ensemble of gauge-field configurations distributed according to the Boltzmann weight e^{−S} defined by the lattice action. Physical observables, such as the pion mass or the proton mass, are then calculated as statistical averages over these configurations. Major international collaborations, including the MILC Collaboration and the RBC–UKQCD collaboration, run these computationally intensive simulations on supercomputers at facilities such as the Jülich Supercomputing Centre and the Texas Advanced Computing Center.
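As a minimal illustration of this workflow, a Metropolis simulation of compact U(1) gauge theory on a small two-dimensional lattice fits in a few dozen lines: configurations are generated with weight e^{−S} and the average plaquette is measured on them. This is a toy sketch only; the lattice size and parameters are arbitrary, and a production code would update links using the local "staple" rather than recomputing the full action at every step.

```python
import numpy as np

# Illustrative sketch: Metropolis sampling of 2D compact U(1) gauge theory.
rng = np.random.default_rng(1)
L, beta = 6, 1.0
theta = rng.uniform(0.0, 2.0 * np.pi, size=(2, L, L))  # link angles

def plaquette_angle(th, x, y):
    return (th[0, x, y] + th[1, (x + 1) % L, y]
            - th[0, x, (y + 1) % L] - th[1, x, y])

def action(th):
    # Wilson action: beta * sum over plaquettes of (1 - cos(theta_p))
    return beta * sum(1.0 - np.cos(plaquette_angle(th, x, y))
                      for x in range(L) for y in range(L))

def metropolis_sweep(th, eps=0.7):
    """Propose theta -> theta + delta on every link in turn; accept with
    probability min(1, exp(-dS)), so configurations follow exp(-S)."""
    for mu in range(2):
        for x in range(L):
            for y in range(L):
                old = th[mu, x, y]
                s_old = action(th)
                th[mu, x, y] = (old + rng.uniform(-eps, eps)) % (2.0 * np.pi)
                if rng.random() >= np.exp(min(0.0, s_old - action(th))):
                    th[mu, x, y] = old  # reject: restore the old link

# Thermalize, then measure the average plaquette <cos theta_p>.
for _ in range(30):
    metropolis_sweep(theta)
samples = []
for _ in range(30):
    metropolis_sweep(theta)
    samples.append(np.mean([np.cos(plaquette_angle(theta, x, y))
                            for x in range(L) for y in range(L)]))
avg_plaquette = float(np.mean(samples))
```

In this two-dimensional model the plaquettes effectively decouple and the average plaquette is known exactly to be I1(β)/I0(β) ≈ 0.446 at β = 1, which a run like this should approach within statistical errors.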

Physical results and applications

The most significant achievement has been the ab initio determination of the hadron spectrum from quantum chromodynamics, providing crucial validation of the Standard Model. Calculations have precisely determined parameters like the strong coupling constant and fundamental quantities such as the nucleon mass. The framework has been essential for studying the properties of the quark–gluon plasma and the QCD phase diagram, with relevance to experiments at the Relativistic Heavy Ion Collider and the Large Hadron Collider. It also provides theoretical input for precision tests in flavor physics, such as kaon mixing parameters needed by experiments like LHCb and Belle II.
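The statement that hadron masses are obtained as statistical averages can be made concrete: in lattice units, a hadron's mass appears as the exponential decay rate of a Euclidean two-point correlation function, C(t) ≈ A e^{−mt} once the lightest state dominates. The sketch below uses synthetic, noise-free data (not real lattice measurements; the mass and amplitude are invented for illustration) to show the standard "effective mass" construction.

```python
import numpy as np

# Illustrative sketch with synthetic data: a two-point correlator of a
# hadron interpolating operator falls off as C(t) ~ A * exp(-m * t)
# at large Euclidean time t (all quantities in lattice units).
t = np.arange(1, 12)
m_true, A = 0.8, 2.5          # invented "mass" and amplitude
C = A * np.exp(-m_true * t)   # idealized, noise-free correlator

# Effective mass: m_eff(t) = log(C(t) / C(t+1)). A plateau in t signals
# single-state dominance; here the data are exact, so it is flat at m_true.
m_eff = np.log(C[:-1] / C[1:])
```

On real ensembles C(t) carries statistical noise and excited-state contamination at small t, so the mass is instead read off from a fit to the plateau region of m_eff.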

Extensions include formulations with improved actions, such as the Symanzik improvement program, to reduce discretization errors. The study of theories at finite temperature and density is crucial for understanding neutron star interiors and the early universe, explored via the imaginary time formalism. Related Hamiltonian formulations, like quantum link models, connect to efforts in quantum simulation using ultracold atoms in optical lattices. Furthermore, the framework provides a testing ground for ideas in quantum gravity through the study of lower-dimensional models and connections to spin foam approaches in loop quantum gravity.

Category:Quantum field theory Category:Computational physics Category:Particle physics