| Dokshitzer–Gribov–Lipatov–Altarelli–Parisi | |
|---|---|
| Name | Dokshitzer–Gribov–Lipatov–Altarelli–Parisi |
| Fields | Particle physics, Quantum field theory |
| Notable works | DGLAP equations |
Dokshitzer–Gribov–Lipatov–Altarelli–Parisi is the collective designation for the set of evolution equations developed in the 1970s that govern the scale dependence of parton distribution functions in quantum chromodynamics, and which became a cornerstone of perturbative particle physics. The framework links theoretical results from quantum field theory to experimental programs at facilities such as the SLAC National Accelerator Laboratory and the Large Hadron Collider by providing calculable kernels for parton splitting in processes studied at CERN, Fermilab, and DESY. It unites contributions from researchers associated with institutions such as the Lebedev Physical Institute, the CERN Theory Division, and the Sapienza University of Rome, and has influenced formalisms used in analyses by collaborations including ATLAS, CMS, H1, and ZEUS.
The equations emerged amid parallel developments by theorists in Leningrad and Rome responding to experimental anomalies reported by groups at the Stanford Linear Accelerator Center and to proposals by researchers at CERN and Brookhaven National Laboratory. Early conceptual antecedents include Richard Feynman's parton model, Kenneth G. Wilson's operator product expansion, and formal renormalization analyses by Gerard 't Hooft and Murray Gell-Mann, while technical maturation drew on Landau-school perturbation theory, the Callan–Symanzik equation developed by Curtis Callan and Kurt Symanzik, and anomalous-dimension techniques advanced by Alexander Polyakov. The derivations combined insights from asymptotic freedom, discovered by David Gross, Frank Wilczek, and H. David Politzer, with perturbative computations carried out in gauges used in Steven Weinberg's formulations, and they were disseminated through seminars at institutions such as the Institute for Advanced Study and publications in journals associated with the American Physical Society and Elsevier.
The DGLAP formalism expresses scale evolution through integro-differential equations that relate parton distribution functions measured in experiments at the Thomas Jefferson National Accelerator Facility and DESY to kernels calculable in perturbation theory, obtained by matching calculations performed in renormalization schemes such as minimal subtraction (MS) and modified minimal subtraction (MS-bar) used by theorists at CERN and Brookhaven National Laboratory. The structure relies on factorization theorems proved in analyses by John Collins and Davison Soper, and on renormalization-group methods in the tradition of Kenneth G. Wilson, enabling reliable extrapolations between the energy scales probed at LEP and the Tevatron. Operator definitions employ twist-expansion techniques related to work by Ian Balitsky and moment-space methods inspired by Mikhail Shifman and Valentin Zakharov.
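In the standard textbook form (stated here as the generic leading-order structure, not tied to any particular analysis named above), the evolution of a parton density $f_i(x,\mu^2)$ reads

$$
\mu^2 \frac{\partial f_i(x,\mu^2)}{\partial \mu^2}
  = \frac{\alpha_s(\mu^2)}{2\pi}
    \sum_j \int_x^1 \frac{dz}{z}\, P_{ij}(z)\,
    f_j\!\left(\frac{x}{z},\mu^2\right),
$$

where the sum runs over quarks, antiquarks, and the gluon, and the splitting functions $P_{ij}(z)$ give the probability density for parton $j$ to emit parton $i$ carrying momentum fraction $z$.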
Splitting functions are central quantities, computed order by order using techniques developed by Gabriele Veneziano and Miguel Virasoro and, later, loop-integration strategies by Zvi Bern and David Kosower, with state-of-the-art higher-order results driven by collaborations at DESY, CERN, Brookhaven, and computational groups at SLAC. Calculations of the leading-order, next-to-leading-order, and next-to-next-to-leading-order splitting functions built on Richard Feynman's diagrammatic methods and on dimensional regularization, introduced by Gerard 't Hooft and Martinus Veltman, and required symbolic algebra systems developed at Princeton University and the Massachusetts Institute of Technology. The results are used in global fits by groups such as CTEQ, MSTW, and NNPDF, and they interface with the parton-shower algorithms behind PYTHIA, HERWIG, and Sherpa.
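The leading-order splitting functions have compact closed forms. The sketch below is a minimal illustration (not code from any of the packages named above) of their regular parts; the plus-prescription and delta-function endpoint terms required in the full P_qq and P_gg are deliberately omitted, as they only matter at x = 1:

```python
# Leading-order QCD splitting functions (regular parts only, x < 1).
# Endpoint (plus-prescription and delta-function) terms of P_qq and
# P_gg are omitted; they are needed for the full convolution but do
# not affect the value of the kernel at interior points.

CF = 4.0 / 3.0   # SU(3) Casimir of the fundamental representation
CA = 3.0         # SU(3) Casimir of the adjoint representation
TR = 0.5         # trace normalization per quark flavor

def P_qq(x):
    """q -> q g: regular part of C_F (1 + x^2) / (1 - x)_+."""
    return CF * (1.0 + x * x) / (1.0 - x)

def P_qg(x):
    """g -> q qbar: T_R (x^2 + (1 - x)^2)."""
    return TR * (x * x + (1.0 - x) ** 2)

def P_gq(x):
    """q -> g q: C_F (1 + (1 - x)^2) / x."""
    return CF * (1.0 + (1.0 - x) ** 2) / x

def P_gg(x):
    """g -> g g: regular part of the full P_gg."""
    return 2.0 * CA * (x / (1.0 - x) + (1.0 - x) / x + x * (1.0 - x))
```

Note the symmetry P_qg(x) = P_qg(1 - x): the quark and antiquark from gluon splitting are produced with symmetric momentum fractions.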
The equations underpin the extraction of parton distribution functions used by experiments at ATLAS, CMS, LHCb, and ALICE, and by neutrino programs at IceCube and Super-Kamiokande, for interpreting deep inelastic scattering measured at HERA and in fixed-target experiments at the CERN SPS. They provide the theoretical bridge to predictions for cross sections in processes such as the Drell–Yan process, Higgs boson production studied at CERN, and jet production investigated by the Tevatron experiments, and they are integral to the Monte Carlo event generators employed by collaborations such as CDF and DØ. Precision electroweak fits produced by groups at LEP and SLAC incorporate DGLAP-evolved inputs when confronting data from Belle and BaBar.
Phenomenological tests combine global fits by consortia such as CTEQ, MSTW, and NNPDF with measurements from the HERA collaborations H1 and ZEUS and from ATLAS and CMS inclusive-jet and structure-function studies, enabling stringent validation of perturbative calculations up to the scales reachable at the Large Hadron Collider and beyond. Discrepancies between data and evolution predictions have motivated investigations at Brookhaven National Laboratory and SLAC into higher-twist corrections, small-x dynamics explored by groups at DESY, and heavy-flavor treatments developed by theorists affiliated with CERN and the University of Cambridge. Precision determinations of the strong coupling constant compiled by the Particle Data Group rely on DGLAP-based analyses together with lattice results from groups at CERN and Brookhaven.
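Such determinations rest on the renormalization-group running of the coupling. A minimal one-loop sketch (the standard textbook formula, with an assumed input value α_s(M_Z) = 0.118 and n_f = 5 flavors) is:

```python
import math

def alpha_s_one_loop(Q, alpha_mz=0.118, mz=91.1876, nf=5):
    """One-loop running strong coupling, evolved from the Z mass.

    alpha_s(Q^2) = alpha_s(M_Z^2) / (1 + b0 alpha_s(M_Z^2) ln(Q^2/M_Z^2)),
    with b0 = (33 - 2 nf) / (12 pi).  Input value and flavor number are
    illustrative assumptions.
    """
    b0 = (33.0 - 2.0 * nf) / (12.0 * math.pi)
    return alpha_mz / (1.0 + b0 * alpha_mz * math.log(Q * Q / (mz * mz)))

# Asymptotic freedom: the coupling weakens as the scale grows.
print(alpha_s_one_loop(10.0), alpha_s_one_loop(1000.0))
```

A realistic analysis uses higher-loop running and flavor-threshold matching, but the qualitative behavior, a logarithmically decreasing coupling, is already captured at one loop.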
Extensions include small-x resummation, leading to the Balitsky–Fadin–Kuraev–Lipatov (BFKL) approach developed by Lipatov and collaborators, and soft-gluon resummation techniques advanced by George Sterman, with applications in threshold studies by Collins and Soper, while matching with the transverse-momentum-dependent formalisms used by John Collins and Ted Rogers connects to Xiangdong Ji's work on factorization. Modern developments integrate effective-field-theory methods from Iain Stewart and toolkits developed at MIT with resummation packages maintained by collaborations at CERN and DESY, enabling precision phenomenology for future programs at the High-Luminosity Large Hadron Collider and at proposed projects such as the CERN Future Circular Collider and the Electron–Ion Collider.