| Parton distribution functions | |
|---|---|
| Name | Parton distribution functions |
| Caption | Schematic of proton internal structure used in high-energy collisions |
| Field | Particle physics |
| Introduced | 1969 |
| Notable people | Richard Feynman, Murray Gell‑Mann, Guido Altarelli, Giorgio Parisi, David Gross |
| Institutions | CERN, Fermilab, DESY, SLAC |
Parton distribution functions (PDFs) describe the probability densities for finding quarks and gluons carrying a given fraction of a fast-moving hadron's longitudinal momentum. They provide the nonperturbative input that links perturbative calculations in quantum chromodynamics (QCD) to observables measured at colliders such as the Large Hadron Collider and at experiments at Fermilab and DESY. PDFs are extracted from global analyses that combine data on deep inelastic scattering, Drell–Yan processes, and jet production from facilities and experiments such as HERA, ATLAS, CMS, and Jefferson Lab.
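The link between PDFs and hard-scattering calculations is conventionally expressed by the collinear factorization formula, written here schematically; $a,b$ run over parton flavors, and $\mu_F$, $\mu_R$ are the factorization and renormalization scales:

```latex
\sigma_{pp \to X} \;=\; \sum_{a,b} \int_0^1 \mathrm{d}x_1\,\mathrm{d}x_2\;
  f_a(x_1,\mu_F^2)\, f_b(x_2,\mu_F^2)\;
  \hat{\sigma}_{ab \to X}(x_1,x_2,\mu_F^2,\mu_R^2)
```

Here $f_a(x,\mu_F^2)$ is the PDF for parton $a$ at momentum fraction $x$, and $\hat{\sigma}_{ab \to X}$ is the perturbatively calculable partonic cross section.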
Parton distribution functions originate in the parton model proposed by Richard Feynman and were later placed on a firm footing within quantum chromodynamics through contributions by Murray Gell‑Mann, David Gross, and others; they are central to predictions for processes studied at CERN and Fermilab. PDFs encode the nonperturbative structure of the proton probed in experiments at the LHC and SLAC, and they are indispensable inputs to the global analyses performed by groups such as CTEQ, NNPDF, and MSTW. The concept unifies results from fixed‑target experiments, such as those at the CERN SPS, with modern collider measurements from ATLAS, CMS, and LHCb.
The theoretical framework for PDFs is provided by quantum chromodynamics and by the factorization theorems proved in perturbative QCD by Collins, Soper, and Sterman, which connect hard-scattering amplitudes to hadronic cross sections measured by experiments at HERA and Jefferson Lab. The scale dependence of PDFs is governed by the Dokshitzer–Gribov–Lipatov–Altarelli–Parisi (DGLAP) evolution equations, developed by Dokshitzer, Gribov, Lipatov, Altarelli, and Parisi, which enable comparisons across the energy scales relevant to RHIC, the Tevatron, and the LHC. Operator product expansion methods pioneered by Wilson and the renormalization group techniques developed by 't Hooft and Veltman underpin this theoretical control, while soft‑collinear effective theory, applied by Bauer and Stewart, refines factorization for processes studied at BNL and KEK.
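The DGLAP evolution equations take the schematic form below, where $P_{ij}(z)$ are the splitting functions computed in perturbation theory and $\alpha_s$ is the strong coupling:

```latex
\mu^2 \frac{\partial f_i(x,\mu^2)}{\partial \mu^2}
  \;=\; \frac{\alpha_s(\mu^2)}{2\pi} \sum_j \int_x^1 \frac{\mathrm{d}z}{z}\;
  P_{ij}(z)\, f_j\!\left(\frac{x}{z},\mu^2\right)
```

Given PDFs measured at one scale $\mu_0$, these equations predict them at any other perturbative scale $\mu$, which is what allows fixed-target and collider data to be combined in a single fit.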
Experimental determination of PDFs relies on a wide corpus of measurements: inclusive deep inelastic scattering at SLAC, by the EMC, and at HERA; Drell–Yan lepton-pair production studied at Fermilab and CERN; electroweak boson production measured by ATLAS and CMS; and jet measurements from CDF and DØ at the Tevatron. Global fitting collaborations including CTEQ-TEA, NNPDF, JAM, and MMHT combine data from HERA, Jefferson Lab, and RHIC, reconciling them with constraints from experiments such as COMPASS and NA61/SHINE. Neutrino scattering measurements by NuTeV and MINERvA, together with heavy‑flavor tagging at LHCb and ALICE, provide flavor separation and constrain the strange and charm distributions.
Parametrizations of PDFs are provided by global fitting groups such as CTEQ (whose modern releases are labeled CT), NNPDF, MMHT, and HERAPDF, each employing different methodologies and statistical treatments, exemplified by the neural‑network approach of NNPDF and the Hessian methods used by CTEQ and MSTW. Fits incorporate theoretical inputs from Alekhin and collaborators and heavy‑quark scheme treatments due to Thorne, Roberts, and Collins, and are benchmarked against perturbative results computed with tools developed at CERN and DESY. Public interfaces such as LHAPDF make the fitted sets available to experimental collaborations including ATLAS, CMS, and LHCb.
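As a concrete illustration of how such parametrizations work, the sketch below implements a toy valence-quark shape $N x^{\alpha}(1-x)^{\beta}$ and fixes its normalization by the valence number sum rule (two up valence quarks in the proton). The exponents are invented for illustration only and are not values from any published fit.

```python
import math

def beta_fn(a, b):
    """Euler Beta function B(a, b) via Gamma functions."""
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

# Toy valence-up distribution u_v(x) = N * x**alpha * (1 - x)**beta_exp.
# The exponents are hypothetical, chosen only to give a plausible shape.
alpha, beta_exp = -0.5, 3.0

# Number sum rule: the integral of u_v over x in (0, 1) must equal 2.
# Analytically the integral of x**alpha (1-x)**beta_exp is B(alpha+1, beta_exp+1).
N = 2.0 / beta_fn(alpha + 1.0, beta_exp + 1.0)

def u_v(x):
    return N * x**alpha * (1.0 - x)**beta_exp

# Numerical cross-check with the substitution x = t**2, which removes
# the integrable x**(-1/2) singularity at x = 0 (midpoint rule in t).
n = 200_000
h = 1.0 / n
total = sum(2.0 * t * u_v(t * t) * h
            for t in (h * (i + 0.5) for i in range(n)))

print(N)       # normalization constant fixed by the sum rule
print(total)   # numerical integral, should be close to 2.0
```

Real fits impose the same number and momentum sum rules on far more flexible functional forms (polynomials in $\sqrt{x}$, Chebyshev expansions, or neural networks), but the normalization logic is identical.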
PDFs are essential for precision predictions of cross sections for processes such as Higgs boson production studied by ATLAS and CMS, top‑quark pair production measured by CDF and DØ, and electroweak precision observables constrained by LEP and SLD. They enter searches for physics beyond the Standard Model at the LHC and the interpretation of results from IceCube and Auger. PDFs also affect heavy‑ion programs at RHIC and ALICE and are inputs to Monte Carlo event generators such as Pythia, Herwig, and Sherpa, which are used by experiments at CERN and SLAC.
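A minimal sketch of why PDFs control such predictions is the parton luminosity: the convolution of two PDFs at fixed $\tau = M^2/s$, which multiplies the partonic cross section for producing a system of mass $M$. The gluon-like shape below is invented for illustration and is not a fitted PDF.

```python
import math

# Toy gluon-like PDF, g(x) = A * (1 - x)**5 / x  (illustrative shape only,
# not taken from any real PDF set).
A = 1.0
def g(x):
    return A * (1.0 - x) ** 5 / x

def parton_luminosity(tau, n=100_000):
    """dL/dtau = integral over x in (tau, 1) of g(x) * g(tau/x) / x,
    evaluated with a midpoint rule in log(x) to handle the 1/x growth."""
    lo, hi = math.log(tau), 0.0
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = math.exp(lo + (i + 0.5) * h)
        total += g(x) * g(tau / x) / x * (x * h)  # dx = x d(log x)
    return total

# The luminosity falls steeply as tau = M^2/s grows, which is why cross
# sections drop so rapidly with the mass of the produced system.
print(parton_luminosity(1e-4) > parton_luminosity(1e-2) > parton_luminosity(1e-1))
```

In a real prediction the single toy shape would be replaced by flavor-summed PDF pairs from a released set, but the steep $\tau$ dependence shown here is generic.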
Quantifying PDF uncertainties is crucial for results from ATLAS, CMS, and LHCb; standard methods include the Hessian eigenvector sets advocated by the CTEQ and MSTW groups and the Monte Carlo replica technique used by NNPDF. Higher‑order QCD corrections computed at next‑to‑leading order and next‑to‑next‑to‑leading order reduce the theory uncertainty for processes measured at the Tevatron and the LHC; resummation techniques developed by Sterman and Catani, together with matching approaches such as POWHEG and MC@NLO, are routinely applied by collaborations such as CMS. Electroweak corrections relevant for precision measurements at LEP and at future colliders are also incorporated into global analyses.
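The two uncertainty prescriptions named above can be sketched with made-up numbers: the symmetric Hessian master formula applied to paired eigenvector predictions, and the standard deviation over equiprobable Monte Carlo replicas.

```python
import math
import random
import statistics

# --- Hessian (symmetric) uncertainty from paired eigenvector sets ---
# X0 is the central prediction; X_plus[k] and X_minus[k] are predictions
# from the +/- members along eigenvector direction k (values invented).
X0 = 100.0
X_plus  = [100.8, 99.5, 100.3]
X_minus = [ 99.1, 100.6, 99.8]

# Symmetric master formula: dX = (1/2) * sqrt( sum_k (X_k^+ - X_k^-)^2 )
dX_hessian = 0.5 * math.sqrt(sum((p - m) ** 2 for p, m in zip(X_plus, X_minus)))

# --- Monte Carlo replica uncertainty (NNPDF-style) ---
# Predictions from N equiprobable replica fits; the central value is the
# replica mean and the uncertainty is the standard deviation (toy data).
random.seed(0)
replicas = [100.0 + random.gauss(0.0, 0.9) for _ in range(100)]
central = statistics.mean(replicas)
dX_mc = statistics.stdev(replicas)

print(dX_hessian)       # Hessian uncertainty on the observable
print(central, dX_mc)   # replica mean and spread
```

In practice the "predictions" are full cross-section computations repeated once per error member or replica, typically via the LHAPDF error-set interface, but the combination formulas are exactly these.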
Future developments include improved flavor separation from the Jefferson Lab 12 GeV program and from the Electron‑Ion Collider planned at Brookhaven National Laboratory, enhanced small‑x constraints from the proposed LHeC and from forward physics at LHCb, and lattice QCD inputs from collaborations such as MILC and RBC/UKQCD to constrain nonperturbative moments. Challenges remain in reconciling tensions among data sets from HERA, ATLAS, and CMS, in quantifying the nuclear modifications relevant to experiments at JLab and RHIC, and in reaching the precision required by proposed facilities such as the FCC and CEPC; collaborative efforts involving CERN, DESY, Fermilab, and national laboratories will continue to drive progress.