| NNPDF | |
|---|---|
| Name | NNPDF |
| Formation | 2001 |
| Type | Research collaboration |
| Headquarters | Geneva |
| Fields | Particle physics, Quantum chromodynamics |
The NNPDF collaboration produces sets of parton distribution functions used in high-energy physics phenomenology. The project develops statistically rigorous determinations of the quark and gluon momentum distributions inside hadrons, combining theoretical calculations from quantum chromodynamics with experimental measurements from facilities such as the Large Hadron Collider and the Tevatron. NNPDF outputs are widely used by collaborations including ATLAS, CMS, LHCb, ALICE, and the experiments at the HERA accelerator for precision tests of the Standard Model and for searches for physics beyond the Standard Model, such as supersymmetry, extra dimensions, and dark matter signatures.
NNPDF was founded by researchers associated with institutions such as the University of Oxford, Universidad de Granada, CERN, and Vrije Universiteit Brussel to address systematic uncertainties in extracting parton distributions. The collaboration emphasizes neural-network parameterizations trained on Monte Carlo replicas of datasets drawn from experiments including H1, ZEUS, CDF, D0, ATLAS, CMS, and LHCb. The resulting ensembles of fits provide uncertainty estimates used by theory groups such as the PDF4LHC working group, the CTEQ collaboration, and the MMHT group in combined studies and global analyses, which draw on inputs from the Particle Data Group and on perturbative calculations at next-to-leading order and next-to-next-to-leading order based on the DGLAP (Dokshitzer–Gribov–Lipatov–Altarelli–Parisi) evolution formalism.
NNPDF employs feed-forward neural networks as flexible interpolants to represent parton distributions at an input scale, optimized using stochastic minimization techniques inspired by work from machine learning groups at Google DeepMind and OpenAI while retaining physics constraints from sum rules and perturbative evolution. The fitting procedure uses Monte Carlo replica methods to propagate statistical and systematic uncertainties from experiments like ZEUS and H1 and to avoid the bias associated with the fixed functional forms used historically by groups such as CTEQ and MSTW. Evolution of the distributions is performed using the DGLAP equations, with factorization and renormalization scale choices benchmarked against calculations by teams at SLAC National Accelerator Laboratory, Fermilab, and DESY. Regularization strategies borrow from cross-validation techniques developed in the context of pattern recognition at the Massachusetts Institute of Technology and Stanford University to prevent overfitting to fluctuations in deep-inelastic scattering, Drell–Yan, and jet production data from experiments such as UA1 and UA2.
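As a minimal, self-contained sketch of the Monte Carlo replica idea described above (not the collaboration's actual fitting code), the following Python fragment draws pseudodata replicas from a Gaussian defined by toy central values and a toy covariance matrix; in a real fit each replica would then be fitted by its own neural network with a cross-validation stopping criterion. All numbers are invented for illustration.

```python
import numpy as np

# Hypothetical illustration of the Monte Carlo replica idea: pseudodata
# replicas are drawn from a multivariate Gaussian defined by the measured
# central values and their experimental covariance matrix; fitting each
# replica independently maps the data uncertainty onto the fitted PDFs.

rng = np.random.default_rng(seed=42)

# Toy "measurements": central values and a covariance matrix (invented numbers).
data = np.array([1.02, 0.87, 1.15, 0.93])
cov = np.array([
    [0.010, 0.002, 0.001, 0.000],
    [0.002, 0.012, 0.003, 0.001],
    [0.001, 0.003, 0.009, 0.002],
    [0.000, 0.001, 0.002, 0.011],
])

n_replicas = 100
# Each row is one pseudodata replica D^(k) ~ N(data, cov).
replicas = rng.multivariate_normal(mean=data, cov=cov, size=n_replicas)

# In a real fit, each replica would train its own network, with a
# training/validation split deciding when to stop. Here we only check
# that the replica ensemble reproduces the input statistics.
print("replica mean      :", replicas.mean(axis=0))
print("replica covariance:\n", np.cov(replicas, rowvar=False))
```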
The input dataset for NNPDF fits spans measurements from lepton–proton scattering at the HERA experiments (H1 and ZEUS), fixed-target deep-inelastic scattering from collaborations at SLAC, NMC, BCDMS, and EMC, Drell–Yan and vector boson production from NA51, E605, CDF, and D0, and LHC measurements from ATLAS, CMS, LHCb, and ALICE. Jet production inputs include data analyzed by groups at the Tevatron and LHC experiments, with cross sections computed using codes developed at institutions like NIKHEF and IN2P3. Heavy-flavor treatments reference pole-mass and MS-bar schemes used in calculations by researchers at Brookhaven National Laboratory and the Institute for Nuclear Research (INR). Electroweak inputs and radiative corrections follow prescriptions used by authors connected to the LEP and SLC programs.
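Combining such heterogeneous datasets in a global fit typically relies on an experimental covariance matrix assembled from uncorrelated statistical errors and correlated systematic shifts. The sketch below uses the standard construction cov_ij = delta_ij * sigma_stat,i^2 + sum_k beta_i^(k) beta_j^(k) with invented toy numbers; it is illustrative only and not NNPDF's implementation.

```python
import numpy as np

# Standard construction of an experimental covariance matrix from
# uncorrelated statistical errors and fully correlated systematic shifts:
#   cov[i, j] = delta_ij * stat[i]**2 + sum_k beta[i, k] * beta[j, k]
# All numbers here are hypothetical toy inputs, not real measurements.

stat = np.array([0.03, 0.05, 0.04])   # statistical error per data point
beta = np.array([                     # correlated systematics, one column per source
    [0.02, 0.01],
    [0.02, 0.03],
    [0.01, 0.02],
])

cov = np.diag(stat**2) + beta @ beta.T
print(cov)

# A global fit's chi-squared then uses the inverse covariance:
theory = np.array([1.00, 0.98, 1.03])
data = np.array([1.02, 0.95, 1.05])
resid = data - theory
chi2 = resid @ np.linalg.solve(cov, resid)
print("chi2 =", chi2)
```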
NNPDF sets provide predictions for cross sections relevant to precision tests and new-physics searches, including Higgs boson production via gluon fusion, top-quark pair production, and high-mass dilepton spectra. These predictions are compared with state-of-the-art perturbative results from groups involved with NNLOJET, MCFM, FEWZ, and resummation frameworks by researchers affiliated with Padova and Bologna. Uncertainty bands from NNPDF are propagated into phenomenological studies of processes studied by ATLAS and CMS to quantify theoretical errors in measurements of the Higgs boson couplings, determinations of the W boson mass, and constraints on parton luminosities relevant for searches for Z' bosons and Contact interactions. NNPDF results have influenced global combinations coordinated by the PDF4LHC community and have informed proposals for future facilities such as the Future Circular Collider and the Electron-Ion Collider.
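A hedged illustration of how replica-based uncertainty bands are obtained in practice: with the LHAPDF Python bindings and a locally installed replica set (the set name below is only an example), the central value and uncertainty of a quantity can be estimated as the mean and standard deviation over the replica members.

```python
import numpy as np
import lhapdf  # requires the LHAPDF Python bindings and a locally installed set

# Evaluate the gluon PDF for every replica of an NNPDF set and take the mean
# and standard deviation across replicas as central value and uncertainty.
# The set name is only an example; any installed replica set would do.
set_name = "NNPDF31_nnlo_as_0118"
members = lhapdf.mkPDFs(set_name)

x, Q = 1e-3, 100.0  # momentum fraction and scale in GeV
# Member 0 is conventionally the central fit; members 1..N are the replicas.
gluon = np.array([pdf.xfxQ(21, x, Q) for pdf in members[1:]])

print(f"x*g(x={x}, Q={Q} GeV) = {gluon.mean():.4f} +- {gluon.std():.4f}")
```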
NNPDF releases are distributed as sets compatible with the LHAPDF interface, enabling use by Monte Carlo event generators including PYTHIA, HERWIG, and SHERPA, and by parton-level programs such as MadGraph and POWHEG. The collaboration maintains codes for neural-network training, replica generation, and DGLAP evolution, developed with contributions from groups at CERN, Universidad de Zaragoza, the University of Milan, and the University of Edinburgh. Releases are versioned, and specific sets include variations in the treatment of heavy quarks, the strong coupling alpha_s, and QED corrections, following conventions established by international working groups such as the Les Houches workshops.
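For a downstream user, access through the LHAPDF interface might look like the sketch below: loading a single member by set name and member index and querying xf(x,Q) together with the alpha_s(Q) value bundled with the set. The set name is illustrative and must be installed locally for the snippet to run.

```python
import lhapdf  # LHAPDF Python bindings; the set name below is only an example

# Load the central member (member 0) of an NNPDF set and query it the way an
# event generator or parton-level code would through the LHAPDF interface.
pdf = lhapdf.mkPDF("NNPDF31_nnlo_as_0118", 0)

x, Q = 0.01, 91.1876  # momentum fraction and scale in GeV
for pid, label in [(21, "gluon"), (2, "up"), (1, "down")]:
    print(f"x*f_{label}(x={x}, Q={Q}) = {pdf.xfxQ(pid, x, Q):.4f}")

# The strong coupling consistent with the fit is bundled with the set,
# so downstream codes can read alpha_s(Q) from the same interface.
print("alpha_s(MZ) =", pdf.alphasQ(91.1876))
```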
NNPDF validations compare fits to alternative determinations by groups including CTEQ, MMHT, and the ABMP collaboration, using benchmark processes such as deep-inelastic structure functions, vector-boson rapidity distributions, and inclusive jet spectra measured by ATLAS and CMS. Closure tests, reweighting procedures, and statistical diagnostics draw on techniques from collaborations at LHCb and the Particle Data Group to ensure robustness against dataset shifts and methodological choices. These comparisons have been discussed at venues such as the Rencontres de Moriond, the International Conference on High Energy Physics, and PDF4LHC workshops, guiding improvements in the treatment of correlated systematics and theoretical uncertainties.
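As a sketch of the Bayesian reweighting weights discussed in the NNPDF reweighting literature (using invented chi-squared values rather than real replica predictions), each replica receives a weight w_k proportional to (chi2_k)^((n-1)/2) exp(-chi2_k/2) for n new data points, and the effective number of replicas follows from the Shannon entropy of the weights.

```python
import numpy as np

# Sketch of Bayesian reweighting in the form reported in the NNPDF
# reweighting studies: given the chi-squared of each replica against n new
# data points, the weights are  w_k ~ (chi2_k)^((n-1)/2) * exp(-chi2_k / 2),
# normalised so the average weight is one. The chi2 values are toy inputs.

rng = np.random.default_rng(0)
n_data = 10                                # number of new data points
chi2 = rng.chisquare(df=n_data, size=100)  # hypothetical chi2 for 100 replicas

log_w = 0.5 * (n_data - 1) * np.log(chi2) - 0.5 * chi2
log_w -= log_w.max()          # guard against overflow before exponentiating
w = np.exp(log_w)
w *= len(w) / w.sum()         # normalise so that the mean weight is 1

# Effective number of replicas from the Shannon entropy of the weights:
#   N_eff = exp[ (1/N) * sum_k w_k * ln(N / w_k) ]
n_eff = np.exp(np.mean(w * np.log(len(w) / np.maximum(w, 1e-300))))
print("effective number of replicas:", n_eff)

# Reweighted predictions are weighted replica averages, e.g.
#   <O> = (1/N) * sum_k w_k * O_k  for an observable O computed per replica.
```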