| HiggsTools | |
|---|---|
| Name | HiggsTools |
| Title | HiggsTools |
| Developer | CERN, DESY, Fermilab |
| Released | 2012 |
| Programming language | C++, Python, Fortran |
| Operating system | Linux, macOS |
| License | GPL, BSD |
HiggsTools is a software and methodological framework developed for precision studies of the Higgs sector in high-energy physics. It provides analysis modules, statistical tools, and theoretical calculators that interface with experimental datasets from major collaborations and with computational tools from lattice, perturbative, and effective-field-theory approaches. The project brings together people and institutions involved in collider physics, computational phenomenology, and data preservation.
HiggsTools arose from collaborative efforts following the Large Hadron Collider programme and the 2012 Higgs boson discovery announced by the ATLAS and CMS collaborations, responding to needs articulated at workshops such as those at CERN and DESY. Early development drew on experience from projects including Rivet, ROOT, and MadGraph and was influenced by discussions at the Les Houches workshops and in the ECFA study groups. Funding and institutional support came from agencies such as the European Research Council and the U.S. Department of Energy, and from national laboratories including Fermilab and SLAC National Accelerator Laboratory. Over successive releases the codebase integrated modules inspired by tools such as HDECAY, Prophecy4f, and FeynHiggs, and the governance model adopted validation and release-management practices from collaborations like ATLAS and CMS.
The primary purpose is to enable precision interpretation of Higgs-related measurements from facilities like the Large Hadron Collider, future proposals such as the International Linear Collider, and conceptual studies for the Future Circular Collider. The scope covers parameter extraction, signal-strength fits, effective-field-theory analyses tied to frameworks like SMEFT and HEFT, and theoretical predictions incorporating inputs from perturbative QCD, electroweak calculations, and lattice results from groups such as those at CERN Theory and RIKEN. HiggsTools also targets interfaces to experimental likelihoods released by collaborations including ATLAS, CMS, and LHCb and to global-fit initiatives like those led by Gfitter and the Higgs Cross Section Working Group.
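As a concrete illustration of the EFT-style analyses described above, a signal strength can be expanded to quadratic order in a single Wilson coefficient and fitted to a measurement. The sketch below is hypothetical: the expansion coefficients, the measured value, and its uncertainty are invented for illustration and are not taken from HiggsTools or from any published fit.

```python
from scipy.optimize import minimize_scalar

# Hypothetical SMEFT-style parametrization: a signal strength mu is expanded
# in one Wilson coefficient c as mu(c) = 1 + A*c + B*c^2, where the linear
# term models SM-BSM interference and the quadratic term the pure-BSM piece.
A, B = 0.8, 0.1             # assumed expansion coefficients (illustrative)
MU_OBS, SIGMA = 1.05, 0.10  # assumed measured signal strength and uncertainty

def mu_pred(c):
    """Quadratic EFT parametrization of the signal strength."""
    return 1.0 + A * c + B * c**2

def chi2(c):
    """One-bin chi-square comparing prediction to the toy measurement."""
    return ((mu_pred(c) - MU_OBS) / SIGMA) ** 2

res = minimize_scalar(chi2, bounds=(-2.0, 2.0), method="bounded")
print(f"best-fit c = {res.x:.3f}, chi2_min = {res.fun:.4f}")
```

Real fits scan several operators simultaneously and include correlations between channels; the single-coefficient scan above only shows the shape of the parametrization.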
HiggsTools implements likelihood construction, nuisance-parameter profiling, and Bayesian posterior sampling using algorithms comparable to those in BAT, HEPFit, and HistFactory. It supports matrix-element reweighting akin to the Powheg and aMC@NLO approaches and incorporates resummation and matching techniques referenced in studies from NNPDF and in perturbative calculations associated with CTEQ and MSTW. Key features include modular parsers for results published by ATLAS and CMS, effective-operator bases motivated by the literature of Weinberg, Buchmuller, and Grzadkowski, and tools for combining inputs from precision observables such as those measured at LEP and the Tevatron. The framework provides uncertainty propagation compatible with procedures used by the Particle Data Group and accommodates PDF sets distributed through LHAPDF.
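The likelihood construction and nuisance-parameter profiling mentioned above can be sketched with a toy counting experiment: a Poisson term for the observed yield plus a Gaussian constraint on a background nuisance parameter, minimized jointly. All numbers are illustrative assumptions, and the code does not use the actual HiggsTools API.

```python
import numpy as np
from scipy.optimize import minimize

# Toy counting experiment: n observed events with expectation mu*s + b + theta,
# where theta is a Gaussian-constrained background nuisance parameter.
# All inputs below are invented for illustration.
N_OBS = 25
S_EXP = 10.0    # nominal signal expectation
B_NOM = 12.0    # nominal background expectation
SIGMA_B = 2.0   # background uncertainty constraining theta

def nll(params):
    """Negative log-likelihood: Poisson term (up to a constant) + constraint."""
    mu, theta = params
    lam = mu * S_EXP + B_NOM + theta
    if lam <= 0:
        return 1e9  # penalize unphysical expectations
    return lam - N_OBS * np.log(lam) + 0.5 * (theta / SIGMA_B) ** 2

# Joint minimization over the signal strength and the nuisance parameter.
res = minimize(nll, x0=[1.0, 0.0], method="Nelder-Mead")
mu_hat, theta_hat = res.x
print(f"mu_hat = {mu_hat:.2f}, theta_hat = {theta_hat:.2f}")
```

Profiling in a production framework scans mu on a grid and re-minimizes the nuisances at each point; the joint minimum above is the starting point of that procedure.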
Practically, HiggsTools is used to extract coupling modifiers and compare them against Standard Model predictions tested by experiments including ATLAS, CMS, CDF, and D0. It supports reinterpretation of searches for extended sectors, postulated in models by authors such as Glashow, Weinberg, and Higgs, by enabling fits to two-Higgs-doublet models, singlet extensions, and simplified models popular in the dark-matter community. The software underpins phenomenological studies presented at conferences such as ICHEP and Moriond and informs roadmap documents prepared by panels like the European Strategy for Particle Physics and working groups of the Snowmass process.
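A minimal sketch of a coupling-modifier (kappa-style) extraction, assuming invented signal-strength measurements and a toy scaling of each channel with fermionic and bosonic modifiers; this is not HiggsTools' actual interface, and the channel definitions are purely illustrative.

```python
from scipy.optimize import minimize

# Toy kappa framework: each channel's signal strength scales as a product of
# coupling modifiers raised to channel-dependent powers, mu_i = kF^pF * kV^pV.
# Measurements and exponents below are invented for the example.
channels = [
    # (mu_measured, sigma, exponent of kF, exponent of kV)
    (1.10, 0.15, 2, 0),  # fermion-coupling-dominated toy channel
    (0.95, 0.20, 0, 2),  # vector-boson-coupling-dominated toy channel
]

def chi2(kappas):
    """Chi-square over all channels for a given (kF, kV) point."""
    kF, kV = kappas
    total = 0.0
    for mu_meas, sigma, pF, pV in channels:
        mu_pred = (kF ** pF) * (kV ** pV)
        total += ((mu_pred - mu_meas) / sigma) ** 2
    return total

res = minimize(chi2, x0=[1.0, 1.0], method="Nelder-Mead")
kF_hat, kV_hat = res.x
print(f"kF = {kF_hat:.3f}, kV = {kV_hat:.3f}")
```

With two independent channels the fit factorizes and each modifier lands at the square root of its channel's measured signal strength; realistic fits mix production and decay modifiers across many correlated channels.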
The codebase is implemented in a hybrid C++/Python architecture with numerical backends in Fortran for legacy routines, following practices from projects such as Geant4 and ROOT. Interfaces exist to external libraries including LHAPDF and HepMC and to statistical packages such as RooFit and Minuit2. Distribution follows packaging conventions of the CMake and Conda ecosystems, and continuous integration mirrors practices of repositories hosted on services such as GitHub and GitLab. Licensing mixes GPL and BSD-compatible components, reflecting contributions from academia and from national laboratories including CERN and Fermilab.
Validation strategies mirror those applied in collaboration analyses by ATLAS and CMS, with blinded comparisons, round-robin benchmarks, and regression tests using datasets derived from Rivet routines and public likelihoods from the HEPData repository. Performance profiling adopts tools and metrics from Valgrind and gprof, and emphasizes reproducibility as championed by initiatives like REANA and RO-Crate. Timings and scalability have been evaluated on computing resources such as CERN OpenStack and tiered infrastructures like the Worldwide LHC Computing Grid.
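A regression test of the kind described above can be sketched as a tolerance-based comparison of freshly computed likelihood values against stored references; the file layout and function names here are hypothetical and not part of HiggsTools.

```python
import json
import math
import os
import tempfile

# Hypothetical regression-test sketch: results from a validated release are
# stored on disk, and each new build's values are compared point by point
# within a relative tolerance, so any numerical drift fails the check.

def store_reference(path, values):
    """Write reference values (e.g. from a validated release) as JSON."""
    with open(path, "w") as f:
        json.dump(values, f)

def regression_check(path, new_values, rtol=1e-6):
    """Return True only if every key matches its stored reference value."""
    with open(path) as f:
        ref = json.load(f)
    if set(ref) != set(new_values):
        return False  # a missing or extra observable is also a failure
    return all(math.isclose(ref[k], new_values[k], rel_tol=rtol) for k in ref)

# Toy usage: store a reference, then compare two candidate computations.
path = os.path.join(tempfile.mkdtemp(), "ref.json")
store_reference(path, {"nll_sm": 12.345678, "nll_bsm": 13.111111})
ok = regression_check(path, {"nll_sm": 12.345678, "nll_bsm": 13.111111})
drift = regression_check(path, {"nll_sm": 12.35, "nll_bsm": 13.111111})
print(ok, drift)  # True False
```

In practice such checks run in continuous integration against curated reference datasets, with tolerances tuned per observable rather than a single global `rtol`.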
The project is stewarded by an international consortium of institutions including CERN, DESY, and Fermilab, together with universities such as the University of Oxford, MIT, and Université Paris-Saclay, and it draws contributions from working groups convened at ICHEP, Les Houches, and EPS-HEP. Outreach and training occur through tutorials at summer schools modelled on events such as SUSY Days and through workshops hosted by laboratories like SLAC and TRIUMF. Governance uses a collaboration-board model similar to those of ATLAS and CMS and emphasizes open-science practices promoted by funders including the European Commission.
Category:Particle physics software