| Epsilon expansion | |
|---|---|
| Name | Epsilon expansion |
| Field | Theoretical physics, Statistical mechanics, Mathematical physics |
| Introduced | 1970s |
| Introduced by | Kenneth G. Wilson, Michael E. Fisher |
| Key people | Kenneth G. Wilson, Michael E. Fisher, Gerard 't Hooft, David J. Gross, Alexander Zamolodchikov |
| Related concepts | Renormalization group, Critical phenomena, Perturbation theory, Dimensional regularization |
epsilon expansion
The epsilon expansion is a perturbative technique that analyzes critical behavior near an upper or lower critical dimension by expanding physical quantities in a small parameter, denoted epsilon, that measures the deviation of the spatial dimension from that critical dimension. Developed within the Renormalization group approach to Critical phenomena, it connects Mean field theory behavior, which becomes exact above the upper critical dimension, to real-world dimensions via controlled series in epsilon. The method has been applied across Quantum field theory, Statistical mechanics, and related areas, shaped by researchers associated with Cornell University, Princeton University, and the University of Chicago.
The method emerged in the early 1970s through the work of Kenneth G. Wilson and Michael E. Fisher, who combined Leo Kadanoff's block-spin picture with earlier perturbative approaches associated with Julian Schwinger and Richard Feynman. Further influences include Dimensional regularization, developed by Gerard 't Hooft and Martinus Veltman, together with the conceptual framing of Kenneth G. Wilson's Renormalization group program and the analytic contributions of Leo Kadanoff. Later elaborations connected the expansion to exact two-dimensional results by Alexander Zamolodchikov and to the analysis of asymptotic series in the tradition of G. H. Hardy and Freeman Dyson.
In its canonical formulation one studies a Landau–Ginzburg–Wilson-type action near a critical dimension d_c (d_c = 4 for the standard phi^4 theory) and sets d = d_c - epsilon or d = d_c + epsilon depending on context, importing Dimensional regularization and the Minimal subtraction scheme pioneered in perturbative Quantum field theory by Gerard 't Hooft. The expansion treats coupling constants as functions of epsilon and employs the beta function, computed order by order in perturbation theory using diagrams introduced by Richard Feynman and organized via counterterms in the spirit of Kenneth G. Wilson's renormalization approach. Fixed points of the resulting renormalization-group flow, computed as series in epsilon, yield critical exponents that can be compared with Monte Carlo method studies at institutions like Argonne National Laboratory and with analytical constraints from Conformal field theory.
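For the O(N) model in d = 4 - epsilon the leading-order structure can be written compactly. The normalization of the coupling u below is chosen to absorb geometric loop factors, and conventions differ between references, but the fixed-point location and exponents to the orders shown are standard:

```latex
% One-loop RG flow for the rescaled O(N) quartic coupling u in d = 4 - \epsilon
\beta(u) = -\epsilon\, u + (N+8)\, u^2 + O(u^3)
% Wilson--Fisher fixed point
u^{*} = \frac{\epsilon}{N+8} + O(\epsilon^2)
% Leading critical exponents at the fixed point
\nu = \frac{1}{2} + \frac{N+2}{4(N+8)}\,\epsilon + O(\epsilon^2), \qquad
\eta = \frac{N+2}{2(N+8)^2}\,\epsilon^2 + O(\epsilon^3)
```

Note that the anomalous dimension eta first appears at second order in epsilon, which is why two-loop calculations were already needed for the earliest nontrivial exponent estimates.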
Practitioners have applied the expansion to the O(N) model to compute anomalous dimensions and crossover phenomena, relevant to experiments at CERN and to condensed-matter measurements at laboratories such as Bell Labs. It underpins predictions for universality classes observed in systems ranging from binary liquid mixtures studied at Brookhaven National Laboratory to magnetism experiments linked to work at Oak Ridge National Laboratory. In Quantum chromodynamics contexts, epsilon-like expansions inform perturbative analyses related to running couplings first systematized by David J. Gross and Frank Wilczek, while in two-dimensional systems comparisons are made with exact results from Alexander Zamolodchikov and lattice studies influenced by Kenneth G. Wilson's lattice gauge theory program.
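As a concrete, if crude, illustration of such O(N) applications, the standard leading-order result nu = 1/2 + (N+2)/(4(N+8)) * epsilon can be evaluated directly at epsilon = 1 to estimate three-dimensional exponents. This is a minimal sketch using only the first-order coefficient, not a substitute for resummed high-order results:

```python
# Leading-order epsilon-expansion estimate of the correlation-length
# exponent nu for the O(N) universality class in d = 4 - epsilon.
# Standard result: nu = 1/2 + (N + 2) / (4 * (N + 8)) * epsilon + O(epsilon^2).

def nu_first_order(N, epsilon=1.0):
    """First-order epsilon-expansion estimate of nu (crude at epsilon = 1)."""
    return 0.5 + (N + 2) / (4 * (N + 8)) * epsilon

for N, name in [(1, "Ising"), (2, "XY"), (3, "Heisenberg")]:
    print(f"O({N}) ({name}): nu ~= {nu_first_order(N):.4f}")
```

Even at this low order the estimates (about 0.583, 0.600, 0.614 for N = 1, 2, 3) land within roughly ten percent of accepted three-dimensional values such as nu ~ 0.630 for the 3D Ising class, which is what motivates pushing to higher orders and resumming.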
Computational techniques include diagrammatic perturbation theory via Feynman diagram enumeration, algebraic renormalization methods in the tradition of Gerard 't Hooft and Wolfgang Pauli, and high-order series generation with symbolic manipulation software inspired by projects at Princeton University and the Massachusetts Institute of Technology. Variants include expansions about lower critical dimensions, multi-parameter expansions for anisotropic systems explored by researchers at Stanford University, and functional approaches such as the nonperturbative Functional renormalization group advocated by groups at the École normale supérieure and Max Planck Society institutes. Resurgence-based techniques influenced by work at the IHÉS and the University of Cambridge have also been applied to extract physical information from divergent series.
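To make the fixed-point structure concrete numerically, here is a minimal sketch (not any group's production code) that integrates the schematic one-loop flow du/dl = epsilon*u - (N+8)*u^2 for a rescaled O(N) coupling and shows the flow settling onto the Wilson-Fisher value u* = epsilon/(N+8); the normalization of u is an assumption chosen to absorb loop factors:

```python
# Euler integration of a schematic one-loop RG flow for the rescaled
# O(N) quartic coupling u in d = 4 - epsilon:
#     du/dl = epsilon * u - (N + 8) * u**2
# For epsilon > 0 the Gaussian fixed point u = 0 is unstable and the flow
# is attracted to the Wilson-Fisher fixed point u* = epsilon / (N + 8).

def flow_to_fixed_point(N=1, epsilon=1.0, u0=1e-3, dl=1e-2, steps=20_000):
    """Follow the flow from a small initial coupling u0 toward the IR."""
    u = u0
    for _ in range(steps):
        u += dl * (epsilon * u - (N + 8) * u * u)
    return u

u_star = flow_to_fixed_point()
print(f"flowed coupling: {u_star:.6f}, expected epsilon/(N+8) = {1/9:.6f}")
```

The flow is logistic, so any small positive starting coupling ends up at the same infrared fixed point; this insensitivity to microscopic initial conditions is the mechanism behind universality.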
The series obtained are typically asymptotic rather than convergent, a feature analyzed classically by G. H. Hardy and exemplified in perturbation theory by Freeman Dyson's argument for the divergence of QED perturbation series. To extract accurate numerical predictions, practitioners use resummation methods such as Borel resummation, conformal mapping, and Padé approximants, developed in numerical-analysis communities at Argonne National Laboratory and Los Alamos National Laboratory. Limitations arise when epsilon is not small (for example, epsilon = 1 when extrapolating from four to three dimensions), prompting cross-checks against nonperturbative lattice computations by groups at CERN, numerical-bootstrap constraints from teams at Princeton University, and exact two-dimensional results from Conformal field theory experts.
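The mechanics of Padé resummation can be sketched on a toy series. The example below (an illustration, not any group's production tooling) builds a [2/2] Padé approximant from the Taylor coefficients of ln(1 + x) and evaluates it at x = 2, outside the series' radius of convergence, where the truncated series itself is useless:

```python
from fractions import Fraction
from math import log

def pade(coeffs, L, M):
    """[L/M] Pade approximant from Taylor coefficients c0..c_{L+M}.
    Returns (p, q): numerator and denominator coefficients, with q[0] = 1."""
    c = [Fraction(x) for x in coeffs]
    # Denominator: solve sum_j q[j] * c[L+k-j] = 0 for k = 1..M, q[0] = 1,
    # via exact Gaussian elimination on an M x M system.
    A = [[c[L + k - j] if 0 <= L + k - j < len(c) else Fraction(0)
          for j in range(1, M + 1)] for k in range(1, M + 1)]
    b = [-c[L + k] for k in range(1, M + 1)]
    for i in range(M):                       # forward elimination with pivoting
        piv = next(r for r in range(i, M) if A[r][i] != 0)
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, M):
            f = A[r][i] / A[i][i]
            b[r] -= f * b[i]
            for col in range(i, M):
                A[r][col] -= f * A[i][col]
    q = [Fraction(1)] + [Fraction(0)] * M
    for i in reversed(range(M)):             # back substitution
        s = b[i] - sum(A[i][j] * q[j + 1] for j in range(i + 1, M))
        q[i + 1] = s / A[i][i]
    # Numerator follows from matching the low-order Taylor coefficients.
    p = [sum(q[j] * c[i - j] for j in range(min(i, M) + 1)) for i in range(L + 1)]
    return p, q

def horner(cs, x):
    v = Fraction(0)
    for a in reversed(cs):
        v = v * x + a
    return v

# Taylor coefficients of ln(1 + x): 0, 1, -1/2, 1/3, -1/4
c = [Fraction(0)] + [Fraction((-1) ** (n + 1), n) for n in range(1, 5)]
p, q = pade(c, 2, 2)
x = Fraction(2)
approx = horner(p, x) / horner(q, x)
print(f"[2/2] Pade at x=2: {float(approx):.6f}  (ln 3 = {log(3):.6f})")
```

The [2/2] approximant gives 12/11 ~ 1.0909 against ln 3 ~ 1.0986, a sub-percent error from just five Taylor coefficients, whereas the degree-5 partial sum at x = 2 is already oscillating wildly. Resummation of epsilon-series proceeds in the same spirit, often combined with Borel transforms to tame factorial growth of the coefficients.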
Notable successes include the computation of critical exponents for the three-dimensional Ising model via expansion about four dimensions, O(N) model results matching superfluid-helium experiments at NIST, and high-order loop calculations extending to five and six loops performed by collaborations connected with the École normale supérieure and the University of Paris. Landmark papers by Kenneth G. Wilson and Michael E. Fisher, along with later high-order work influenced by Gerard 't Hooft and David J. Gross, remain central to the literature, while comparisons with lattice Monte Carlo results from groups at Brookhaven National Laboratory and conformal bootstrap bounds from Rutgers University continue to refine confidence in specific extrapolations.