| Press–Schechter formalism | |
|---|---|
| Name | Press–Schechter formalism |
| Author | William H. Press; Paul Schechter |
| Year | 1974 |
| Field | Cosmology |
Press–Schechter formalism is a theoretical model developed in 1974 to estimate the mass distribution of collapsed structures in the Universe. It provides a statistical prescription connecting the initial density fluctuations of the early Universe with the abundance of bound objects such as galaxy clusters, galaxies, and dark matter halos. The framework influenced subsequent work on the large-scale structure of the cosmos, structure formation, and numerical N-body simulation methods.
The Press–Schechter approach emerged amid efforts in the early 1970s to explain the abundance of galaxy clusters from early-universe initial conditions, building on antecedent theoretical studies of gravitational clustering by P. J. E. Peebles and collaborators at Princeton University. It addressed discrepancies between analytic models and results from early N-body simulations of cluster formation. Press and Schechter, then at the California Institute of Technology, sought a simple analytic expression linking the statistics of the primordial density field in an expanding Friedmann–Lemaître–Robertson–Walker background to the abundance of nonlinear, virialized structures observed in systems such as the Virgo Cluster.
The formalism begins with a Gaussian random field characterized by a linear power spectrum, typically computed from a transfer function for the chosen cosmology. It applies the spherical collapse model, related to the Einstein–de Sitter solution and the analytic treatments of J. E. Gunn and J. R. Gott, to set a critical linear overdensity threshold, δ_c ≈ 1.686. The method smooths the density field with a filter, commonly a top-hat in real space, and computes the fraction of mass residing in regions above the threshold using the statistics of Gaussian random fields, later treated systematically by J. M. Bardeen, J. R. Bond, and collaborators. The resulting mass function, n(M), is expressed through a multiplicity function proportional to the derivative of the variance of the smoothed field; the ad hoc factor of two introduced by Press and Schechter connects to the cloud-in-cloud problem discussed in later work by J. A. Peacock, G. Efstathiou, and others.
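The recipe above can be sketched numerically as dn/dlnM = (ρ̄/M) νf(ν) |dlnσ/dlnM| with ν = δ_c/σ(M). In the minimal sketch below, the power-law σ(M) and the numerical value chosen for the mean matter density are illustrative assumptions, not part of the original formalism; a real calculation would integrate the linear power spectrum against a top-hat window.

```python
import math

DELTA_C = 1.686     # linear spherical-collapse threshold (Einstein-de Sitter value)
RHO_BAR = 8.5e10    # mean comoving matter density in Msun/Mpc^3 (assumed, illustrative)

def sigma_toy(mass, sigma8=0.8, m_pivot=6.0e14, alpha=0.25):
    """Toy power-law rms fluctuation sigma(M); a real calculation would
    integrate the linear power spectrum with a top-hat window function."""
    return sigma8 * (mass / m_pivot) ** (-alpha)

def ps_multiplicity(nu):
    """Press-Schechter multiplicity nu*f(nu) for nu = delta_c / sigma(M).
    The sqrt(2/pi) prefactor already contains the ad hoc factor of two."""
    return math.sqrt(2.0 / math.pi) * nu * math.exp(-0.5 * nu * nu)

def dn_dlnM(mass):
    """Comoving number density per unit ln M:
    dn/dlnM = (rho_bar / M) * nu*f(nu) * |d ln sigma / d ln M|."""
    nu = DELTA_C / sigma_toy(mass)
    dln_sigma_dln_m = 0.25   # constant slope for the power-law toy sigma(M)
    return (RHO_BAR / mass) * ps_multiplicity(nu) * dln_sigma_dln_m
```

The steep falloff of `dn_dlnM` at high mass reflects the Gaussian tail: massive halos require rare, high-ν peaks.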
Subsequent enhancements replaced the assumption of spherical collapse with ellipsoidal dynamics, building on earlier studies of anisotropic collapse by A. G. Doroshkevich and on the moving-barrier modeling of R. K. Sheth, H. J. Mo, and G. Tormen. The excursion set formalism of J. R. Bond and collaborators recast Press–Schechter as a stochastic barrier-crossing problem, resolving the factor-of-two normalization and enabling systematic comparison with N-body results such as those from the Millennium Simulation run by the Max Planck Society. Calibrations against large simulation suites produced empirical fitting functions such as the Sheth–Tormen mass function and later parameterizations. Halo occupation distribution models leveraged these improvements to predict galaxy bias and clustering.
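The difference between the original multiplicity function and the Sheth–Tormen fit can be shown in a few lines. The sketch below uses the standard published fit parameters (A ≈ 0.3222, a = 0.707, p = 0.3); setting A = 0.5, a = 1, p = 0 recovers the Press–Schechter form exactly.

```python
import math

def nu_f_ps(nu):
    """Press-Schechter multiplicity nu*f(nu), nu = delta_c / sigma(M)."""
    return math.sqrt(2.0 / math.pi) * nu * math.exp(-0.5 * nu * nu)

def nu_f_st(nu, big_a=0.3222, a=0.707, p=0.3):
    """Sheth-Tormen multiplicity with the standard fit parameters.
    Reduces to the Press-Schechter form for big_a=0.5, a=1, p=0."""
    return (big_a * math.sqrt(2.0 * a / math.pi)
            * (1.0 + (a * nu * nu) ** (-p))
            * nu * math.exp(-0.5 * a * nu * nu))
```

Because a < 1 flattens the Gaussian cutoff, the Sheth–Tormen form predicts more halos in the rare high-ν (high-mass) tail than Press–Schechter, in better agreement with simulations.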
The Press–Schechter formalism underpins predictions for the evolution of galaxy cluster number counts exploited by observational programs such as the Chandra X-ray Observatory, Planck, and the Atacama Cosmology Telescope. It informs the halo mass function models used in gravitational lensing analyses and in large spectroscopic surveys. Cosmological parameter inference pipelines incorporate mass function models to constrain dark matter and dark energy properties, linking theory to data from campaigns such as the Two Micron All Sky Survey and the Dark Energy Survey. The framework also guides semi-analytic models of galaxy formation, including those developed at the Max Planck Institute for Astrophysics.
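Cluster number-count predictions reduce to integrating the mass function over mass and survey volume. The sketch below reuses a toy power-law σ(M) and an assumed mean matter density as illustrative stand-ins; a real survey pipeline would also integrate over redshift with the survey selection function.

```python
import math

DELTA_C = 1.686     # linear spherical-collapse threshold
RHO_BAR = 8.5e10    # mean comoving matter density in Msun/Mpc^3 (assumed, illustrative)

def dn_dlnM(mass):
    """Toy Press-Schechter mass function with a power-law sigma(M)."""
    sigma = 0.8 * (mass / 6.0e14) ** (-0.25)   # illustrative sigma(M)
    nu = DELTA_C / sigma
    nu_f = math.sqrt(2.0 / math.pi) * nu * math.exp(-0.5 * nu * nu)
    return (RHO_BAR / mass) * nu_f * 0.25       # 0.25 = |d ln sigma / d ln M|

def counts_above(m_min, volume_mpc3, m_max=1e16, steps=2000):
    """Expected halo count above m_min in a comoving volume:
    N = V * integral of dn/dlnM over ln M (trapezoid rule)."""
    lo, hi = math.log(m_min), math.log(m_max)
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        weight = 0.5 if i in (0, steps) else 1.0
        total += weight * dn_dlnM(math.exp(lo + i * h))
    return volume_mpc3 * total * h
```

The strong sensitivity of `counts_above` to the amplitude of σ(M) is what makes cluster counts a useful cosmological probe.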
Critiques emphasize its approximations: the assumption of Gaussian initial conditions contrasts with the non-Gaussian scenarios explored by E. Komatsu and others, while the spherical collapse threshold ignores the tidal shear effects examined in analyses by S. D. M. White, C. Lacey, and collaborators. The ad hoc normalization factor and the cloud-in-cloud ambiguity motivated alternative treatments such as the excursion set formalism and the peak-background split. Comparisons with high-resolution N-body simulations reveal systematic deviations at the high- and low-mass ends, prompting empirically calibrated corrections. Despite these issues, the formalism remains a foundational analytic tool for linking primordial conditions to observed structure, used alongside more sophisticated numerical and observational efforts.