| Random graphs | |
|---|---|
| Name | Random graphs |
| Field | Mathematics |
Random graphs are graphs generated by a probabilistic rule: instead of specifying edges explicitly, a model assigns a probability distribution over graphs, and the theory studies the properties a sample exhibits with high probability. The subject originated in the mid-twentieth century with foundational papers of Paul Erdős and Alfréd Rényi (and, independently, Edgar Gilbert), and it now sits at the crossroads of combinatorics, probability theory, and theoretical computer science, supplying both counting arguments for existence proofs and null models for real-world networks.
The two classical models are the binomial (Erdős–Rényi–Gilbert) model G(n, p), in which each of the n(n−1)/2 possible edges on n labelled vertices appears independently with probability p, and the uniform model G(n, m), a graph drawn uniformly at random from all graphs with n vertices and m edges; when m is close to p·n(n−1)/2 the two behave almost identically. Extensions include the configuration model, which samples a graph with a prescribed degree sequence; stochastic block models, which plant community structure by letting edge probabilities depend on group membership; preferential attachment models, which grow a graph by attaching new vertices to existing ones with probability proportional to degree, producing the heavy-tailed degree sequences observed in web and social-network data; random geometric graphs, which place vertices in a metric space and join pairs within a fixed distance; and random intersection graphs, in which vertices receive random attribute sets and are joined when their sets overlap, a construction that has found use in key-predistribution schemes for sensor networks. Each choice of parameters defines an ensemble whose typical behaviour is analysed asymptotically as n grows.
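The binomial model is simple enough to sample directly. The following minimal sketch (function name `gnp` is our own; parameters are illustrative) draws each possible edge independently with probability p:

```python
import random

def gnp(n, p, seed=None):
    """Sample an Erdős–Rényi graph G(n, p): each of the n*(n-1)/2
    possible edges is included independently with probability p."""
    rng = random.Random(seed)
    return [(u, v)
            for u in range(n)
            for v in range(u + 1, n)
            if rng.random() < p]

edges = gnp(100, 0.05, seed=1)
# Expected edge count is p * n*(n-1)/2 = 0.05 * 4950 = 247.5
print(len(edges))
```

The edge count fluctuates around its mean with standard deviation on the order of its square root, a first instance of the concentration phenomena discussed below.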
A central theme, going back to the seminal Erdős–Rényi papers, is the emergence of sharp thresholds for monotone properties: as the edge probability p crosses a critical scale, the probability that G(n, p) has the property jumps from near 0 to near 1. Classic examples are connectivity (threshold p = ln n / n), the appearance of a giant component (p = 1/n), and Hamiltonicity (p = (ln n + ln ln n)/n). Friedgut and Kalai later showed that every monotone graph property has a sharp threshold. The proofs rest on the probabilistic method pioneered by Erdős, first- and second-moment calculations, and concentration inequalities such as the Chernoff, Azuma–Hoeffding, and Talagrand bounds, with isoperimetric and Fourier-analytic techniques underlying the general threshold theorems.
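The connectivity threshold is easy to observe numerically. This sketch (our own code; n and the trial count are illustrative) samples G(n, p) at half and at twice the scale ln n / n and checks connectivity with a union-find structure:

```python
import math
import random

def gnp_connected(n, p, rng):
    """Sample G(n, p) and report whether it is connected,
    using union-find with path halving."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                parent[find(u)] = find(v)
    return len({find(v) for v in range(n)}) == 1

rng = random.Random(0)
n, trials = 200, 30
t = math.log(n) / n                     # connectivity threshold scale
below = sum(gnp_connected(n, 0.5 * t, rng) for _ in range(trials)) / trials
above = sum(gnp_connected(n, 2.0 * t, rng) for _ in range(trials)) / trials
print(below, above)
```

Below the threshold, isolated vertices almost always remain (their expected number is about n·e^{-np}), so `below` is near 0, while `above` is near 1.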
Typical features of a random graph include its degree distribution, component sizes, cores, and local clustering. In G(n, p) the degree of a fixed vertex is Binomial(n − 1, p), approximately Poisson(np) in the sparse regime. The giant component phenomenon is analysed by comparing local exploration of the graph with a branching process: when np → c < 1 every component has O(log n) vertices, while for c > 1 a unique giant component emerges containing roughly a β-fraction of the vertices, where β is the positive solution of β = 1 − e^{−cβ}. Sparse random graphs converge locally, in the Benjamini–Schramm sense, to Galton–Watson trees; typical distances grow like log n / log(np); and random regular graphs are expanders with high probability, a property exploited in error-correcting codes. Related threshold phenomena govern the emergence of k-cores and the spread of bootstrap percolation.
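The sub/supercritical dichotomy can be seen by measuring the largest component of G(n, c/n) on either side of c = 1. A minimal sketch (our own helper; n and the values of c are illustrative):

```python
import random

def largest_component_fraction(n, c, rng):
    """Sample G(n, c/n) and return the size of its largest
    connected component as a fraction of n."""
    p = c / n
    adj = [[] for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].append(v)
                adj[v].append(u)
    seen, best = [False] * n, 0
    for s in range(n):                  # DFS over all components
        if seen[s]:
            continue
        seen[s] = True
        stack, size = [s], 0
        while stack:
            u = stack.pop()
            size += 1
            for w in adj[u]:
                if not seen[w]:
                    seen[w] = True
                    stack.append(w)
        best = max(best, size)
    return best / n

rng = random.Random(42)
sub = largest_component_fraction(2000, 0.5, rng)  # c < 1: O(log n) components
sup = largest_component_fraction(2000, 2.0, rng)  # c > 1: giant component
print(sub, sup)
```

For c = 2 the fixed-point equation β = 1 − e^{−2β} gives β ≈ 0.797, which the supercritical sample should approximate, while the subcritical fraction vanishes as n grows.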
Graph invariants studied on random graphs include the chromatic number, independence number, matching number, and the adjacency and Laplacian spectra. For the dense model G(n, 1/2) the independence number concentrates near 2 log₂ n and the chromatic number near n / (2 log₂ n). Spectrally, the bulk eigenvalues of G(n, p) follow a semicircle law as in random matrix theory, while the largest eigenvalue concentrates near the average degree np. Limits of dense graph sequences are captured by graphons, introduced by Lovász and Szegedy, whereas sparse sequences are handled through local convergence frameworks. A further line of research, building on classical extremal work in the tradition of Erdős, studies resilience: how many edges an adversary must delete before a typical property fails.
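The spectral picture is directly visible in a moderate-sized sample. This sketch (assuming NumPy is available; n and p are illustrative) compares the top two adjacency eigenvalues of a G(n, p) sample:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 0.1
# Build a symmetric 0/1 adjacency matrix of a G(n, p) sample
upper = np.triu(rng.random((n, n)) < p, k=1)
A = (upper | upper.T).astype(float)

eig = np.linalg.eigvalsh(A)   # ascending order
# Largest eigenvalue ~ np = 50; the rest of the spectrum lies in a
# bulk of width ~ 2*sqrt(np(1-p)) ≈ 13.4 (semicircle law)
print(eig[-1], eig[-2])
```

The large gap between the Perron eigenvalue and the semicircle bulk is the same spectral-gap phenomenon that makes random graphs good expanders.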
Random graph models inform the empirical study of large networks, where they serve as null models against which structure in web, social, and biological data is measured. Epidemiological modelling uses percolation on random graphs: a disease transmitted along edges with some fixed probability ultimately reaches exactly the percolated component of its starting vertex, giving an epidemic threshold when the mean infectious degree crosses 1. Average-case analysis of algorithms on random instances, including the sharp satisfiability transition in random k-SAT, is a staple of theoretical computer science, and the same phase-transition phenomena connect random graphs to spin-glass models in statistical physics. Cryptography and distributed systems draw on random intersection graphs for key predistribution and on expander graphs for robust, rapidly mixing communication networks.
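The percolation view of epidemics can be sketched directly: thinning the contact edges of G(n, c/n) by a transmission probability T yields a G(n, cT/n) graph, so the outbreak from one seed is that seed's component there. A toy illustration (our own code; n, c, and T are illustrative, and the final sample takes the largest of five independent runs to avoid a seed that happens to miss the giant component):

```python
import random

def outbreak_fraction(n, c, T, rng):
    """Toy epidemic on G(n, c/n): each contact edge transmits
    independently with probability T, so the set ever infected from
    vertex 0 is its component in a G(n, c*T/n) sample."""
    p = c * T / n
    adj = [[] for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].append(v)
                adj[v].append(u)
    seen = [False] * n          # DFS from the seed vertex 0
    seen[0] = True
    stack, size = [0], 0
    while stack:
        u = stack.pop()
        size += 1
        for w in adj[u]:
            if not seen[w]:
                seen[w] = True
                stack.append(w)
    return size / n

rng = random.Random(7)
# Mean infectious degree c*T: 0.4 (below threshold 1) vs 2.4 (above)
small = outbreak_fraction(1500, 4.0, 0.1, rng)
large = max(outbreak_fraction(1500, 4.0, 0.6, rng) for _ in range(5))
print(small, large)
```

Below the threshold the outbreak stays microscopic; above it, a constant fraction of the population is reached, mirroring the giant-component transition.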