LLMpedia: The first transparent, open encyclopedia generated by LLMs

Stable distribution

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Parent: Paul Lévy (hop 3)
Expansion funnel: Raw 63 → Dedup 13 → NER 10 → Enqueued 8
1. Extracted: 63
2. After dedup: 13
3. After NER: 10
   Rejected: 3 (not NE: 3)
4. Enqueued: 8
   Similarity rejected: 1
Name: Stable distribution
Type: Probability distribution
Parameters: Characteristic exponent α, skewness β, scale γ, location δ
Support: Real line (or positive reals for one-sided variants)
Properties: Heavy tails, stability under convolution, infinite variance for α < 2

Stable distribution

Stable distributions, also called α-stable laws, are a class of probability distributions characterized by a stability property under convolution and by heavy tails; they generalize the Gaussian law of Carl Friedrich Gauss and the central-limit settings studied by Paul Lévy and Andrey Nikolaevich Kolmogorov. They were formalized in seminal work by Lévy in the 1920s and later developed within the Lévy–Khintchine framework by Aleksandr Khintchine and collaborators. Stable laws appear in models connected to the central limit theorem and the Poisson process, and in limit results used in finance descending from Louis Bachelier and in physics descending from Albert Einstein's theory of Brownian motion.

Definition and basic properties

A distribution is stable if a linear combination of two independent, identically distributed copies of a random variable has the same distribution up to location and scale, a concept introduced by Paul Lévy and developed further by Aleksandr Khintchine, William Feller, and Andrey Kolmogorov. Stable laws are parameterized by an index of stability α ∈ (0, 2], a skewness parameter β ∈ [−1, 1], a scale γ > 0, and a location δ ∈ ℝ; the case α = 2 reduces to the normal distribution associated with Carl Friedrich Gauss and the method of least squares. For α < 2 the variance is infinite, an observation linked to the heavy-tailed behavior examined empirically by Benoit Mandelbrot. Stability implies that normalized sums of i.i.d. variables in the domain of attraction of a stable law converge to a member of the same family, a property pivotal in the limit theory developed by Kolmogorov and others.
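In symbols, the stability property described above reads:

```latex
% X is stable iff for independent copies X_1, X_2 of X and all a, b > 0
% there exist c > 0 and d \in \mathbb{R} with
a X_1 + b X_2 \;\overset{d}{=}\; c X + d ,
% and X is \alpha-stable precisely when the scale constants obey
c^{\alpha} = a^{\alpha} + b^{\alpha}, \qquad \alpha \in (0, 2].
```

For α = 2 the relation c² = a² + b² is exactly how variances of independent Gaussians add, recovering the normal case.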

Characteristic function and Lévy–Khintchine representation

Stable distributions are most conveniently described via their characteristic functions, an approach rooted in the Fourier-analytic techniques of Lévy and Khintchine. The characteristic function φ(t) has canonical forms involving α and β and admits a Lévy–Khintchine representation linking the law to an associated Lévy measure, an analytic framework used in the stochastic-process theory of Kiyosi Itô and Shizuo Kakutani and in work at the Institut Henri Poincaré. The representation makes the infinite divisibility of stable laws explicit and connects them to Lévy processes. Parameterization choices (Zolotarev, Nolan, Chambers–Mallows–Stuck) alter the form of φ(t), a point treated in applied research at IBM Research and in theoretical expositions by Gennady Samorodnitsky.
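One widely used canonical form (the "1-parameterization" appearing, for example, in Samorodnitsky's expositions; other parameterizations mentioned above shift the location term) is:

```latex
\varphi(t) = \exp\!\Big( i\delta t \;-\; \gamma^{\alpha}|t|^{\alpha}
  \big[\, 1 - i\beta\,\mathrm{sign}(t)\tan\tfrac{\pi\alpha}{2} \,\big] \Big),
  \qquad \alpha \neq 1,
```

```latex
\varphi(t) = \exp\!\Big( i\delta t \;-\; \gamma|t|
  \big[\, 1 + i\beta\,\tfrac{2}{\pi}\,\mathrm{sign}(t)\ln|t| \,\big] \Big),
  \qquad \alpha = 1.
```

Setting α = 2 kills the tangent term and yields the Gaussian characteristic function exp(iδt − γ²t²); setting α = 1, β = 0 yields the Cauchy form exp(iδt − γ|t|).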

Special cases and parameterizations

Special cases include α = 2 (the normal distribution), α = 1 with β = 0 (the Cauchy distribution, named after Augustin-Louis Cauchy), and α = 1/2 with β = 1 (the Lévy distribution, named after Paul Lévy); these are the only cases with elementary closed-form densities. Multiple parameterizations, such as those of Zolotarev, Nolan, and Chambers–Mallows–Stuck, are used in statistical software, and they differ in centering and scale conventions, a recurring source of confusion addressed in monographs and in lecture notes at institutions including the Massachusetts Institute of Technology and the University of Oxford.
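Two of the closed-form special cases can be checked by simulation. The sketch below (an illustration, not from the article) uses two standard facts: if Z ~ N(0, 1) then 1/Z² follows the Lévy distribution with location 0 and scale 1, and the tangent of a uniform angle on (−π/2, π/2) is standard Cauchy.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400_000

# alpha = 1/2, beta = 1: if Z ~ N(0, 1), then 1/Z**2 is Levy(0, 1).
# Its median is 1 / Phi^{-1}(3/4)^2 ≈ 2.198, since
# P(1/Z^2 <= m) = P(|Z| >= 1/sqrt(m)).
z = rng.standard_normal(n)
levy = 1.0 / z**2
print(np.median(levy))          # ≈ 2.20

# alpha = 1, beta = 0: the standard Cauchy law, sampled as tan(U);
# its quartiles are -1 and +1, so the interquartile range is exactly 2.
u = rng.uniform(-np.pi / 2, np.pi / 2, n)
cauchy = np.tan(u)
q1, q3 = np.quantile(cauchy, [0.25, 0.75])
print(q3 - q1)                  # ≈ 2.0
```

Both laws have infinite variance, so medians and quantiles, not moments, are the right summaries to check.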

Domains of attraction and limit theorems

Stable laws arise as attractors for normalized sums of i.i.d. random variables; classical results by Paul Lévy, with extensions by Boris Gnedenko, Andrey Kolmogorov, and William Feller, characterize domains of attraction in terms of tail behavior and regular variation, a notion developed by Jovan Karamata. The generalized central limit theorem identifies the conditions under which normalized sums converge to an α-stable law, linking the theory to extreme-value and point-process limits. Domain-of-attraction theory also appears in statistical-mechanics contexts tracing back to Ludwig Boltzmann and in the heavy-tailed modeling advocated by Benoit Mandelbrot.
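A signature of the stable (rather than Gaussian) limit regime is that the single largest summand carries a non-vanishing share of the sum. The sketch below (a hypothetical illustration, not from the article) compares Pareto tails with exponent α inside and outside the stable domain of attraction:

```python
import numpy as np

rng = np.random.default_rng(42)

def max_to_sum(alpha: float, n: int = 10_000, trials: int = 200) -> float:
    """Mean of max(X_i)/sum(X_i) over trials, for Pareto tails P(X > x) = x**-alpha."""
    u = rng.uniform(size=(trials, n))
    x = u ** (-1.0 / alpha)        # inverse-CDF sampling of a Pareto variable
    return float(np.mean(x.max(axis=1) / x.sum(axis=1)))

# alpha < 2: infinite variance; sums are attracted to an alpha-stable law
# and the largest term stays macroscopically visible.
heavy = max_to_sum(alpha=0.8)

# alpha > 2: finite variance; the classical CLT applies and the largest
# term becomes negligible as n grows.
light = max_to_sum(alpha=3.0)
print(heavy, light)
```

The contrast in the two ratios is the max-to-sum diagnostic often used to screen data for infinite-variance behavior.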

Simulation and statistical inference

Simulation methods for stable distributions include the Chambers–Mallows–Stuck algorithm of John Chambers, C. L. Mallows, and B. W. Stuck, originally developed at Bell Labs. For inference, numerical inversion of the characteristic function, maximum-likelihood estimation, and fractional lower-order moment methods are the standard tools. Estimation is challenging because the variance is infinite for α < 2 and closed-form densities exist only in special cases; practical strategies for heavy-tailed time series were advanced by Peter J. Brockwell and Richard A. Davis and are incorporated into software used in quantitative finance and academic statistics.
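The symmetric (β = 0) case of the Chambers–Mallows–Stuck transformation is short enough to sketch directly; it maps a uniform angle and an independent exponential variable to an α-stable sample:

```python
import numpy as np

rng = np.random.default_rng(7)

def sym_stable(alpha: float, size: int) -> np.ndarray:
    """Symmetric (beta = 0) alpha-stable samples via the
    Chambers-Mallows-Stuck transformation of U ~ Uniform(-pi/2, pi/2)
    and an independent W ~ Exp(1)."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)
    w = rng.exponential(1.0, size)
    if alpha == 1.0:
        return np.tan(u)            # the Cauchy special case
    return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
            * (np.cos((1.0 - alpha) * u) / w) ** ((1.0 - alpha) / alpha))

# alpha = 1 reproduces the standard Cauchy law: quartiles at -1 and +1.
x = sym_stable(1.0, 200_000)
q1, q3 = np.quantile(x, [0.25, 0.75])
print(q3 - q1)                      # ≈ 2.0
```

At α = 2 the formula collapses to 2·sin(U)·√W, a Gaussian with variance 2, which is the usual normalization quirk of the 1-parameterization. The skewed (β ≠ 0) case uses the same idea with a shifted angle.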

Applications in physics, finance, and signal processing

Stable distributions model phenomena with large deviations and anomalous diffusion, extending the classical Brownian-motion picture of Albert Einstein; they underlie models of Lévy flights and fractional kinetics studied in statistical physics, including work connected to turbulence research in the tradition of Andrey Kolmogorov. In finance, heavy-tailed returns and price innovations were modeled with stable laws by Benoit Mandelbrot, and the approach informs risk-management research and regulatory discussions referencing Basel Committee on Banking Supervision concerns about tail risk. In signal processing, impulsive noise modeled by stable laws motivates robust filtering techniques, with applications in radar and telecommunications.
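The signal-processing point, that linear (mean-based) estimators break down under impulsive stable noise while order statistics remain usable, can be seen in a toy experiment (a hypothetical set-up, not a method from the article):

```python
import numpy as np

rng = np.random.default_rng(1)

# A constant signal corrupted by Cauchy (alpha = 1) impulsive noise.
# The sample mean of Cauchy noise is itself Cauchy at every n, so
# averaging never concentrates; the sample median is consistent.
signal = 5.0
trials, n = 300, 1_000
noise = np.tan(rng.uniform(-np.pi / 2, np.pi / 2, (trials, n)))
obs = signal + noise

means = obs.mean(axis=1)
medians = np.median(obs, axis=1)

def iqr(a: np.ndarray) -> float:
    """Interquartile range, a dispersion measure that exists even when the variance does not."""
    lo, hi = np.quantile(a, [0.25, 0.75])
    return float(hi - lo)

print(iqr(means), iqr(medians))    # means stay spread out, medians concentrate
```

This is the rationale behind median-type and myriad-type filters in impulsive-noise environments: they trade the optimality of the mean under Gaussian noise for robustness under heavy tails.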

Category:Probability distributions