| Poisson process | |
|---|---|
| Name | Poisson process |
| Type | Stochastic process |
| Field | Probability theory |
| Named after | Siméon Denis Poisson |
| Related | Markov process, Lévy process, Renewal process |
The Poisson process is a fundamental stochastic model for random events occurring independently in time or space. It serves as a canonical example in probability theory and underpins methods in stochastic calculus, limit theorems, and the theory of point processes. Named after Siméon Denis Poisson, its mathematical development involved contributions from figures such as Agner Krarup Erlang, William Feller, Andrey Kolmogorov, Aleksandr Khinchin, and Paul Lévy.
A Poisson process starts at zero and is defined by stationary, independent increments; its interarrival times are exponentially distributed, and the memoryless property of the exponential law underlies the process's central role in queueing theory. The process has simple point realizations and right-continuous step paths, and its lack of aftereffect makes it a Markov process in the sense studied by Andrey Markov and Kolmogorov. Basic properties include closure under superposition and independent thinning, together with scaling limits connected to classical invariance principles.
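A minimal sketch of the interarrival construction and of independent thinning, assuming the NumPy random API; the rate `lam`, horizon `T`, and retention probability `p` are illustrative parameters, not notation from any standard reference.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_process(lam, T, rng=rng):
    """Homogeneous Poisson process on [0, T]: cumulative sums of
    Exponential(lam) interarrival times, truncated at T."""
    times = []
    t = rng.exponential(1.0 / lam)
    while t <= T:
        times.append(t)
        t += rng.exponential(1.0 / lam)
    return np.array(times)

def thin(times, p, rng=rng):
    """Independent thinning: keep each point with probability p.
    The result is again a Poisson process, with rate p * lam."""
    keep = rng.random(len(times)) < p
    return times[keep]

arrivals = poisson_process(lam=5.0, T=100.0)
print(len(arrivals))             # roughly lam * T = 500 points
print(len(thin(arrivals, 0.3)))  # roughly 0.3 * 500 = 150 points
```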
Variants include the homogeneous Poisson process with constant rate and the nonhomogeneous (inhomogeneous) Poisson process with a time-varying intensity function. Spatial Poisson processes extend the model to Euclidean and more general spaces and are central to stochastic geometry, while Cox processes (doubly stochastic Poisson processes), in which the intensity is itself random, were introduced by David Cox. Marked Poisson processes attach additional random information to each point, and compound Poisson processes, in which i.i.d. jumps arrive at Poisson epochs, are central to the actuarial risk models of Filip Lundberg and Harald Cramér.
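A minimal sketch of a spatial Poisson process on a rectangle, following the standard two-step recipe (a Poisson-distributed total count, then a uniform scatter); the intensity `lam` and the window dimensions are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def spatial_poisson(lam, width, height, rng=rng):
    """Spatial Poisson process on [0, width] x [0, height]:
    the number of points is Poisson(lam * area), and given the
    count, points are scattered independently and uniformly."""
    n = rng.poisson(lam * width * height)
    xs = rng.uniform(0.0, width, size=n)
    ys = rng.uniform(0.0, height, size=n)
    return np.column_stack([xs, ys])

points = spatial_poisson(lam=2.0, width=10.0, height=5.0)
print(points.shape)  # about (100, 2): lam * area = 100 expected points
```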
Constructive routes include the interarrival (renewal) construction, in which arrival epochs are partial sums of i.i.d. exponential variables, tied to the renewal theory of William Feller; the limit of binomial (Bernoulli) processes, via the Poisson limit theorem credited to Siméon Denis Poisson and popularized as the "law of small numbers" by Ladislaus Bortkiewicz; and characterization via the Lévy–Khintchine representation associated with Paul Lévy and Aleksandr Khinchin. Characterizations also use memoryless exponential interarrivals, Palm distributions (after Conny Palm) and Campbell measures, and the equivalence with Poisson random measure formulations treated in texts such as J. F. C. Kingman's *Poisson Processes*.
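A small numerical check of the binomial-to-Poisson limit, using nothing beyond the Python standard library; the rate `lam` and the values of `n` are illustrative.

```python
import math

lam = 3.0  # target Poisson rate

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

# As n grows with p = lam / n, Binomial(n, p) approaches Poisson(lam).
for n in (10, 100, 1000):
    max_gap = max(abs(binom_pmf(k, n, lam / n) - poisson_pmf(k, lam))
                  for k in range(11))
    print(n, round(max_gap, 6))  # the gap shrinks as n increases
```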
The counting distribution at a fixed time is Poisson with mean equal to the integrated intensity, a result formalized in classical treatments such as Feller's. Moments and cumulants grow linearly in time, and factorial moments take a simple closed form. Interarrival times are exponentially distributed, reflecting the memoryless property, while joint distributional facts follow from the order-statistics property: conditional on n arrivals in [0, t], the arrival epochs are distributed as the order statistics of n i.i.d. uniform variables on [0, t].
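Concretely, for a homogeneous process with rate λ, the fixed-time count, its mean and variance, and its factorial moments are:

```latex
P\{N(t)=k\} = e^{-\lambda t}\,\frac{(\lambda t)^k}{k!}, \quad k = 0, 1, 2, \dots,
\qquad
\mathbb{E}[N(t)] = \operatorname{Var} N(t) = \lambda t,
\qquad
\mathbb{E}\!\left[\prod_{j=0}^{r-1} \bigl(N(t)-j\bigr)\right] = (\lambda t)^r.
```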
Applications span physics, engineering, and the social sciences: radioactive decay counts studied by Ernest Rutherford and Hans Geiger; teletraffic and queueing networks, beginning with Agner Krarup Erlang's pioneering studies and developed further through results such as Little's law; spatial point patterns in ecology and forestry; and reliability engineering and actuarial risk processes. Examples include photon arrivals in quantum optics, customer arrivals in service systems, and occurrences of rare events in epidemiology.
Simulation methods include the thinning algorithm of Lewis and Shedler for nonhomogeneous processes, inverse transform sampling of exponential interarrivals, and acceptance–rejection schemes in the Monte Carlo tradition of John von Neumann and Stanislaw Ulam. Estimation of intensity parameters uses maximum likelihood and Bayesian techniques, with model selection via criteria such as Akaike's information criterion; nonparametric intensity estimation uses kernel methods in the tradition of Murray Rosenblatt and Emanuel Parzen.
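A minimal sketch of Lewis–Shedler thinning for a nonhomogeneous process, together with the maximum likelihood estimate of a constant rate (N(T)/T); the intensity function `intensity` and its upper bound `lam_max` are illustrative assumptions, not part of any fixed API.

```python
import math
import numpy as np

rng = np.random.default_rng(2)

def intensity(t):
    """Illustrative time-varying intensity, bounded above by lam_max."""
    return 3.0 + 2.0 * math.sin(t)

lam_max = 5.0  # upper bound on the intensity over the horizon

def thinning(intensity, lam_max, T, rng=rng):
    """Lewis-Shedler thinning: simulate a homogeneous process at rate
    lam_max, then accept each candidate t with prob intensity(t)/lam_max."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > T:
            break
        if rng.random() < intensity(t) / lam_max:
            times.append(t)
    return np.array(times)

arrivals = thinning(intensity, lam_max, T=200.0)
lam_hat = len(arrivals) / 200.0  # MLE of a constant rate: N(T) / T
print(len(arrivals), round(lam_hat, 3))  # average intensity here is about 3
```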