| Geometric distribution | |
|---|---|
| Name | Geometric distribution |
| Type | Discrete probability distribution |
| Support | {1,2,3,...} or {0,1,2,...} |
| Parameters | Probability of success p ∈ (0,1) |
| Pmf | p(1−p)^{k−1} or (1−p)^k p |
| Mean | 1/p or (1−p)/p |
| Variance | (1−p)/p^2 (same for both supports) |
The geometric distribution is a discrete probability distribution describing the number of Bernoulli trials needed to obtain the first success; it appears in reliability analysis, queueing theory, and information theory. Its origins trace to early work on repeated trials by Jacob Bernoulli and to the development of probability theory by Thomas Bayes and Pierre-Simon Laplace, and it appears alongside classical results of Andrey Kolmogorov, Émile Borel, and Jerzy Neyman in statistical inference and stochastic processes.
A geometric law models the number of independent, identically distributed Bernoulli trials with success probability p performed until the first success, a waiting-time formulation going back to early probabilists such as Christiaan Huygens. The probability mass function (pmf) is p(1−p)^{k−1} on the support {1, 2, 3, ...} (trials until the first success) or (1−p)^k p on the support {0, 1, 2, ...} (failures before the first success); both conventions are standard in textbooks. The memoryless property of the geometric law is the discrete analog of that of the exponential law, a parallel emphasized in William Feller's classic treatment.
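The two pmf conventions describe the same law shifted by one; a minimal Python sketch (function names are illustrative, not from any particular library) makes the correspondence explicit:

```python
from math import isclose

def geom_pmf_one_based(k: int, p: float) -> float:
    """P(K = k), k = 1, 2, ...: number of trials until the first success."""
    return p * (1 - p) ** (k - 1)

def geom_pmf_zero_based(k: int, p: float) -> float:
    """P(K = k), k = 0, 1, ...: number of failures before the first success."""
    return p * (1 - p) ** k

p = 0.3
# The two conventions are the same law shifted by one: P1(k) == P0(k - 1).
for k in range(1, 20):
    assert isclose(geom_pmf_one_based(k, p), geom_pmf_zero_based(k - 1, p))

# Each pmf sums to 1 (checked up to a truncation point with negligible tail).
assert isclose(sum(geom_pmf_one_based(k, p) for k in range(1, 400)), 1.0)
```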
The geometric law is the only discrete distribution on {1, 2, 3, ...} with the memoryless property, a characterization formalized within the measure-theoretic probability of Andrey Kolmogorov and Paul Lévy. Its pmf is strictly decreasing for p ∈ (0, 1), so the mode sits at the smallest point of the support, and it is the special case of the negative binomial distribution with a single required success. The hazard function is constant and equal to p, an observation used in discrete-time reliability and survival analysis.
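Both the memoryless property and the constant hazard follow from the survival function P(K > k) = (1−p)^k and can be checked directly; a short sketch with a hypothetical helper name:

```python
from math import isclose

def geom_sf(k: int, p: float) -> float:
    """Survival function P(K > k) for the one-based geometric law."""
    return (1 - p) ** k

p = 0.25
# Memorylessness: P(K > m + n | K > m) == P(K > n) for all m, n.
for m in range(6):
    for n in range(6):
        assert isclose(geom_sf(m + n, p) / geom_sf(m, p), geom_sf(n, p))

# Constant hazard: P(K = k | K >= k) == p for every k in the support.
for k in range(1, 10):
    pmf_k = p * (1 - p) ** (k - 1)
    assert isclose(pmf_k / geom_sf(k - 1, p), p)
```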
Common parameterizations use support starting at 1 (trials until the first success) or at 0 (failures before the first success). Interpretations include waiting times in discrete-time Markov chains, lifetimes in discrete-time reliability engineering, and run lengths in source coding in the information-theoretic tradition of Claude Shannon; Golomb codes, for instance, are optimal prefix codes for geometrically distributed integers.
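The waiting-time interpretation suggests a standard inverse-transform sampler: with U uniform on [0, 1), K = 1 + ⌊ln(1−U)/ln(1−p)⌋ follows the one-based geometric law. A sketch under that construction (the function name is illustrative):

```python
import math
import random

def sample_geometric(p: float, rng: random.Random) -> int:
    """Inverse-transform sampler for the one-based geometric law.

    With U uniform on [0, 1), V = 1 - U lies in (0, 1] and
    K = 1 + floor(ln V / ln(1 - p)) satisfies P(K > k) = (1 - p)^k.
    """
    v = 1.0 - rng.random()
    return 1 + math.floor(math.log(v) / math.log(1.0 - p))

rng = random.Random(42)
p = 0.5
samples = [sample_geometric(p, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
assert min(samples) >= 1          # support starts at 1
assert abs(mean - 1 / p) < 0.05   # E[K] = 1/p = 2, within Monte Carlo error
```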
The mean is 1/p for the one-based support (or (1−p)/p for the zero-based support), and the variance is (1−p)/p^2 in both conventions. The probability-generating function is G(z) = pz/(1−(1−p)z) (or G(z) = p/(1−(1−p)z) for zero-based support), and the moment-generating function is M(t) = pe^{t}/(1−(1−p)e^{t}), valid for t < −ln(1−p). Cumulants and higher moments follow from standard expansions of these transforms.
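These moment formulae can be cross-checked numerically against truncated sums of the pmf and against the probability-generating function; a sketch for the one-based support:

```python
from math import isclose

p = 0.4
ks = range(1, 400)
pmf = [p * (1 - p) ** (k - 1) for k in ks]

# Truncated sums approximate E[K] and E[K^2]; the tail beyond k = 400 is negligible.
mean = sum(k * w for k, w in zip(ks, pmf))
second = sum(k * k * w for k, w in zip(ks, pmf))
var = second - mean ** 2
assert isclose(mean, 1 / p)                # E[K] = 1/p
assert isclose(var, (1 - p) / p ** 2)      # Var(K) = (1-p)/p^2

# The pgf G(z) = pz / (1 - (1-p)z) satisfies G(1) = 1, and a backward
# finite difference at z = 1 recovers the mean G'(1) = 1/p.
G = lambda z: p * z / (1 - (1 - p) * z)
h = 1e-7
assert isclose(G(1.0), 1.0)
assert isclose((G(1.0) - G(1.0 - h)) / h, 1 / p, rel_tol=1e-5)
```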
Maximum likelihood estimation of p is straightforward: for the one-based support, p̂ = 1/k̄ = n/Σk_i, the reciprocal of the sample mean. Bayesian inference is conjugate: a Beta(α, β) prior combined with n one-based observations k_1, ..., k_n yields a Beta(α + n, β + Σk_i − n) posterior (for a single observation k, this is Beta(α + 1, β + k − 1)). Hypothesis tests, confidence intervals, and goodness-of-fit methods for the geometric law use likelihood-ratio tests in the style of Samuel Wilks and chi-squared techniques going back to Karl Pearson.
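A minimal sketch of the MLE and the conjugate Beta update for n one-based observations (the data and function names are hypothetical):

```python
from math import isclose

def geom_mle(samples):
    """MLE of p for one-based counts: p_hat = 1 / sample mean = n / sum(k_i)."""
    return len(samples) / sum(samples)

def beta_posterior(alpha, beta, samples):
    """Conjugate update: a Beta(alpha, beta) prior and n one-based geometric
    observations give a Beta(alpha + n, beta + sum(k_i) - n) posterior."""
    n = len(samples)
    return alpha + n, beta + sum(samples) - n

data = [3, 1, 4, 2, 2]                 # hypothetical observed trial counts
assert isclose(geom_mle(data), 5 / 12)
a, b = beta_posterior(1.0, 1.0, data)  # flat Beta(1, 1) prior
assert (a, b) == (6.0, 8.0)
# With a flat prior the posterior mean a / (a + b) is close to the MLE.
assert isclose(a / (a + b), 6 / 14)
```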
The geometric distribution is the r = 1 special case of the negative binomial (Pascal) distribution, named for Blaise Pascal, whose correspondence with Pierre de Fermat founded classical probability, and it is the discrete analog of the exponential distribution. Compound and mixed geometric forms arise in actuarial risk models and in queueing systems classified in Kendall's notation. Limit relations connect geometric sums to the exponential law and to the stable laws investigated by Paul Lévy and Boris Gnedenko.
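The negative binomial reduction can be verified term by term: with r = 1 required success, the negative binomial pmf collapses to the zero-based geometric pmf. A sketch:

```python
from math import comb, isclose

def nbinom_pmf(k: int, r: int, p: float) -> float:
    """Negative binomial: P(k failures before the r-th success)."""
    return comb(k + r - 1, k) * (1 - p) ** k * p ** r

p = 0.35
# With r = 1 the negative binomial collapses to the zero-based geometric pmf.
for k in range(15):
    assert isclose(nbinom_pmf(k, 1, p), (1 - p) ** k * p)

# A sum of r independent zero-based geometric variables is negative binomial;
# check via the mean r * (1 - p) / p (truncated sum, negligible tail).
r = 4
mean = sum(k * nbinom_pmf(k, r, p) for k in range(400))
assert isclose(mean, r * (1 - p) / p)
```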