| Gamma distribution | |
|---|---|
| *Probability density plot* (image omitted; Dgamma25 · CC BY-SA 4.0) | |
| Name | Gamma distribution |
| Type | continuous |
| Support | (0, \infty) |
| Parameters | shape k, scale θ |
| Mean | kθ |
| Variance | kθ^2 |
The Gamma distribution is a two-parameter family of continuous probability distributions widely used in Bayesian analysis, in models built on the Poisson distribution, and in lifetime modeling for reliability studies. Karl Pearson studied it as the type III distribution of his system, and because it serves as a conjugate prior for several exponential-family likelihoods it occupies a central place in the estimation theory developed by Ronald Fisher and Jerzy Neyman.
The distribution is defined for x > 0 by a probability density function with a positive shape parameter, usually denoted k (or α), and a positive scale parameter θ; many texts instead use the rate parameter β = 1/θ (also written λ). The normalizing constant involves the gamma function Γ(·), introduced by Leonhard Euler and studied further by Adrien-Marie Legendre. Either parameterization appears in the Bayesian literature, with the rate form often preferred for conjugate updates.
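Concretely, the two common parameterizations of the density described above can be written out as:

```latex
f(x; k, \theta) = \frac{x^{k-1} e^{-x/\theta}}{\Gamma(k)\,\theta^{k}},
\qquad
f(x; \alpha, \beta) = \frac{\beta^{\alpha} x^{\alpha-1} e^{-\beta x}}{\Gamma(\alpha)},
\qquad x > 0,
```

with k = α and β = 1/θ, so the two forms describe the same distribution.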
The moment-generating function, characteristic function, and cumulants all follow directly from properties of Γ(·). A key closure property is additivity: the sum of independent Gamma variates sharing a common scale is again Gamma distributed, with the shape parameters adding, which makes the family convenient in stochastic-process models of the kind developed by Andrei Kolmogorov and Norbert Wiener. The maximum-likelihood estimators and Fisher information matrix are standard objects of asymptotic theory in the Neyman–Pearson tradition, and the distribution's tail behavior and hazard rate functions are routinely examined in reliability studies.
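The additivity property can be checked empirically. The following sketch (standard library only, with illustrative parameter values) sums two independent Gamma draws with a common scale and compares the sample moments against those of the predicted Gamma(k1 + k2, θ):

```python
import random

random.seed(42)

k1, k2, theta = 2.0, 3.0, 1.5   # illustrative shapes and common scale
n = 100_000

# X ~ Gamma(k1, theta), Y ~ Gamma(k2, theta)  =>  X + Y ~ Gamma(k1 + k2, theta)
# random.gammavariate(alpha, beta) takes shape alpha and scale beta.
sums = [random.gammavariate(k1, theta) + random.gammavariate(k2, theta)
        for _ in range(n)]

mean = sum(sums) / n
var = sum((s - mean) ** 2 for s in sums) / n

# Theory predicts mean (k1 + k2)*theta and variance (k1 + k2)*theta^2.
print(f"sample mean {mean:.3f} vs (k1 + k2)*theta   = {(k1 + k2) * theta:.3f}")
print(f"sample var  {var:.3f} vs (k1 + k2)*theta^2 = {(k1 + k2) * theta ** 2:.3f}")
```

With this sample size the empirical moments land within a few percent of the theoretical values.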
When the shape parameter equals 1, the distribution reduces to the exponential distribution of Poisson-process interarrival times, a process formalized by Agner Krarup Erlang. Half-integer shapes yield chi-square distributions: the chi-square distribution with ν degrees of freedom is a Gamma with shape ν/2 and scale 2, underpinning inferential procedures originated by Karl Pearson and developed further by R. A. Fisher. Integer shapes give the Erlang distribution of teletraffic engineering, and related families include the Nakagami distribution of communications research, the generalized gamma distribution, and the beta prime and inverse-gamma distributions.
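These special cases are straightforward to verify numerically. A minimal sketch (standard library only, with arbitrary test points chosen for illustration):

```python
import math

def gamma_pdf(x: float, k: float, theta: float) -> float:
    """Density of the Gamma distribution in shape-scale form, for x > 0."""
    return x ** (k - 1) * math.exp(-x / theta) / (math.gamma(k) * theta ** k)

x = 1.7  # arbitrary test point

# Shape k = 1 reduces to the exponential distribution with scale theta.
theta = 2.5
expon_pdf = math.exp(-x / theta) / theta
assert math.isclose(gamma_pdf(x, 1.0, theta), expon_pdf)

# Shape nu/2 with scale 2 gives the chi-square distribution with nu d.o.f.
nu = 5
chi2_pdf = (x ** (nu / 2 - 1) * math.exp(-x / 2)
            / (2 ** (nu / 2) * math.gamma(nu / 2)))
assert math.isclose(gamma_pdf(x, nu / 2, 2.0), chi2_pdf)
```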
Parameter estimation commonly uses maximum likelihood: the likelihood equation for the shape parameter has no closed-form solution, so iterative solvers such as Newton–Raphson are applied. Method-of-moments estimators provide simple closed-form starting values, and confidence intervals derive from chi-square and likelihood-ratio tests within the Neyman–Pearson framework. Bayesian inference employs conjugate priors (for example, a Gamma prior on a Poisson rate) and hierarchical models of the kind developed by Donald Rubin and Andrew Gelman.
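As a concrete illustration of the method-of-moments step: matching the first two moments (mean kθ, variance kθ²) gives the closed-form estimators k̂ = x̄²/s² and θ̂ = s²/x̄. The sketch below (standard library only, illustrative ground-truth parameters) recovers the parameters from simulated data:

```python
import random

random.seed(7)

k_true, theta_true = 4.0, 0.5          # illustrative ground truth
data = [random.gammavariate(k_true, theta_true) for _ in range(50_000)]

n = len(data)
xbar = sum(data) / n
s2 = sum((x - xbar) ** 2 for x in data) / n

# Moment matching: mean = k*theta and variance = k*theta^2, so
k_hat = xbar ** 2 / s2       # shape estimate
theta_hat = s2 / xbar        # scale estimate

print(f"k_hat = {k_hat:.3f}, theta_hat = {theta_hat:.3f}")
```

These moment estimates are the usual starting point for Newton–Raphson refinement toward the maximum-likelihood estimates.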
Applications span survival analysis in medical studies, rainfall modeling in hydrology, claim-severity modeling in actuarial science and insurance, and queueing theory, beginning with A. K. Erlang's analyses of telephone traffic. In engineering, gamma-based lifetime models inform maintenance planning, and environmental statisticians use gamma fits for skewed, positive-valued data such as precipitation totals and flood volumes.
Efficient computation of the gamma function and the incomplete gamma integrals underpins evaluation of cumulative distribution functions and quantiles, with implementations built on IEEE 754 floating-point arithmetic and on special-function algorithms such as those collected by William H. Press and collaborators in Numerical Recipes. Gamma routines are standard in statistical software, notably R from The R Project for Statistical Computing. Monte Carlo and Markov chain Monte Carlo techniques for Gamma-based hierarchical models build on the Metropolis–Hastings framework and its modern development by Radford Neal, Andrew Gelman, and others.
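As a sketch of how the CDF reduces to the regularized lower incomplete gamma function P(k, x/θ), the following standard-library-only implementation uses the classical power series for γ(k, x). This is adequate away from the far upper tail; production code typically switches to a continued-fraction expansion there.

```python
import math

def gamma_cdf(x: float, k: float, theta: float) -> float:
    """CDF of Gamma(k, theta) via the power series for the
    lower incomplete gamma function; requires x > 0."""
    t = x / theta
    # Series: gamma(k, t) = t^k e^{-t} * sum_{n>=0} t^n / (k (k+1) ... (k+n))
    term = 1.0 / k
    total = term
    n = 0
    while term > total * 1e-16:
        n += 1
        term *= t / (k + n)
        total += term
    # Regularize by dividing by Gamma(k), working in log space for stability.
    return total * math.exp(k * math.log(t) - t - math.lgamma(k))

# Sanity check against the shape-1 (exponential) case: CDF(x) = 1 - exp(-x/theta)
print(gamma_cdf(2.0, 1.0, 1.0))   # ~ 1 - e^-2 ≈ 0.8647
```

Quantiles can then be obtained by applying a root finder (e.g. bisection) to this CDF.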