| Log-normal distribution | |
|---|---|
| Name | Log-normal distribution |
| Support | (0, ∞) |
| Parameters | μ (real), σ > 0 |
| Mean | exp(μ + σ^2/2) |
| Variance | [exp(σ^2) − 1] exp(2μ + σ^2) |
| Skewness | [exp(σ^2) + 2] sqrt{exp(σ^2) − 1} |
| Excess kurtosis | exp(4σ^2) + 2 exp(3σ^2) + 3 exp(2σ^2) − 6 |
The log-normal distribution describes a positive-valued random variable whose logarithm is normally distributed. Its study goes back to Francis Galton and Donald McAlister in the late nineteenth century, and Robert Gibrat later applied it to proportionate growth processes. Because it models multiplicative processes, it arises widely across the natural sciences, engineering, economics, and industrial modeling.
A random variable X is log-normally distributed if ln X ~ N(μ, σ^2), where μ is a real-valued location parameter and σ > 0 a scale parameter, both on the log scale. The support is (0, ∞), the distribution is right-skewed for every σ > 0, and the family is closed under multiplication: a product of independent log-normal variates is again log-normal, with the μ parameters adding and the σ^2 parameters adding. The mode exp(μ − σ^2), median exp(μ), and mean exp(μ + σ^2/2) follow simple relations to μ and σ.
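The closure under multiplication can be checked numerically with the standard library's `random.lognormvariate`; the parameter values below are illustrative, and the Monte Carlo estimates should recover μ1 + μ2 and σ1^2 + σ2^2 on the log scale:

```python
import math
import random

random.seed(0)

# Illustrative parameters of two independent log-normal variates.
mu1, s1 = 0.5, 0.4
mu2, s2 = -0.2, 0.3

n = 200_000
# ln(X*Y) = ln X + ln Y, so the product X*Y is log-normal with
# location mu1 + mu2 and log-scale variance s1^2 + s2^2.
log_products = [
    math.log(random.lognormvariate(mu1, s1) * random.lognormvariate(mu2, s2))
    for _ in range(n)
]

mean_lp = sum(log_products) / n                                   # ~ mu1 + mu2 = 0.3
var_lp = sum((v - mean_lp) ** 2 for v in log_products) / (n - 1)  # ~ s1^2 + s2^2 = 0.25
```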
The probability density function (pdf) for x > 0 is f(x) = (1/(x σ sqrt{2π})) exp(−(ln x − μ)^2/(2σ^2)). The cumulative distribution function (cdf) is F(x) = Φ((ln x − μ)/σ), where Φ denotes the standard normal cdf. Both are implemented in standard statistical software, including R, MATLAB, and SciPy.
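A minimal self-contained sketch of these two formulae, using only the standard library (Φ is expressed through the error function, Φ(z) = (1 + erf(z/√2))/2):

```python
import math

def lognormal_pdf(x, mu, sigma):
    """f(x) = (1 / (x sigma sqrt(2 pi))) exp(-(ln x - mu)^2 / (2 sigma^2)) for x > 0."""
    if x <= 0:
        return 0.0
    z = (math.log(x) - mu) / sigma
    return math.exp(-0.5 * z * z) / (x * sigma * math.sqrt(2 * math.pi))

def lognormal_cdf(x, mu, sigma):
    """F(x) = Phi((ln x - mu) / sigma), with Phi written via the error function."""
    if x <= 0:
        return 0.0
    z = (math.log(x) - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

# Sanity check: the median is exp(mu), so F(exp(mu)) = Phi(0) = 0.5.
median_check = lognormal_cdf(math.exp(0.7), 0.7, 1.2)
```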
Moments: E[X^k] = exp(k μ + ½ k^2 σ^2) for any real k, yielding mean exp(μ + σ^2/2) and variance [exp(σ^2) − 1] exp(2μ + σ^2). Although all moments are finite, the moment-generating function does not exist for any t > 0, because the right tail decays more slowly than any exponential; the characteristic function exists but has no simple closed form. Skewness and excess kurtosis are functions of σ alone.
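The moment formula can be verified against the closed-form variance directly, since Var X = E[X^2] − (E[X])^2; the parameter values here are arbitrary:

```python
import math

def lognormal_moment(k, mu, sigma):
    """E[X^k] = exp(k mu + 0.5 k^2 sigma^2), valid for any real k."""
    return math.exp(k * mu + 0.5 * k * k * sigma * sigma)

mu, sigma = 0.3, 0.8

mean = lognormal_moment(1, mu, sigma)
second = lognormal_moment(2, mu, sigma)
variance = second - mean ** 2

# Closed form from the table, for comparison: [exp(sigma^2) - 1] exp(2 mu + sigma^2).
variance_closed = (math.exp(sigma ** 2) - 1) * math.exp(2 * mu + sigma ** 2)
```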
Common estimators transform the data via logarithm and apply normal-theory techniques: μ̂ = mean(ln x_i) and σ̂^2 = (1/n) Σ (ln x_i − μ̂)^2, the maximum-likelihood estimators for uncensored samples. Bias corrections and interval estimation on the original scale require care, since the naive back-transform exp(μ̂) estimates the median rather than the mean. Censored or truncated data and Bayesian approaches require specialized likelihood functions and, typically, Markov chain Monte Carlo methods.
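A small simulation sketch of this log-transform MLE, with made-up true parameters and the standard library's log-normal sampler; with 100,000 draws the estimates should land close to the truth:

```python
import math
import random

random.seed(42)

mu_true, sigma_true = 1.0, 0.5
data = [random.lognormvariate(mu_true, sigma_true) for _ in range(100_000)]

# Transform to the log scale, then apply normal-theory estimators.
logs = [math.log(x) for x in data]
n = len(logs)
mu_hat = sum(logs) / n
sigma2_hat = sum((v - mu_hat) ** 2 for v in logs) / n  # MLE: divides by n, not n - 1
```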
Mechanisms generating log-normality include multiplicative central-limit effects: a product of many independent positive factors is approximately log-normal, by the central limit theorem applied to the sum of their logarithms. Applications span particle-size distributions in materials science, income and wealth studies in economics, volatility modeling in finance, bioassay measurements, environmental concentration data, and signal-amplitude models in telecommunications.
The log-normal connects to the normal distribution via the logarithm, and to the Pareto and Weibull distributions through tail approximations used in actuarial work. Compound distributions and mixtures with log-normal components appear in insurance and reliability modeling. Under reciprocation the family is closed: if X is log-normal with parameters (μ, σ^2), then 1/X is log-normal with parameters (−μ, σ^2), a property used, for example, in hydrology.
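The closure under reciprocation follows from ln(1/X) = −ln X and can be illustrated with a short Monte Carlo check (illustrative parameters; the log of the reciprocal should average to −μ):

```python
import math
import random

random.seed(7)

mu, sigma = 0.4, 0.6
n = 200_000

# ln(1/X) = -ln X ~ N(-mu, sigma^2), so 1/X is log-normal with parameters (-mu, sigma^2).
log_recip = [math.log(1.0 / random.lognormvariate(mu, sigma)) for _ in range(n)]
mean_lr = sum(log_recip) / n  # ~ -mu = -0.4
```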
Multivariate log-normal distributions arise when the componentwise logarithm of a random vector is multivariate normal; estimating the covariance structure on the log scale is standard in portfolio theory and risk modeling. Generalizations include the shifted (three-parameter) log-normal, the truncated log-normal, and log-skew-normal families, as well as nonparametric mixtures used in genomics.