| Wiener process | |
|---|---|
| Name | Wiener process |
| Other names | Brownian motion (mathematical) |
| Type | Stochastic process |
| Parameter | time t ≥ 0 |
| State space | ℝ^n |
| Properties | continuous paths, independent increments, Gaussian increments, martingale |
The Wiener process is the mathematical idealization of continuous random motion. It originated in the physical study of Brownian motion, the irregular movement of pollen grains observed by Robert Brown, was formalized in probability theory by Norbert Wiener, and was developed further with measure-theoretic techniques by Andrey Kolmogorov and Paul Lévy. It serves as a canonical example in the measure-theoretic foundations of probability, in Itô calculus, and in the functional analysis of path spaces, and it underpins Bachelier's models of financial markets and Einstein's theory of diffusion.
A standard Wiener process is a real-valued stochastic process {W(t): t ≥ 0} on a filtered probability space satisfying W(0) = 0, independent increments, Gaussian increments with mean zero and variance equal to the elapsed time, and almost surely continuous sample paths; existence follows from Kolmogorov's extension theorem together with his continuity criterion. The process is a continuous martingale with respect to its natural filtration and has the Markov and strong Markov properties, which connect it to Doob's martingale theory and to probabilistic potential theory. Brownian scaling (for any c > 0, the process W(ct)/√c is again a standard Wiener process) and time-homogeneity express its invariance under dilations and time shifts.
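The defining properties can be stated compactly (here \(\mathcal{N}(0, v)\) denotes a centered normal law with variance \(v\)):

```latex
W(0) = 0 \ \text{a.s.}, \qquad
W(t) - W(s) \sim \mathcal{N}(0,\, t - s) \quad \text{for } 0 \le s < t,
```

with increments over disjoint intervals independent of one another and \(t \mapsto W(t)\) almost surely continuous.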
Canonical constructions include the Kolmogorov extension construction from finite-dimensional Gaussian distributions with covariance min(s, t); series expansions in Gaussian random variables, such as the Karhunen–Loève expansion, linked to Hilbert-space eigenfunctions and Mercer kernels; and scaling limits of random walks via invariance principles such as Donsker's theorem. Other representations use orthonormal bases of L²([0, 1]), as in the Lévy–Ciesielski construction, or stochastic integrals in the Itô theory developed by Kiyoshi Itô and extended by Malliavin.
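As a concrete illustration, the Karhunen–Loève expansion on [0, 1] can be truncated for simulation. The sketch below (function name and truncation level are illustrative choices, not from the source) uses the sine eigenbasis with eigen-frequencies (k − 1/2)π:

```python
import numpy as np

def kl_brownian(t, n_terms=1000, rng=None):
    """Truncated Karhunen-Loeve expansion of Brownian motion on [0, 1].

    W(t) ~ sqrt(2) * sum_k Z_k * sin((k - 1/2) pi t) / ((k - 1/2) pi),
    with Z_k i.i.d. standard normal; `n_terms` controls the truncation.
    """
    rng = np.random.default_rng(rng)
    t = np.asarray(t, dtype=float)
    k = np.arange(1, n_terms + 1)
    freqs = (k - 0.5) * np.pi                # KL eigen-frequencies
    z = rng.standard_normal(n_terms)         # i.i.d. Gaussian coefficients
    # Sum over the basis functions at each requested time point.
    return np.sqrt(2.0) * np.sin(np.outer(t, freqs)) @ (z / freqs)

# The truncated series reproduces Var W(t) = t up to a tail error ~ 1/n_terms:
t = np.array([0.25, 0.5, 1.0])
k = np.arange(1, 1001)
var_trunc = 2.0 * np.sum(np.sin(np.outer(t, (k - 0.5) * np.pi))**2
                         / ((k - 0.5) * np.pi)**2, axis=1)
```

The closed-form variance of the truncated series shows directly how the truncation error decays as more terms are kept.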
The finite-dimensional distributions are multivariate normal, determined by the covariance kernel min(s, t), a fact Kolmogorov leveraged in existence proofs and one that underlies studies of hitting times and local times by authors such as Yor and Rosen. Almost surely, sample paths are nowhere differentiable and exhibit fractal behavior quantified by Hausdorff dimension results from geometric measure theory, with rigorous computations due to Lévy and later authors such as Falconer. The law of the iterated logarithm, due to Khinchin, describes the limsup fluctuations of the paths, while hitting probabilities, polar sets, and boundary behavior are treated with methods from Hunt's probabilistic potential theory, formalized by Doob.
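The covariance kernel min(s, t) is easy to check by Monte Carlo; in this sketch the grid size, sample count, and seed are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_steps, T = 200_000, 10, 1.0
dt = T / n_steps
# Build Brownian paths on a grid by cumulating independent N(0, dt) increments.
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)    # paths[:, i] has the law of W((i+1)*dt)

s, t = 0.3, 0.8                          # two grid times with s < t
ws, wt = paths[:, 2], paths[:, 7]        # samples of W(0.3) and W(0.8)
emp_cov = np.mean(ws * wt)               # both marginals have mean zero
# Theory: Cov(W(s), W(t)) = min(s, t) = 0.3, so emp_cov should be near 0.3.
```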
The Wiener process is the prototypical driving noise in Itô calculus, where stochastic integrals and Itô's formula form the backbone of the theory of stochastic differential equations developed by Kiyoshi Itô and Watanabe; its quadratic variation over [0, t] equals t, a property fundamental to martingale representation theorems. Girsanov's theorem relates changes of measure to drift transformations, and the connection between Brownian martingales and harmonic functions goes back to classical potential theory, notably through the work of Kakutani and Doob. The Cameron–Martin theorem, named for Cameron and Martin, characterizes the shifts under which Wiener measure is quasi-invariant and underlies Gaussian measure analysis and the large deviation theory of Schilder and Varadhan.
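The statement that the quadratic variation over [0, T] equals T can be observed numerically by refining the partition and summing squared increments (the horizon and seed below are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(7)
T = 2.0
for n_steps in (100, 10_000, 1_000_000):
    # Increments of a Brownian path over a uniform partition of [0, T].
    dW = rng.normal(0.0, np.sqrt(T / n_steps), size=n_steps)
    qv = float(np.sum(dW**2))    # discrete quadratic variation over [0, T]
    print(n_steps, qv)           # tends to T = 2.0 as the mesh shrinks
```

The fluctuation of the discrete sum around T has standard deviation proportional to 1/√n_steps, which is why the printed values tighten around 2.0.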
Wiener-based models appear across physics, in the diffusion theories of Einstein and Smoluchowski; in finance, in the option-pricing models initiated by Black and Scholes and extended by Merton; in population genetics, in the diffusion approximations associated with Wright and Kimura; and in filtering theory, where the Kalman filter was extended to continuous-time noise by Kalman and Bucy and, in nonlinear settings, by Kushner. Variants include reflected Brownian motion studied by Harrison, fractional Brownian motion introduced by Mandelbrot and Van Ness, multi-parameter Brownian sheets, and Lévy processes, which generalize the Wiener process to allow jumps.
Simulation techniques include discrete-time random walk approximations justified by Donsker's invariance principle, truncated Karhunen–Loève expansions, and stochastic differential equation solvers such as the Euler–Maruyama and Milstein schemes, analyzed in the framework established by Kloeden and Platen. Variance reduction and quasi-Monte Carlo methods are standard tools in computational finance, as surveyed by Glasserman, while multilevel Monte Carlo methods are due to Giles. Convergence and stability analyses draw on the weak convergence frameworks of Billingsley and on results of Kurtz.
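A minimal sketch of the Euler–Maruyama scheme, using geometric Brownian motion as a test equation because its exact solution is known (the drift, volatility, seed, and step count below are illustrative assumptions):

```python
import numpy as np

def euler_maruyama_gbm(mu, sigma, x0, T, n_steps, rng):
    """Euler-Maruyama for dX = mu*X dt + sigma*X dW (geometric Brownian motion).

    Returns the numerical X(T) and the terminal Brownian value W(T), so the
    result can be compared with the exact solution driven by the same path.
    """
    dt = T / n_steps
    dW = rng.normal(0.0, np.sqrt(dt), size=n_steps)
    # One multiplicative Euler step per increment: x <- x*(1 + mu*dt + sigma*dW).
    x_T = x0 * np.prod(1.0 + mu * dt + sigma * dW)
    return x_T, dW.sum()

rng = np.random.default_rng(0)
mu, sigma, x0, T = 0.05, 0.2, 1.0, 1.0
x_num, w_T = euler_maruyama_gbm(mu, sigma, x0, T, 100_000, rng)
# Exact solution of GBM driven by the same Brownian path:
x_exact = x0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * w_T)
```

Comparing `x_num` against `x_exact` on the same driving path exhibits the strong convergence of the scheme as the step size shrinks.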