| Orthogonal polynomials | |
|---|---|
| Name | Orthogonal polynomials |
| Field | Mathematics |
| Introduced | 19th century |
| Notable | Carl Friedrich Gauss; Pafnuty Chebyshev; Adrien-Marie Legendre; Charles Hermite; Edmond Laguerre |
Orthogonal polynomials are sequences of polynomials that are pairwise orthogonal under an inner product defined by a measure; they appear across analysis, mathematical physics, and computational methods. The subject originates in work by Carl Friedrich Gauss, Adrien-Marie Legendre, and Pafnuty Chebyshev, and its modern development is closely tied to the spectral theory of operators in the tradition of David Hilbert and John von Neumann.
An orthogonal polynomial sequence is a sequence {p_n(x)} in which each p_n has degree n and ⟨p_m, p_n⟩ = 0 for m ≠ n with respect to an inner product defined by a measure μ. Favard's theorem characterizes such sequences in terms of their three-term recurrences, and the general construction rests on the orthogonalization procedure associated with Erhard Schmidt and the Hilbert space framework of David Hilbert. For a positive measure, the zeros of each p_n are real and simple, lie in the support of μ, and interlace with the zeros of p_{n+1}. Orthogonality under positive measures connects to the moment problems investigated by Thomas Jan Stieltjes and to the spectral analysis of self-adjoint operators developed by John von Neumann.
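The orthogonality relation above can be checked numerically. The sketch below (not from the article; a minimal illustration) evaluates Legendre polynomials via their standard recurrence and approximates ⟨P_m, P_n⟩ on [-1, 1] with a midpoint rule, confirming both orthogonality for m ≠ n and the known norms ‖P_n‖² = 2/(2n+1).

```python
# Numerically verify pairwise orthogonality of the first few Legendre
# polynomials on [-1, 1] (weight w(x) = 1). A minimal sketch using the
# standard three-term recurrence and a midpoint-rule integral.

def legendre(n, x):
    """Evaluate P_n(x) via (k+1) P_{k+1} = (2k+1) x P_k - k P_{k-1}."""
    if n == 0:
        return 1.0
    p_prev, p = 1.0, x
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1) * x * p - k * p_prev) / (k + 1)
    return p

def inner(m, n, steps=20000):
    """Midpoint-rule approximation of <P_m, P_n> = integral of P_m P_n over [-1, 1]."""
    h = 2.0 / steps
    return sum(legendre(m, -1 + (i + 0.5) * h) * legendre(n, -1 + (i + 0.5) * h)
               for i in range(steps)) * h

for m in range(4):
    for n in range(4):
        expected = 2.0 / (2 * m + 1) if m == n else 0.0  # known norm ||P_m||^2
        assert abs(inner(m, n) - expected) < 1e-3
```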
Classical families include the Legendre, Chebyshev, Hermite, and Laguerre polynomials, named for Adrien-Marie Legendre, Pafnuty Chebyshev, Charles Hermite, and Edmond Laguerre. The broader Jacobi family, after Carl Gustav Jacob Jacobi, contains the Legendre and Chebyshev polynomials as special cases. Orthogonal systems on discrete sets, such as the Charlier, Meixner, Krawtchouk, and Hahn families, are named for Carl Charlier, Josef Meixner, Mykhailo Krawtchouk, and Wolfgang Hahn. Generalizations such as q-analogues are organized by the Askey scheme, developed by Richard Askey and collaborators, including George Gasper and Mizan Rahman.
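Of the families above, the Chebyshev polynomials of the first kind have a particularly concrete characterization: T_n(cos θ) = cos(nθ). The sketch below (an illustration, not from the article) generates T_n from the recurrence T_{n+1}(x) = 2x T_n(x) − T_{n−1}(x) and checks this trigonometric identity.

```python
import math

# Chebyshev polynomials of the first kind satisfy T_n(cos t) = cos(n t).
# A minimal sketch: generate T_n via the recurrence
#   T_{n+1}(x) = 2 x T_n(x) - T_{n-1}(x),  T_0 = 1,  T_1 = x,
# then compare against the closed form on a few sample angles.

def chebyshev_T(n, x):
    if n == 0:
        return 1.0
    t_prev, t = 1.0, x
    for _ in range(1, n):
        t_prev, t = t, 2.0 * x * t - t_prev
    return t

for n in range(6):
    for theta in (0.3, 1.1, 2.5):
        assert abs(chebyshev_T(n, math.cos(theta)) - math.cos(n * theta)) < 1e-12
```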
Weight functions defining the inner products are central: for the Legendre polynomials the weight is 1 on [-1, 1], with zeros underlying the quadrature rules of Carl Friedrich Gauss, while the Hermite polynomials use the weight e^{-x^2} on (-∞, ∞) and arise in the quantum harmonic oscillator studied by Erwin Schrödinger and Paul Dirac. Measures may be continuous, discrete, or singular, and the associated moment problems were advanced by Thomas Jan Stieltjes, Marcel Riesz, and Frigyes Riesz. Nonclassical weights, including varying exponential and oscillatory measures, remain an active topic in approximation theory. Orthogonality with respect to measures on contours in the complex plane connects to the Riemann–Hilbert problems named for Bernhard Riemann and David Hilbert.
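The role of the weight function can be made concrete for the Hermite case. The sketch below (an illustration under the physicists' normalization, not from the article) computes ⟨H_m, H_n⟩ with weight e^{-x²} by a truncated midpoint rule, checking orthogonality and the known norm ∫ H_n² e^{-x²} dx = 2ⁿ n! √π.

```python
import math

# Physicists' Hermite polynomials via H_{k+1} = 2 x H_k - 2 k H_{k-1},
# with inner product weighted by e^{-x^2}. The integral over (-inf, inf)
# is truncated to [-8, 8]; the weight makes the tail negligible.

def hermite(n, x):
    if n == 0:
        return 1.0
    h_prev, h = 1.0, 2.0 * x
    for k in range(1, n):
        h_prev, h = h, 2.0 * x * h - 2.0 * k * h_prev
    return h

def weighted_inner(m, n, a=8.0, steps=40000):
    """Midpoint-rule approximation of the weighted inner product <H_m, H_n>."""
    h = 2 * a / steps
    return sum(hermite(m, x) * hermite(n, x) * math.exp(-x * x)
               for x in (-a + (i + 0.5) * h for i in range(steps))) * h

assert abs(weighted_inner(1, 2)) < 1e-4                      # orthogonality
norm2 = 2**2 * math.factorial(2) * math.sqrt(math.pi)        # = 8 * sqrt(pi)
assert abs(weighted_inner(2, 2) - norm2) < 1e-4              # known norm
```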
Three-term recurrence relations characterize orthogonal polynomials, with coefficients tied to the spectral measure, a correspondence formalized in Favard's theorem. The classical families satisfy second-order differential equations, namely Legendre's, Hermite's, and Laguerre's equations, which fit the Sturm–Liouville framework associated with Joseph Liouville. Discrete orthogonal polynomials satisfy analogous difference equations. The recurrence coefficients assemble into tridiagonal Jacobi matrices, operators studied in the spectral theory developed by Marshall Stone and John von Neumann.
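The link between recurrence coefficients and Jacobi matrices can be made explicit: for the orthonormal Legendre polynomials the Jacobi matrix has zero diagonal and off-diagonal entries b_k = k/√(4k²−1), and the eigenvalues of its n×n truncation are the zeros of P_n. The sketch below (an illustration, not from the article) solves the 3×3 case in closed form and confirms the eigenvalues are the zeros of P_3.

```python
import math

# For orthonormal Legendre polynomials the Jacobi matrix is tridiagonal with
# zero diagonal and off-diagonal entries b_k = k / sqrt(4 k^2 - 1). The
# eigenvalues of its n x n truncation are the zeros of P_n. For n = 3 the
# characteristic polynomial is t * (t^2 - (b1^2 + b2^2)) = 0, solvable by hand.

b1 = 1 / math.sqrt(3)    # k = 1
b2 = 2 / math.sqrt(15)   # k = 2

r = math.sqrt(b1**2 + b2**2)        # b1^2 + b2^2 = 1/3 + 4/15 = 3/5
eigenvalues = sorted([-r, 0.0, r])

def P3(x):
    """Legendre polynomial P_3(x) = (5 x^3 - 3 x) / 2."""
    return 0.5 * (5 * x**3 - 3 * x)

# Each eigenvalue of the truncated Jacobi matrix is a zero of P_3,
# whose zeros are known to be 0 and +/- sqrt(3/5).
for t in eigenvalues:
    assert abs(P3(t)) < 1e-9
assert abs(eigenvalues[2] - math.sqrt(3 / 5)) < 1e-9
```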
Gaussian quadrature, spectral methods, and approximation theory all exploit orthogonal polynomials; Gaussian quadrature traces to Carl Friedrich Gauss and remains a staple of numerical integration. Polynomial approximation underpins methods in the work of Andrey Kolmogorov, Sergei Bernstein, and Norbert Wiener, and feeds into the finite element methods pioneered at the Courant Institute. Signal processing and data compression use orthogonal bases in the tradition of Claude Shannon and Harry Nyquist, while applications to stochastic processes include the Hermite-based chaos expansions of Norbert Wiener. Numerical stability, conditioning, and fast transforms draw on the research of Gene Golub and J. H. Wilkinson and on widely used numerical libraries.
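Gaussian quadrature uses the zeros of P_n as nodes; an n-point rule integrates polynomials of degree up to 2n−1 exactly. The sketch below (an illustration, not from the article) implements the classical 2-point Gauss–Legendre rule on [-1, 1], with nodes ±1/√3 (the zeros of P_2) and weights 1, and shows where exactness ends.

```python
import math

# Two-point Gauss-Legendre quadrature on [-1, 1]: nodes are the zeros of
# P_2(x) = (3 x^2 - 1) / 2, i.e. +/- 1/sqrt(3), each with weight 1.
# Exact for all polynomials of degree <= 2*2 - 1 = 3.

nodes = (-1 / math.sqrt(3), 1 / math.sqrt(3))
weights = (1.0, 1.0)

def gauss2(f):
    """Apply the 2-point Gauss-Legendre rule to f on [-1, 1]."""
    return sum(w * f(x) for w, x in zip(weights, nodes))

# Exact for degree 3: integral of x^3 + x^2 over [-1, 1] is 2/3.
assert abs(gauss2(lambda x: x**3 + x**2) - 2 / 3) < 1e-12

# Exactness fails at degree 4: the true integral of x^4 is 2/5,
# but the 2-point rule returns 2 * (1/3)^2 = 2/9.
assert abs(gauss2(lambda x: x**4) - 2 / 9) < 1e-12
```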
Orthogonal polynomials interlink with the hypergeometric and basic hypergeometric functions studied by Carl Friedrich Gauss, Bernhard Riemann, and W. N. Bailey; the Askey scheme, due to Richard Askey and collaborators, synthesizes these connections. Representation-theoretic interpretations relate the polynomials to the group representations of Élie Cartan and Hermann Weyl and to the quantum groups of Vladimir Drinfeld and Michio Jimbo. Applications to integrable systems, random matrix theory, and combinatorics include contributions by Freeman Dyson and Mark Kac. Orthogonal polynomial techniques also appear in the matrix models connected to the work of Edward Witten and Maxim Kontsevich.