
SVD

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
SVD
Name: SVD
Caption: Singular value decomposition schematic
Invented by: Eugenio Beltrami, Camille Jordan, Erhard Schmidt, Carl Eckart, Gale Young
First published: 1873–1936
Field: Linear algebra, Numerical analysis, Functional analysis

SVD (singular value decomposition) is a matrix factorization, rooted in the functional analysis of David Hilbert's school and in 20th-century linear algebra, that expresses a rectangular matrix as a product of three matrices revealing geometric, numerical, and statistical structure. It links classical results by Eugenio Beltrami, Camille Jordan, Erhard Schmidt, and John von Neumann to later computational advances by Gene Golub, James Wilkinson, and Lloyd Trefethen. The decomposition underpins methods in signal processing, approximation theory, and modern machine learning.

Definition and overview

The singular value decomposition represents an m×n matrix A over the real or complex field as A = U Σ V*, where U is an m×m unitary matrix, Σ is an m×n rectangular diagonal matrix with nonnegative entries (the singular values), and V* is the conjugate transpose of an n×n unitary matrix V. The columns of U and V form orthonormal bases, a concept central to the Hilbert-space framework of David Hilbert and John von Neumann. Singular values generalize the absolute values of eigenvalues of Hermitian matrices and determine the unitarily invariant matrix norms used throughout numerical analysis in the tradition of James H. Wilkinson.
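
A minimal sketch of the factorization in NumPy (the test matrix is an arbitrary illustration):

    import numpy as np

    # An arbitrary 4x3 real matrix.
    A = np.array([[1.0, 0.0, 2.0],
                  [0.0, 3.0, 0.0],
                  [4.0, 0.0, 5.0],
                  [0.0, 6.0, 0.0]])

    # Full SVD: U is 4x4, s holds the singular values (descending), Vt is V*.
    U, s, Vt = np.linalg.svd(A, full_matrices=True)

    # Rebuild the rectangular diagonal factor Sigma and verify A = U Sigma V*.
    Sigma = np.zeros(A.shape)
    Sigma[:len(s), :len(s)] = np.diag(s)
    assert np.allclose(A, U @ Sigma @ Vt)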

Mathematical formulation

For A ∈ R^{m×n} (or C^{m×n}), there exist orthonormal sets of left singular vectors {u_i} in R^m and right singular vectors {v_i} in R^n such that A v_i = σ_i u_i and A^* u_i = σ_i v_i, with σ_1 ≥ σ_2 ≥ ... ≥ 0. The nonzero σ_i are the square roots of the nonzero eigenvalues of A^* A and A A^*. Existence and ordering rest on spectral results associated with Issai Schur, John von Neumann, and Hermann Weyl; Weyl's inequalities also give perturbation bounds for singular values, and condition-number analysis in the spirit of Alan Turing's rounding-error work governs their numerical stability. The Eckart–Young theorem, proved by Carl Eckart and Gale Young, characterizes the best rank-k approximation in the Frobenius (and spectral) norm as the truncated decomposition that keeps only the k largest singular values.
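
A minimal sketch of Eckart–Young truncation, again in NumPy (the rank k and the random test matrix are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 30))

    # Thin SVD, singular values in descending order.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    # Best rank-k approximation in the Frobenius and spectral norms.
    k = 5
    A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # The Frobenius error equals the root sum of squares of discarded singular values.
    err = np.linalg.norm(A - A_k, 'fro')
    assert np.isclose(err, np.sqrt(np.sum(s[k:] ** 2)))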

Properties and theoretical results

Singular values are unitarily invariant and determine operator norms such as the spectral norm (the largest singular value) and the nuclear norm (their sum). Weyl-type inequalities due to Hermann Weyl and interlacing results relate the singular values of submatrices to those of the full matrix; majorization and von Neumann's trace inequality provide ordering relations explored in the matrix analysis texts of Roger Horn and Charles R. Johnson. The compact-operator analog on Hilbert spaces links to the spectral theorem of David Hilbert and John von Neumann, while perturbation theory connects to Davis–Kahan-type results of Chandler Davis and William Kahan; condition numbers and backward error analysis were developed by James Wilkinson and Gene Golub.
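
For instance, the spectral and nuclear norms can be read directly off the singular values (a small NumPy check on an arbitrary random matrix):

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((20, 10))
    s = np.linalg.svd(A, compute_uv=False)  # singular values only

    # Spectral norm = largest singular value; nuclear norm = their sum.
    assert np.isclose(np.linalg.norm(A, 2), s[0])
    assert np.isclose(np.linalg.norm(A, 'nuc'), s.sum())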

Computation and algorithms

Numerical computation of the SVD has been advanced by algorithms such as Golub–Kahan bidiagonalization, QR iteration, and divide-and-conquer methods, with key contributions from Gene H. Golub, William Kahan, James H. Wilkinson, Beresford Parlett, and Alan Edelman. Practical implementations appear in LAPACK, in libraries from the Numerical Algorithms Group, and in language ecosystems around Fortran, MATLAB, Python's NumPy, and Julia. Randomized algorithms for approximate SVD, popularized by researchers including Petros Drineas, Michael W. Mahoney, and Nathan Halko, handle large-scale data.
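
A minimal sketch of the randomized approach in the style of Halko and collaborators (the target rank, oversampling parameter, and test matrix are illustrative assumptions, not a production implementation):

    import numpy as np

    def randomized_svd(A, k, oversample=10):
        """Approximate rank-k SVD via a Gaussian sketch and one QR factorization."""
        n = A.shape[1]
        # Sample the range of A with a random Gaussian test matrix.
        Omega = np.random.default_rng(2).standard_normal((n, k + oversample))
        Q, _ = np.linalg.qr(A @ Omega)      # orthonormal basis for range(A @ Omega)
        # Project A onto that basis and take a small dense SVD.
        B = Q.T @ A                         # (k + oversample) x n
        Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
        return (Q @ Ub)[:, :k], s[:k], Vt[:k, :]

    A = np.random.default_rng(3).standard_normal((1000, 200))
    U, s, Vt = randomized_svd(A, k=10)
    print(np.linalg.norm(A - U @ np.diag(s) @ Vt, 'fro'))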

Applications

SVD is central to principal component analysis, introduced in statistics by Karl Pearson and developed by Harold Hotelling; to latent semantic analysis, popularized in information retrieval by Scott Deerwester and collaborators, building on the vector space models of Gerard Salton; and to collaborative filtering in recommender systems, advanced notably during the Netflix Prize competition. In signal processing it underlies methods from Norbert Wiener's filter theory to Markov-model-based time-series analysis; in control theory it appears in model reduction linked to Rudolf E. Kálmán and David Luenberger. Image compression and denoising applications trace to early digital-image work at Bell Labs; quantum information exploits the Schmidt decomposition, which is the SVD of a bipartite state's coefficient matrix, in studies by John Preskill and Peter Shor. Applications extend to gene expression analysis in computational biology at institutions such as the Broad Institute and to econometrics in groups influenced by the Cowles Commission.
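
As one concrete case, principal component analysis reduces to an SVD of the centered data matrix (a minimal NumPy sketch with synthetic data; the number of components is an arbitrary choice):

    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.standard_normal((200, 8))      # 200 samples, 8 features

    # Center the data, then take the SVD of the centered matrix.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    # Principal directions are the rows of Vt; scores are projections onto them.
    n_components = 2
    scores = Xc @ Vt[:n_components].T      # equivalently U[:, :2] * s[:2]

    # Explained variance per component follows from the singular values.
    explained_var = s[:n_components] ** 2 / (len(X) - 1)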

Variants and generalizations

Generalizations include the singular value theory of compact operators on Hilbert spaces (building on David Hilbert and Erhard Schmidt), the closely related polar decomposition, the generalized singular value decomposition (GSVD) used in regularization contexts by Per Christian Hansen and others, and tensor decompositions such as the higher-order SVD, surveyed in multilinear algebra by Tamara G. Kolda and Brett W. Bader. Low-rank matrix completion and nuclear-norm minimization draw on convex optimization advances by Emmanuel Candès and Benjamin Recht.
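
For example, the polar decomposition follows directly from the SVD (a minimal NumPy sketch on an arbitrary square matrix):

    import numpy as np

    rng = np.random.default_rng(5)
    A = rng.standard_normal((4, 4))

    # From A = U Sigma V*, the polar factors are Q = U V* (orthogonal)
    # and P = V Sigma V* (symmetric positive semidefinite), with A = Q P.
    U, s, Vt = np.linalg.svd(A)
    Q = U @ Vt
    P = Vt.T @ np.diag(s) @ Vt
    assert np.allclose(A, Q @ P)
    assert np.allclose(Q.T @ Q, np.eye(4))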

Historical development and contributors

Early roots trace to 19th-century work by Eugenio Beltrami, Camille Jordan, and James Joseph Sylvester, and to the operator concepts of David Hilbert. Formal statements for integral operators emerged in the early 20th century through contributions from Erhard Schmidt and Hermann Weyl, culminating in the explicit rectangular-matrix decomposition of Carl Eckart and Gale Young in the 1930s. The mid-20th century saw theoretical consolidation by John von Neumann and computational maturation through the efforts of Alston Householder, Gene Golub, and James Wilkinson; algorithmic and software ecosystems expanded with contributions from Jack Dongarra and the teams behind LAPACK. Contemporary research integrates random matrix theory and machine learning developments at institutions such as the University of California, Berkeley and Stanford University.

Category:Linear algebra