| Matrix | |
|---|---|
| Name | Matrix |
| Type | Mathematical object |
| Field | Linear algebra, Multilinear algebra, Numerical analysis |
| Introduced | Antiquity–19th century |
| Notable | Arthur Cayley, James Joseph Sylvester, Carl Friedrich Gauss |
A matrix is a rectangular array of quantities arranged in rows and columns, used to represent linear transformations, systems of linear equations, bilinear forms, and data structures. Matrices serve as a unifying language across Linear algebra, Functional analysis, Quantum mechanics, Computer graphics, and Statistics, enabling computation of transformations, decompositions, and spectral properties. They are described using notation for entries, indices, sizes, and special types that facilitates algebraic manipulation and numerical implementation.
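As a minimal sketch of the idea that a matrix represents a linear transformation, the following pure-Python snippet stores an m×n matrix as a list of rows and applies it to a vector; the helper name `mat_vec` is illustrative, not from any library.

```python
def mat_vec(A, x):
    """Multiply an m×n matrix A (a list of rows) by a length-n vector x."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

# Example: the linear transformation that rotates the plane by 90 degrees.
R = [[0, -1],
     [1,  0]]
print(mat_vec(R, [1, 0]))  # → [0, 1]
```

Applying `R` sends the basis vector (1, 0) to (0, 1), which is exactly the 90-degree rotation the matrix encodes.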
A matrix is denoted by a capital letter and defined by its entries a_{ij}, where i indexes rows and j indexes columns; common indexing conventions appear in works by Leonhard Euler and later formalizations by Carl Friedrich Gauss. The shape m×n indicates m rows and n columns, and special symbols denote operations: the transpose A^{T}, the inverse A^{-1} for nonsingular square matrices, the determinant det(A), and the trace tr(A); these conventions are standardized in texts by David Hilbert, Emmy Noether, and John von Neumann. Notation for block matrices, Kronecker products, and Einstein summation appears in treatments associated with James Clerk Maxwell, Hermann Grassmann, and Albert Einstein; index notation is prevalent in studies connected to Élie Cartan and Bernhard Riemann. Commonly used basis-dependent representations employ canonical bases from the pedagogy of Augustin-Louis Cauchy, with algorithmic descriptions appearing in publications of Ada Lovelace-era proto-computation.
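The notational operations above (A^{T}, A^{-1}, det(A), tr(A)) can be sketched in a few lines of pure Python for the 2×2 case; the function names here are illustrative, and the inverse formula is the standard adjugate-over-determinant rule for 2×2 matrices.

```python
def transpose(A):
    """A^T: swap rows and columns."""
    return [list(col) for col in zip(*A)]

def trace(A):
    """tr(A): sum of diagonal entries of a square matrix."""
    return sum(A[i][i] for i in range(len(A)))

def det2(A):
    """det(A) for a 2x2 matrix: ad - bc."""
    (a, b), (c, d) = A
    return a * d - b * c

def inv2(A):
    """A^{-1} for a nonsingular 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return [[ d / det, -b / det],
            [-c / det,  a / det]]

A = [[4, 7], [2, 6]]
print(transpose(A))  # → [[4, 2], [7, 6]]
print(trace(A))      # → 10
print(det2(A))       # → 10
```

For this A, the determinant is 4·6 − 7·2 = 10, so the inverse exists and inv2(A) scales the adjugate by 1/10.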
Matrices come in many named varieties with algebraic and spectral properties: square matrices, diagonal matrices, scalar matrices, identity matrices, triangular matrices, symmetric matrices, skew-symmetric matrices, orthogonal matrices, unitary matrices, Hermitian matrices, positive definite matrices, stochastic matrices, sparse matrices, and rank-deficient matrices; classical classification traces through the work of Arthur Cayley and James Joseph Sylvester. Spectral properties involve eigenvalues and eigenvectors as studied in Joseph Fourier-era analysis and developed by David Hilbert and John von Neumann for compact operators; the spectral theorem for normal matrices generalizes diagonalization results traced to the work of Augustin-Louis Cauchy and Évariste Galois. Structural invariants such as rank, determinant, characteristic polynomial, minimal polynomial, and singular values underpin classification schemes used in research by Évariste Galois and in modern expositions adjacent to Alonzo Church's logic of computability. Special matrices like Toeplitz, circulant, Hankel, Vandermonde, companion, and incidence matrices appear in algebraic combinatorics and signal processing literature influenced by Leonhard Euler, Brook Taylor, and Gustav Kirchhoff.
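Two of the notions above, structural classification and spectral properties, can be illustrated with a symmetry check and a basic power iteration that approximates the dominant eigenvalue; this is a sketch under the assumption of a dominant real eigenvalue, and the helper names are hypothetical.

```python
def is_symmetric(A):
    """A square matrix is symmetric when A[i][j] == A[j][i] for all i, j."""
    n = len(A)
    return all(A[i][j] == A[j][i] for i in range(n) for j in range(n))

def power_iteration(A, iters=100):
    """Approximate the dominant eigenpair by repeated matrix-vector products."""
    n = len(A)
    x = [1.0] * n
    for _ in range(iters):
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = max(abs(v) for v in y)
        x = [v / norm for v in y]
    # Rayleigh quotient gives the eigenvalue estimate for the converged vector.
    Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    lam = sum(Ax[i] * x[i] for i in range(n)) / sum(v * v for v in x)
    return lam, x

A = [[2, 1], [1, 2]]  # symmetric; eigenvalues are 3 and 1
lam, _ = power_iteration(A)
print(is_symmetric(A), round(lam, 6))  # → True 3.0
```

Because A is symmetric, the spectral theorem guarantees real eigenvalues and an orthogonal eigenbasis; power iteration converges here to the larger eigenvalue 3 with eigenvector direction (1, 1).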
Matrix addition, scalar multiplication, and matrix multiplication define an associative algebra over a field; these algebraic structures were formalized in the 19th century through work by Arthur Cayley and Hermann Grassmann. Composition corresponds to multiplication, and noncommutativity leads to rich theory connecting to Group theory results explored by Évariste Galois and Sophus Lie in continuous symmetry contexts. Inversion, LU decomposition, QR decomposition, Cholesky decomposition, singular value decomposition (SVD), and Jordan normal form are algorithmic and structural tools developed across contributions from Carl Friedrich Gauss, André-Louis Cholesky, John von Neumann, and Camille Jordan. Determinant properties, Cramer's rule, adjugate matrices, and minors link to elimination methods of Carl Friedrich Gauss and matrix-tree theorems associated with Gustav Kirchhoff. Commutator brackets and Lie algebra structures appear in investigations by Sophus Lie and later applications to representation theory studied by Élie Cartan.
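A minimal sketch of two of the tools above, matrix multiplication and LU decomposition, follows; the LU routine is the Doolittle variant without pivoting, so it assumes nonzero pivots, and the function names are illustrative.

```python
def mat_mul(A, B):
    """Multiply an n×m matrix A by an m×p matrix B."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def lu_decompose(A):
    """Doolittle LU factorization without pivoting (assumes nonzero pivots).

    Returns unit lower-triangular L and upper-triangular U with A = L U.
    """
    n = len(A)
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    U = [row[:] for row in A]  # copy; U is reduced in place
    for k in range(n):
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]        # elimination multiplier
            for j in range(k, n):
                U[i][j] -= L[i][k] * U[k][j]   # subtract scaled pivot row
    return L, U

A = [[4.0, 3.0], [6.0, 3.0]]
L, U = lu_decompose(A)
print(mat_mul(L, U))  # reconstructs A
```

Multiplying the factors back together recovers A, which is the defining property of the decomposition; production code would add partial pivoting for numerical stability, as Gaussian elimination does.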
Matrices appear ubiquitously: in solving linear systems arising in Carl Friedrich Gauss's least squares and geodesy, in discretizations of partial differential equations studied by Jean le Rond d'Alembert and Joseph-Louis Lagrange, in quantum mechanics formulations by Erwin Schrödinger and Paul Dirac, and in state-space models of Norbert Wiener-inspired control theory. In computer science, matrices underpin algorithms in Alan Turing-inspired computability, machine learning architectures developed in modern research communities, and graphics pipelines used in James Foley-era computer graphics and Tim Sweeney-era engines. In statistics and econometrics, covariance matrices and design matrices are central to methods advanced by Ronald Fisher and Trygve Haavelmo. Network analysis, graph Laplacians, and adjacency matrices are foundational for studies in Paul Erdős-inspired combinatorics and modern social-network research. Signal processing employs convolution matrices, Toeplitz structures, and Fourier-related matrices linking to Joseph Fourier and Claude Shannon.
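As a small illustration of the network-analysis use above, the following sketch builds the graph Laplacian L = D − A from an adjacency matrix, assuming an undirected graph; the helper name is hypothetical.

```python
def laplacian(adj):
    """Graph Laplacian L = D - A: degree matrix minus adjacency matrix."""
    n = len(adj)
    deg = [sum(row) for row in adj]  # degree of each vertex
    return [[(deg[i] if i == j else 0) - adj[i][j] for j in range(n)]
            for i in range(n)]

# Path graph on 3 vertices: 0 - 1 - 2
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
print(laplacian(A))  # → [[1, -1, 0], [-1, 2, -1], [0, -1, 1]]
```

Each row of the Laplacian sums to zero, reflecting that the constant vector is always in its kernel, a fact exploited in spectral graph theory and Kirchhoff's matrix-tree theorem.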
Early implicit uses of arrays date to Chinese mathematics and Indian mathematics traditions, while systematic symbolic treatment emerged in the 17th–19th centuries with contributions from Gottfried Wilhelm Leibniz, Gabriel Cramer, and Carl Friedrich Gauss. The explicit term and algebraic theory were advanced by Arthur Cayley and James Joseph Sylvester in mid-19th-century London, followed by structural and spectral generalizations by Camille Jordan and David Hilbert. Twentieth-century developments tied matrix theory to operator theory and quantum mechanics through John von Neumann and Paul Dirac, and numerical linear algebra matured with algorithmic frameworks from Alan Turing and later computational mathematicians. Contemporary expansions integrate matrices into data science, optimization, and high-performance computing driven by institutions such as National Aeronautics and Space Administration research programs and industrial advances from companies influenced by John McCarthy-era artificial intelligence work.