LLMpedia
The first transparent, open encyclopedia generated by LLMs

Semidefinite programming

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Parent: Subhash Khot (Hop 5)
Expansion Funnel: Raw 49 → Dedup 0 → NER 0 → Enqueued 0
1. Extracted: 49
2. After dedup: 0
3. After NER: 0
4. Enqueued: 0
Semidefinite programming
Name: Semidefinite programming
Type: Convex optimization
Fields: Optimization, Applied mathematics, Control theory, Theoretical computer science
Introduced: 1990s
Notable: Interior-point methods, Lovász theta function, Max-cut approximation

Semidefinite programming is a class of convex optimization problems that generalizes linear programming and quadratic programming and plays a central role in modern optimization, control, and theoretical computer science. Rooted in the development of interior-point methods and matrix analysis, it connects to landmark results and institutions across mathematics and engineering, and has been applied by researchers at institutions such as Princeton University, the Massachusetts Institute of Technology, Stanford University, the University of California, Berkeley, and Bell Labs.

Definition and problem statement

The canonical form optimizes a linear functional over the cone of symmetric positive semidefinite matrices subject to linear constraints; this formulation arose in work associated with Yale University and IBM Research during the 1990s and was popularized through contributions from researchers affiliated with Cornell University and Rutgers University. A standard primal instance minimizes trace(CX) subject to trace(A_i X) = b_i for i = 1, …, m and X ⪰ 0, where C and the A_i are symmetric matrices; this form appears throughout the literature of SIAM conferences and journals edited by scholars at Princeton Plasma Physics Laboratory and Los Alamos National Laboratory. Practical modeling systems at organizations such as MathWorks and Microsoft adopted such formulations for control design developed at Caltech and NASA laboratories.
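As a minimal sketch of this primal form, the following checks feasibility of a candidate X for a hypothetical 2×2 instance (the data C, A1, b1 are illustrative, not from any cited work):

```python
import numpy as np

# Hypothetical toy instance: minimize trace(C X) s.t. trace(A1 X) = b1, X ⪰ 0.
C = np.array([[1.0, 0.0],
              [0.0, 2.0]])
A1 = np.eye(2)          # constraint: trace(X) = 1
b1 = 1.0

# Candidate solution: put all weight on the cheaper diagonal entry.
X = np.array([[1.0, 0.0],
              [0.0, 0.0]])

constraint_ok = np.isclose(np.trace(A1 @ X), b1)          # linear constraint holds
psd_ok = np.all(np.linalg.eigvalsh(X) >= -1e-9)           # X ⪰ 0 up to tolerance
objective = np.trace(C @ X)                               # objective value at X
```

For this instance, minimizing trace(CX) under trace(X) = 1 amounts to finding the smallest eigenvalue of C, which is why the candidate X concentrates on the first diagonal entry.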

Mathematical properties and duality

Semidefinite programs exhibit strong duality under Slater-type conditions associated with interior-point theory advanced by researchers connected to IBM T.J. Watson Research Center and Bell Labs. The dual problem maximizes b^T y subject to C − Σ_i y_i A_i ⪰ 0, with complementary slackness characterizations used in analysis at Harvard University and Yale University. Eigenvalue interlacing and spectral decomposition theorems from work influenced by John von Neumann and studied at the University of Cambridge and the University of Oxford underpin feasibility and sensitivity analyses; perturbation results used in robust control trace back to seminars at ETH Zurich and Imperial College London.
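The dual relationship can be sketched numerically on the same kind of toy instance (again hypothetical data): the dual slack matrix S = C − Σ_i y_i A_i must be PSD, and at optimality the duality gap trace(CX) − b^T y equals trace(SX) = 0:

```python
import numpy as np

# Toy primal: min trace(C X) s.t. trace(X) = 1, X ⪰ 0 (hypothetical instance).
C = np.diag([1.0, 2.0])
A1 = np.eye(2)
b = np.array([1.0])

# Dual: maximize b^T y subject to S = C - y[0]*A1 ⪰ 0.
# The largest feasible y[0] is the smallest eigenvalue of C, here 1.0.
y = np.array([1.0])
S = C - y[0] * A1
dual_feasible = np.all(np.linalg.eigvalsh(S) >= -1e-9)

# Primal optimum concentrates on the eigenvector of that smallest eigenvalue.
X = np.diag([1.0, 0.0])
gap = np.trace(C @ X) - b @ y    # duality gap, zero under strong duality
comp = np.trace(S @ X)           # complementary slackness: trace(S X) = 0
```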

Algorithms and solvers

Practical solution methods include interior-point algorithms whose complexity bounds echo results from Karmarkar’s work and were implemented by teams at Bell Labs and AT&T; large-scale first-order methods were later advanced by groups at Google and Facebook for machine learning tasks. Popular solvers such as SDPA, SeDuMi, and MOSEK originated from collaborations involving Tokyo Institute of Technology, Vrije Universiteit Amsterdam, and Chalmers University of Technology, while toolboxes integrating semidefinite capabilities were developed at MathWorks and in software maintained by researchers at ETH Zurich. Recent randomized and sketching methods for scalability have ties to research labs at Stanford University and Microsoft Research.
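First-order methods of the kind mentioned above typically alternate gradient steps with Euclidean projection onto the PSD cone; a minimal numpy sketch of that projection step (the matrix is an illustrative example, and this is only one ingredient of a full solver) is:

```python
import numpy as np

def project_psd(M):
    """Euclidean projection of a symmetric matrix onto the PSD cone:
    eigendecompose and clip negative eigenvalues to zero."""
    w, V = np.linalg.eigh(M)
    return (V * np.clip(w, 0.0, None)) @ V.T

# Illustrative indefinite matrix: one positive, one negative eigenvalue.
M = np.array([[2.0,  0.0],
              [0.0, -3.0]])
P = project_psd(M)   # the negative direction is zeroed out
```

This eigenvalue-clipping step is the dominant per-iteration cost in many first-order SDP schemes, which is one motivation for the randomized and sketching methods mentioned above.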

Applications

Semidefinite formulations have been applied to combinatorial optimization exemplified by the Lovász theta function used in work at Princeton University and to approximation algorithms for MAX-CUT influenced by results associated with Harvard University and MIT. In control theory they underpin linear matrix inequalities employed by engineers at NASA and General Electric for robust stabilization and H-infinity synthesis; signal processing deployments were advanced by teams at Bell Labs and Nokia, while quantum information tasks leverage semidefinite constraints in studies at Perimeter Institute and University of Waterloo. Applications in machine learning, including kernel methods and metric learning, proliferated through collaborations at Carnegie Mellon University, Yahoo! Research, and DeepMind.
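The MAX-CUT application admits a short illustration of randomized hyperplane rounding. The sketch below hard-codes the known SDP-optimal vectors for a triangle graph (unit vectors 120° apart) rather than calling a solver, so the graph and data are illustrative assumptions:

```python
import numpy as np

# SDP-optimal MAX-CUT vectors for the triangle: unit vectors 120° apart in 2D.
angles = np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
V = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # one row per vertex

edges = [(0, 1), (1, 2), (0, 2)]
rng = np.random.default_rng(0)

def round_cut(V, edges, rng):
    """Sign each vertex by a random hyperplane; count edges crossing the cut."""
    r = rng.normal(size=V.shape[1])
    signs = np.sign(V @ r)
    return sum(1 for i, j in edges if signs[i] != signs[j])

# Best cut over several random hyperplanes; the triangle's max cut is 2.
best = max(round_cut(V, edges, rng) for _ in range(50))
```

Because the three vectors sum to zero, any random hyperplane separates them into two nonempty groups, so the rounding recovers the optimal cut of 2 on this toy graph.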

Complexity and theoretical results

Complexity analyses relate semidefinite feasibility to classes studied at Princeton University and University of California, Berkeley; the polynomial-time solvability via interior-point methods links to foundational algorithmic work acknowledged by the ACM and researchers affiliated with Stanford University. Hardness results for specific structured relaxations were explored in research groups at University of Chicago and Columbia University, while integrality gap constructions and approximation limits have been developed in collaboration among scholars at Microsoft Research, ETH Zurich, and Tel Aviv University. Connections to spectral graph theory cite classic work from Bell Labs and University of Illinois at Urbana–Champaign.

Extensions and related problems

Extensions include conic programming studies that blend semidefinite constraints with second-order cones researched at Princeton Plasma Physics Laboratory and multi-criteria formulations used by analysts at Duke University; sum-of-squares hierarchies and polynomial optimization built on semidefinite relaxations were elaborated by teams at the Institute for Advanced Study and the University of Toronto. Related problems such as positive semidefinite matrix completion, matrix factorization, and quantum state tomography have been advanced by researchers at IBM Research, Perimeter Institute, and Los Alamos National Laboratory.
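PSD matrix completion can be illustrated on a hypothetical 2×2 instance (not drawn from any cited work): fix the diagonal and test which off-diagonal values x complete [[1, x], [x, 4]] to a PSD matrix, which for a 2×2 matrix reduces to the determinant condition |x| ≤ 2:

```python
import numpy as np

def is_psd_completion(x):
    """Does off-diagonal value x complete the fixed diagonal (1, 4)
    to a positive semidefinite matrix?"""
    M = np.array([[1.0, x],
                  [x, 4.0]])
    return bool(np.all(np.linalg.eigvalsh(M) >= -1e-9))

# |x| <= 2 is feasible: 4 - x^2 >= 0 with a positive diagonal.
feasible = [is_psd_completion(x) for x in (-3.0, 0.0, 2.0, 3.0)]
```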

Category:Optimization