LLMpedia: The first transparent, open encyclopedia generated by LLMs

Quadratic programming

Generated by DeepSeek V3.2
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Parent: SVR Hop 4
Expansion Funnel: Raw 60 → Dedup 0 → NER 0 → Enqueued 0

Quadratic programming is a specialized branch of mathematical optimization that focuses on problems where the objective function is a quadratic function and the constraints are linear. It is a fundamental subclass of nonlinear programming and a generalization of both linear programming and least squares optimization. The field finds extensive use in disciplines ranging from financial economics to engineering design, where modeling relationships with squared terms is essential.

Definition and formulation

The standard form minimizes a quadratic objective subject to linear equality and inequality constraints. A canonical formulation is to minimize \tfrac{1}{2} \mathbf{x}^\top Q \mathbf{x} + \mathbf{c}^\top \mathbf{x} subject to A\mathbf{x} \leq \mathbf{b} and often E\mathbf{x} = \mathbf{d}, where Q is a symmetric matrix, \mathbf{x} denotes the vector of decision variables, and A and E are matrices defining the constraints. When the Hessian matrix Q is positive semidefinite, the problem is convex, and the Karush–Kuhn–Tucker conditions are then both necessary and sufficient for a global optimum. Important special cases include portfolio optimization problems, such as those in Harry Markowitz's modern portfolio theory, and support vector machine training in machine learning.
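For the equality-constrained case (minimize \tfrac{1}{2} \mathbf{x}^\top Q \mathbf{x} + \mathbf{c}^\top \mathbf{x} subject to E\mathbf{x} = \mathbf{d} alone), the KKT conditions reduce to a single linear system that can be solved directly. The following is a minimal sketch with made-up illustrative matrices, not data from the article:

```python
import numpy as np

# Equality-constrained QP solved via its KKT system:
#   minimize 1/2 x^T Q x + c^T x   subject to   E x = d
# Stationarity (Q x + c + E^T lam = 0) plus feasibility (E x = d)
# give the linear system  [[Q, E^T], [E, 0]] [x; lam] = [-c; d].
Q = np.array([[4.0, 1.0],
              [1.0, 2.0]])          # symmetric positive definite (illustrative)
c = np.array([1.0, 1.0])
E = np.array([[1.0, 1.0]])          # single constraint: x1 + x2 = 1
d = np.array([1.0])

n, m = Q.shape[0], E.shape[0]
K = np.block([[Q, E.T],
              [E, np.zeros((m, m))]])
rhs = np.concatenate([-c, d])
sol = np.linalg.solve(K, rhs)
x, lam = sol[:n], sol[n:]
print(x)        # optimal point [0.25, 0.75]
print(E @ x)    # equals d, so the constraint holds
```

Because Q is positive definite and E has full row rank, the KKT matrix is invertible and this single solve returns the global optimum, illustrating why the convex case is computationally benign.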

Algorithms

Numerous specialized algorithms have been developed, often exploiting the problem's structure. For convex problems, interior point methods, pioneered by researchers like Narendra Karmarkar for linear programming, are highly efficient. The active set method is another classical approach that iteratively identifies binding constraints. For large-scale or sparse problems, conjugate gradient methods and augmented Lagrangian methods are frequently employed. Commercial and open-source solvers like CPLEX, Gurobi, and MOSEK implement these advanced techniques, while the quadratically constrained quadratic program is a more general class solved by extensions of these algorithms. The development of these methods is closely tied to advances in numerical linear algebra and computational complexity theory.
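As a much simpler illustration than the interior point and active set methods named above, projected gradient descent handles box-constrained convex QPs by alternating a gradient step with a projection onto the feasible box. This is a sketch with illustrative data, not the algorithm used by the commercial solvers mentioned:

```python
import numpy as np

# Projected gradient descent for a box-constrained convex QP:
#   minimize 1/2 x^T Q x + c^T x   subject to   lo <= x <= hi
Q = np.array([[3.0, 0.5],
              [0.5, 1.0]])          # symmetric positive definite (illustrative)
c = np.array([-4.0, -1.0])
lo, hi = np.zeros(2), np.ones(2)

x = np.zeros(2)
step = 1.0 / np.linalg.norm(Q, 2)   # step size 1/L, L = largest eigenvalue of Q
for _ in range(500):
    grad = Q @ x + c                # gradient of the quadratic objective
    x = np.clip(x - step * grad, lo, hi)   # gradient step, then project onto the box

print(x)   # converges to the constrained optimum [1.0, 0.5]
```

The unconstrained minimizer lies outside the box here, so the upper bound on the first coordinate is active at the solution; the projection is what enforces it.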

Applications

Applications are pervasive in science, finance, and industry. In financial economics, it is the computational core of mean-variance optimization for asset allocation, a cornerstone of work by Harry Markowitz. Within engineering, it is used for optimal control problems, such as in model predictive control systems prevalent in the automotive industry and chemical process plants. In statistics and machine learning, training support vector machines and performing ridge regression are quintessential examples. Other uses include image processing and signal processing tasks, like filter design, and solving subproblems within general nonlinear programming algorithms like sequential quadratic programming.
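Ridge regression makes the QP connection concrete: the objective ||Xw - y||^2 + \alpha ||w||^2 expands to \tfrac{1}{2} w^\top Q w + c^\top w (plus a constant) with Q = 2(X^\top X + \alpha I) and c = -2 X^\top y. The sketch below, on made-up data, checks that solving the QP stationarity condition reproduces the familiar closed form:

```python
import numpy as np

# Ridge regression written as an unconstrained QP.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))        # illustrative design matrix
y = rng.normal(size=20)
alpha = 0.1

Q = 2 * (X.T @ X + alpha * np.eye(3))
c = -2 * X.T @ y
w_qp = np.linalg.solve(Q, -c)       # stationarity: Q w + c = 0
w_closed = np.linalg.solve(X.T @ X + alpha * np.eye(3), X.T @ y)
print(np.allclose(w_qp, w_closed))  # prints True: both derivations agree
```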

Complexity

For convex problems where the matrix Q is positive semidefinite, the problem is solvable in polynomial time and is considered tractable, with interior point methods typically exhibiting polynomial-time convergence. However, if Q is indefinite (making the problem non-convex), the general problem becomes NP-hard; this hardness holds even when Q has a single negative eigenvalue, and has been proven through reductions from problems like the maximum clique problem. This dichotomy has driven significant research in global optimization and combinatorial optimization to handle the non-convex case, linking the field to challenges in theoretical computer science.
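The convex/non-convex dichotomy can be checked numerically: the problem is convex exactly when every eigenvalue of the symmetric matrix Q is nonnegative. A minimal sketch with two illustrative matrices:

```python
import numpy as np

# Convexity check for a QP objective: Q is positive semidefinite
# (problem tractable) iff all eigenvalues are >= 0.
Q_convex = np.array([[2.0, 0.0],
                     [0.0, 1.0]])   # eigenvalues 2, 1  -> convex
Q_indef  = np.array([[1.0, 0.0],
                     [0.0, -1.0]])  # eigenvalues 1, -1 -> non-convex

print(np.all(np.linalg.eigvalsh(Q_convex) >= 0))  # True: polynomial-time solvable
print(np.all(np.linalg.eigvalsh(Q_indef) >= 0))   # False: NP-hard in general
```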

Extensions and variations

Several important extensions generalize the core model. The quadratically constrained quadratic program allows quadratic constraints, significantly expanding modeling capability at the cost of increased complexity. Second-order cone programming and semidefinite programming are more expressive convex frameworks that encompass it as a special case. Mixed-integer quadratic programming incorporates discrete variables, crucial for applications in facility location and production planning. Sequential quadratic programming is a dominant iterative method for general nonlinear programming. Furthermore, robust optimization and stochastic programming frameworks adapt these models to account for uncertainty in parameters, with applications in risk-averse portfolio optimization under the influence of theorists like R. Tyrrell Rockafellar.
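To make the mixed-integer extension concrete, a purely binary QP can be solved at toy scale by exhaustive enumeration; real MIQP solvers use branch and bound instead, and the data below are illustrative:

```python
import numpy as np
from itertools import product

# Tiny binary QP solved by brute-force enumeration (2^n candidates):
#   minimize 1/2 x^T Q x + c^T x   subject to   x in {0, 1}^3
Q = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])  # illustrative symmetric matrix
c = np.array([-1.0, 0.5, -1.0])

best_val, best_x = np.inf, None
for bits in product([0.0, 1.0], repeat=3):
    x = np.array(bits)
    val = 0.5 * x @ Q @ x + c @ x
    if val < best_val:
        best_val, best_x = val, x

print(best_x, best_val)   # [1. 1. 1.] -0.5
```

Enumeration is exponential in the number of binary variables, which is exactly why the NP-hardness discussed above forces MIQP solvers to prune the search tree rather than visit every candidate.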

Category:Mathematical optimization Category:Convex optimization Category:Operations research