| Mathematical Programming Computation | |
|---|---|
| Name | Mathematical Programming Computation |
| Discipline | Operations research; Applied mathematics; Computer science |
| First appeared | Mid-20th century (linear programming formalized in the 1940s) |
| Notable figures | John von Neumann; George Dantzig; Richard Bellman; Narendra Karmarkar; Leonid Khachiyan |
Mathematical Programming Computation is the computational study of algorithms, software, and numerical methods for solving optimization models arising in industry, science, and government. It connects theoretical advances by John von Neumann, George Dantzig, and Richard Bellman with practical implementations used at organizations such as the National Aeronautics and Space Administration, General Electric, and Goldman Sachs. The field builds on results of Leonid Khachiyan and Narendra Karmarkar while engaging with communities around INFORMS, SIAM, and journals such as Operations Research.
Early computational efforts trace to algorithms inspired by John von Neumann's work and George Dantzig's simplex method for linear programming, with wartime and postwar projects at Los Alamos National Laboratory and the RAND Corporation. Leonid Khachiyan's 1979 polynomial-time result for linear programming built on the ellipsoid method developed by Naum Shor, David Yudin, and Arkadi Nemirovski, while industrial laboratories such as Bell Labs and IBM drove practical solver development. Narendra Karmarkar's 1984 interior-point method spurred implementations at AT&T and research at Princeton University. Subsequent decades saw integration with high-performance computing projects at Argonne National Laboratory, algorithmic complexity studies informed by Richard Karp, and community benchmarking and standardization efforts.
Model classes include linear programming (LP), popularized by George Dantzig; integer programming (IP), whose hardness is connected to Richard Karp's NP-completeness reductions; mixed-integer programming (MIP), the workhorse of industrial optimization; nonlinear programming (NLP), developed alongside Richard Bellman's dynamic programming; convex programming, in the tradition of Leonid Kantorovich; semidefinite programming, with roots in research at Bell Labs and elsewhere; and stochastic programming, introduced by George Dantzig and E. M. L. Beale. Specialized classes include combinatorial optimization problems analyzed by Éva Tardos and network flow problems with roots in work at Bell Labs and the RAND Corporation.
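The LP class can be made concrete with a toy instance. The sketch below uses SciPy's `linprog` (assuming SciPy is available; the objective and constraint data are illustrative, not from any source in this article):

```python
# Toy linear program, illustrative data only:
#   maximize  3x + 2y
#   subject to  x + y <= 4,  x + 3y <= 6,  x, y >= 0
# linprog minimizes, so the objective is negated.
from scipy.optimize import linprog

c = [-3, -2]                      # negated objective coefficients
A_ub = [[1, 1], [1, 3]]           # inequality constraint matrix
b_ub = [4, 6]                     # right-hand sides
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])

print(res.x, -res.fun)            # optimal point (4, 0), objective value 12
```

Checking the vertices of the feasible polygon by hand confirms the solver's answer: (4, 0) gives 12, beating (3, 1) with 11 and (0, 2) with 4.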
Algorithmic paradigms combine the simplex method of George Dantzig with interior-point techniques introduced by Narendra Karmarkar and refined by researchers at Bell Labs and Princeton University. Branch-and-bound, introduced by Ailsa Land and Alison Doig, and branch-and-cut methods build on combinatorial advances linked to Richard Karp, with influential implementations at IBM and elsewhere. Cutting-plane strategies descend from Ralph Gomory's cuts and the polyhedral combinatorics pioneered by Jack Edmonds; primal-dual and augmented Lagrangian methods trace to work by Magnus Hestenes and Michael Powell. Heuristic and metaheuristic approaches draw on John Holland's genetic algorithms and on empirical testing with industrial partners such as Siemens and General Electric.
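The branch-and-bound paradigm can be sketched on a tiny 0/1 knapsack: each node is bounded by the value of the fractional relaxation, and subtrees that cannot beat the incumbent are pruned. This is a minimal illustration of the idea, not any production solver's implementation:

```python
# Branch-and-bound for a 0/1 knapsack, bounding with the fractional relaxation.

def knapsack_bb(values, weights, capacity):
    n = len(values)
    # Process items in decreasing value density so the fractional bound is tight.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)
    best = 0

    def bound(idx, cap):
        # Fractional knapsack value of the remaining items: an upper bound.
        total = 0.0
        for i in order[idx:]:
            if weights[i] <= cap:
                cap -= weights[i]
                total += values[i]
            else:
                total += values[i] * cap / weights[i]
                break
        return total

    def branch(idx, cap, value):
        nonlocal best
        if idx == len(order) or cap == 0:
            best = max(best, value)     # leaf: record incumbent
            return
        if value + bound(idx, cap) <= best:
            return                      # prune: bound cannot beat incumbent
        i = order[idx]
        if weights[i] <= cap:
            branch(idx + 1, cap - weights[i], value + values[i])  # take item i
        branch(idx + 1, cap, value)     # skip item i

    branch(0, capacity, 0)
    return best

print(knapsack_bb([60, 100, 120], [10, 20, 30], 50))  # prints 220
```

The pruning test is what distinguishes branch-and-bound from plain enumeration; on large instances it is the bound quality, not the branching itself, that determines performance.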
Major solvers include commercial systems such as IBM's CPLEX and Gurobi, alongside open-source projects such as GLPK (a GNU package) and the COIN-OR initiative launched at IBM. Modeling languages and environments trace to AMPL, created by Robert Fourer, David M. Gay, and Brian Kernighan at Bell Labs, with standards shaped by workshops at SIAM and INFORMS. High-performance implementations use computing platforms developed at Lawrence Livermore National Laboratory and Argonne National Laboratory, while cloud-based services run on Amazon Web Services and Google infrastructure. Software ecosystems incorporate linear algebra libraries such as BLAS and LAPACK distributed via Netlib, and compiler toolchains from the GNU Project and Microsoft.
Complexity results echo foundational theorems of Leonid Khachiyan and Richard Karp establishing polynomial-time solvability and NP-hardness, while average-case and smoothed analyses build on the smoothed-analysis framework of Daniel Spielman and Shang-Hua Teng. Performance evaluation leverages benchmarking methodologies developed at industrial and national laboratories, with statistical protocols influenced by researchers at Harvard University and Carnegie Mellon University. Parallel and distributed algorithm performance is assessed on architectures from vendors such as Cray and Intel and modeled after scalability frameworks developed at Oak Ridge National Laboratory.
Standard test collections include the Netlib LP suite, MIPLIB for mixed-integer programs, and TSPLIB for traveling-salesman and related instances, with further datasets distributed through COIN-OR. Benchmarking campaigns have been organized at conferences hosted by INFORMS and SIAM and in challenge problems sponsored by NASA and DARPA. Problem generators and instances derive from industrial partners such as Siemens, ABB, and Schneider Electric, and from academic repositories curated by research groups at the University of California, Berkeley and the Massachusetts Institute of Technology.
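TSPLIB instances with `EDGE_WEIGHT_TYPE: EUC_2D` define distances as Euclidean distances rounded to the nearest integer. The sketch below reads the `NODE_COORD_SECTION` of such a file and evaluates a tour; the four-city instance is a made-up illustration, not a file from the real library:

```python
import math

def parse_node_coords(text):
    """Read node coordinates from a TSPLIB-style NODE_COORD_SECTION."""
    coords = {}
    in_section = False
    for line in text.strip().splitlines():
        line = line.strip()
        if line == "NODE_COORD_SECTION":
            in_section = True
        elif line == "EOF":
            break
        elif in_section:
            node, x, y = line.split()
            coords[int(node)] = (float(x), float(y))
    return coords

def euc_2d(a, b):
    # TSPLIB EUC_2D: Euclidean distance rounded to the nearest integer.
    return int(round(math.hypot(a[0] - b[0], a[1] - b[1])))

def tour_length(coords, tour):
    return sum(euc_2d(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

instance = """NAME: toy4
TYPE: TSP
DIMENSION: 4
EDGE_WEIGHT_TYPE: EUC_2D
NODE_COORD_SECTION
1 0 0
2 3 0
3 3 4
4 0 4
EOF"""

coords = parse_node_coords(instance)
print(tour_length(coords, [1, 2, 3, 4]))  # rectangle tour, length 14
```

The integer rounding matters: published optimal tour lengths for TSPLIB instances assume it, so a solver using exact Euclidean distances will report slightly different values.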
Applications span supply-chain optimization at Walmart and Amazon, portfolio optimization practiced at Goldman Sachs and Morgan Stanley, energy-system planning involving Shell and ExxonMobil, and transportation scheduling studied with input from the Federal Aviation Administration and Deutsche Bahn. Case studies cover aerospace projects at NASA and SpaceX, logistics solutions deployed at UPS and DHL, and healthcare resource allocation explored at the Mayo Clinic and Johns Hopkins Hospital.
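Portfolio optimization illustrates how such applications reduce to mathematical programs: minimizing portfolio variance subject to fully investing reduces, for two assets, to a closed-form special case of the Markowitz quadratic program. The covariance numbers below are illustrative only:

```python
# Minimum-variance split between two assets:
#   minimize  w^T S w   subject to  w1 + w2 = 1
# For two assets this quadratic program has a closed-form solution.

def min_variance_weights(var1, var2, cov12):
    w1 = (var2 - cov12) / (var1 + var2 - 2 * cov12)
    return w1, 1.0 - w1

# Illustrative annualized variances and covariance.
w1, w2 = min_variance_weights(0.04, 0.09, 0.006)
print(w1, w2)  # lower-variance asset gets the larger weight
```

With more assets, or with expected-return targets and no-short-selling bounds, the closed form disappears and the problem is handed to a quadratic programming solver, which is where the solver technology described above enters.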
Category:Optimization