LLMpedia: The first transparent, open encyclopedia generated by LLMs

Max Cut

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Parent: Sanjeev Arora (Hop 5)
Expansion Funnel: Raw 40 → Dedup 0 → NER 0 → Enqueued 0
Max Cut
Name: Max Cut
Problem: Combinatorial optimization
Input: Graph
Output: Cut (partition of vertices)
Complexity: NP-hard

Max Cut is a combinatorial optimization problem that asks for a partition of the vertex set of a graph into two parts that maximizes the number (or total weight) of edges between the parts. Originating in graph theory and theoretical computer science, the problem connects to graph algorithms, complexity theory, and optimization, and has motivated developments in approximation algorithms, semidefinite programming, and probabilistic methods.

Definition and formulation

The problem is defined on an undirected graph G = (V, E) (possibly with an edge-weight function w), where the goal is to find a subset S ⊆ V maximizing the total weight of edges with one endpoint in S and the other in V \ S, i.e. cut(S) = Σ_{{u,v} ∈ E, u ∈ S, v ∉ S} w(u, v). A standard integer quadratic formulation assigns each vertex i a variable x_i ∈ {−1, +1} and maximizes (1/2) Σ_{{i,j} ∈ E} w_ij (1 − x_i x_j); equivalent expressions use the adjacency or Laplacian matrix, and these formulations underlie the relaxations used in algorithmic techniques by researchers associated with Princeton University, MIT, Bell Labs, and institutions where work on combinatorial optimization flourished. Equivalent formulations arise in spin glass models studied by groups linked to Los Alamos National Laboratory and IBM Research.
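As a minimal illustration of the definition above, the cut value and an exhaustive solver can be sketched in Python (the triangle graph and edge-list representation here are illustrative assumptions, and brute force is only feasible for tiny instances):

```python
from itertools import combinations

def cut_value(edges, S):
    """Total weight of edges with exactly one endpoint in S."""
    return sum(w for u, v, w in edges if (u in S) != (v in S))

def max_cut_brute_force(vertices, edges):
    """Try every subset of vertices; exponential time, fine for tiny graphs."""
    best_value, best_side = 0, set()
    vertices = list(vertices)
    for r in range(len(vertices) + 1):
        for subset in combinations(vertices, r):
            S = set(subset)
            value = cut_value(edges, S)
            if value > best_value:
                best_value, best_side = value, S
    return best_value, best_side

# Triangle with unit weights: any bipartition cuts at most 2 of the 3 edges.
edges = [(0, 1, 1), (1, 2, 1), (0, 2, 1)]
print(max_cut_brute_force({0, 1, 2}, edges))  # → (2, {0})
```

The exponential enumeration reflects the problem's hardness; the approximation algorithms discussed below exist precisely because this search is infeasible on large graphs.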

Complexity and computational hardness

Max Cut is NP-hard and its decision variant is NP-complete; it appears among Karp's 21 NP-complete problems, placing it in the complexity-theoretic landscape developed by figures at Princeton University and the University of California, Berkeley. The problem is APX-hard, so it admits no polynomial-time approximation scheme unless P = NP, and under the Unique Games Conjecture advanced by researchers at New York University and the University of Toronto, the Goemans–Williamson approximation ratio is optimal, a characterization due to scholars connected to Microsoft Research and Google Research. Hardness-of-approximation results were established in work associated with investigators from Stanford University and ETH Zurich, and reductions often employ gadgets and PCP techniques from groups at Carnegie Mellon University and the University of Chicago.

Exact and approximation algorithms

Exact exponential-time algorithms and branch-and-bound methods have been developed in algorithmic research at INRIA, Bell Labs, and university groups such as the California Institute of Technology and Columbia University. For approximation, polynomial-time heuristics such as greedy and local-search cuts, which already guarantee a cut of at least half the total edge weight, as well as spectral partitioning, were advanced in research at the University of Cambridge, Harvard University, and industrial labs including AT&T Labs. The seminal approximation algorithm with a better-than-half guarantee, due to Goemans and Williamson, combines semidefinite programming with randomized rounding. Improvements and heuristic adaptations have been pursued by researchers at Yahoo! Research and Facebook AI Research for large-scale instances, and parameterized complexity approaches were developed by investigators associated with the University of Edinburgh and the University of Tokyo.
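The local-search heuristic can be sketched as follows; the specific move rule (flip a single vertex whenever doing so strictly increases the cut) is the standard single-exchange variant, assumed here for illustration. At a local optimum each vertex has at least half its incident weight crossing the cut, so the result is at least half the total edge weight:

```python
def local_search_max_cut(vertices, edges):
    """Move single vertices across the cut while any move strictly helps.
    A local optimum cuts at least half of the total edge weight."""
    adj = {v: [] for v in vertices}
    for u, v, w in edges:
        adj[u].append((v, w))
        adj[v].append((u, w))
    side = {v: False for v in vertices}  # start with everything on one side
    improved = True
    while improved:
        improved = False
        for v in vertices:
            # net weight gained by moving v to the other side
            gain = sum(w if side[n] == side[v] else -w for n, w in adj[v])
            if gain > 0:
                side[v] = not side[v]
                improved = True
    S = {v for v in vertices if side[v]}
    value = sum(w for u, v, w in edges if (u in S) != (v in S))
    return S, value

# 5-cycle with unit weights: the optimum is 4; at least 2.5 is guaranteed.
edges = [(0, 1, 1), (1, 2, 1), (2, 3, 1), (3, 4, 1), (4, 0, 1)]
print(local_search_max_cut(range(5), edges))  # → ({0, 2}, 4)
```

With integer weights each improving move raises the cut value by at least 1, so the loop terminates; the number of moves is bounded by the total edge weight.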

Semidefinite programming and Goemans–Williamson algorithm

A landmark result is the semidefinite programming (SDP) relaxation with randomized hyperplane rounding introduced by Goemans and Williamson at MIT, which yields an approximation ratio of approximately 0.878. The SDP approach builds on convex optimization frameworks advanced at Stanford University and the University of Washington and relates to interior-point methods popularized by researchers at IBM Research and Bell Labs. Subsequent analyses and stability results were produced by theoreticians affiliated with Cornell University and Columbia University, while connections to approximation resistance and integrality gaps were explored by groups at Yale University and the University of Illinois Urbana–Champaign.
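The rounding step can be sketched without an SDP solver by assuming the unit vectors are already given; in the actual algorithm they are the optimal SDP solution, and a single random hyperplane achieves the ≈0.878 guarantee in expectation (taking the best of several draws, as below, is a common practical variant):

```python
import math
import random

def hyperplane_round(vectors, edges, trials=100, seed=0):
    """Goemans–Williamson-style rounding: split vertices by the sign of
    their dot product with a random Gaussian direction, keeping the best
    of several draws. `vectors` maps each vertex to a unit vector; in the
    real algorithm these come from solving the Max Cut SDP (assumed given)."""
    rng = random.Random(seed)
    dim = len(next(iter(vectors.values())))
    best_S, best_val = set(), -1
    for _ in range(trials):
        r = [rng.gauss(0, 1) for _ in range(dim)]  # random normal direction
        S = {v for v, x in vectors.items()
             if sum(a * b for a, b in zip(x, r)) >= 0}
        val = sum(w for u, v, w in edges if (u in S) != (v in S))
        if val > best_val:
            best_S, best_val = S, val
    return best_S, best_val

# Triangle: the SDP optimum places the three vectors 120 degrees apart,
# and any hyperplane through the origin then cuts 2 of the 3 edges.
vecs = {i: (math.cos(2 * math.pi * i / 3), math.sin(2 * math.pi * i / 3))
        for i in range(3)}
edges = [(0, 1, 1), (1, 2, 1), (0, 2, 1)]
print(hyperplane_round(vecs, edges))  # best cut value is 2, the optimum
```

The analysis rests on the fact that an edge whose vectors span angle θ is cut with probability θ/π, and minimizing θ/π over (1 − cos θ)/2 gives the ≈0.878 constant.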

Special graph classes and properties

For particular graph families studied by combinatorialists at Oxford University and Princeton University, the Max Cut problem admits polynomial-time solutions or tighter bounds: bipartite graphs trivialize the problem (every edge can be cut), Max Cut on planar graphs is solvable in polynomial time via planar duality and matching techniques connected to work at ETH Zurich and École Polytechnique, and bounded-treewidth graphs are amenable to dynamic programming techniques from researchers at the University of California, Santa Barbara and Duke University. Regular graphs, Cayley graphs, and expanders, objects of interest at the Institute for Advanced Study and the Russian Academy of Sciences, have special spectral properties that inform approximation guarantees via eigenvalue methods linked to the University of Bonn and Tel Aviv University.
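For the bipartite case, a BFS 2-coloring exhibits the trivial optimum directly, since the two color classes form a cut containing every edge (the graph encoding below is an illustrative assumption):

```python
from collections import deque

def bipartite_max_cut(vertices, edges):
    """If the graph is bipartite, a BFS 2-coloring puts every edge across
    the cut, so the maximum cut equals the total edge weight.
    Returns (S, cut_weight), or None if the graph is not bipartite."""
    adj = {v: [] for v in vertices}
    for u, v, w in edges:
        adj[u].append(v)
        adj[v].append(u)
    color = {}
    for start in vertices:
        if start in color:
            continue
        color[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in color:
                    color[v] = 1 - color[u]
                    queue.append(v)
                elif color[v] == color[u]:
                    return None  # odd cycle: not bipartite
    S = {v for v in vertices if color[v] == 0}
    return S, sum(w for _, _, w in edges)

# A 4-cycle is bipartite: all 4 edges can be cut.
print(bipartite_max_cut(range(4), [(0, 1, 1), (1, 2, 1), (2, 3, 1), (3, 0, 1)]))
# → ({0, 2}, 4)
```

On a non-bipartite input (for example a triangle) the odd cycle is detected and the function falls back to returning None, signaling that the general, NP-hard case applies.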

Applications and practical uses

Max Cut formulations appear in statistical physics models studied at Los Alamos National Laboratory and University of Illinois Urbana–Champaign, in VLSI design and circuit partitioning problems addressed at Bell Labs and Intel Corporation, and in image segmentation and computer vision work from groups at Carnegie Mellon University and University of Toronto. In machine learning and data mining, cut-based objectives are used for clustering and community detection in networks examined by teams at Facebook and Google, while financial portfolio diversification problems and communication network design have leveraged cut formulations in collaborations involving Goldman Sachs and AT&T Labs.