| Lin–Kernighan heuristic | |
|---|---|
| Name | Lin–Kernighan heuristic |
| Inventors | Shen Lin, Brian W. Kernighan |
| Introduced | 1973 |
| Problem | Traveling Salesman Problem |
| Type | Heuristic, Local search |
| Implementation | C, C++, Fortran, Java, Python |
The Lin–Kernighan heuristic is a variable-depth local search heuristic for the Traveling Salesman Problem, introduced by Shen Lin and Brian W. Kernighan in 1973. It generalizes the fixed k-opt moves used in combinatorial optimization and has been influential in the design of practical solvers for the large instances encountered in operations research, computer science, and industrial routing. The method influenced later work by researchers at institutions such as Bell Labs, Carnegie Mellon University, and the Massachusetts Institute of Technology.
The heuristic was published during an era marked by advances at Bell Laboratories and overlapping research at Princeton University and Stanford University on heuristic methods for NP-hard problems such as the Traveling Salesman Problem and related questions in graph theory. Early implementations were compared with the methods of Christofides and of Held and Karp, and with branch-and-bound and cutting-plane algorithms in the tradition of Little and Dantzig, while subsequent enhancements drew on contributions from researchers affiliated with IBM Research, AT&T Bell Labs, and McGill University. Throughout the 1980s and 1990s, the technique was refined alongside contributions from groups at the University of Waterloo, the University of California, Berkeley, and the University of Chicago, leading to variants used in academic competitions such as the DIMACS Implementation Challenges and benchmark suites maintained by TSPLIB curators.
The method extends the concept of k-opt exchanges by selecting a sequence of edges to remove and replace, guided by gain-based criteria inspired by work at Bell Labs and by analysis comparable to insights from Donald Knuth and Jon Kleinberg. At each step the algorithm constructs alternating chains of removed and added edges, evaluates the cumulative gain using techniques akin to those developed by Jack Edmonds in matching (graph theory), and accepts sequences that yield a positive net decrease in tour length. The variable-depth strategy permits the algorithm to escape local optima that defeat fixed-k-opt algorithms studied at IBM Research and MIT Lincoln Laboratory. Researchers at Carnegie Mellon University and the University of Toronto contributed to formalizing the selection rules and candidate lists that determine admissible edge exchanges in practical implementations.
Practical implementations leverage data structures and heuristics influenced by software engineering practices at AT&T, Microsoft Research, and Google: nearest-neighbor lists, candidate sets derived from Delaunay triangulations produced by teams at ETH Zurich and Université de Paris, and sparse graph representations used by groups at Los Alamos National Laboratory and Lawrence Livermore National Laboratory. Variants include deterministic and randomized versions, implementations that hybridize with metaheuristics such as simulated annealing or genetic algorithms studied at University College London and INRIA, and constrained adaptations used in projects at NASA and Siemens. Notable derivative algorithms include the approach of Keld Helsgaun and enhancements incorporating candidate selection methods developed at the University of New South Wales and the University of Melbourne.
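The nearest-neighbor candidate lists mentioned above restrict which edges the search may add, which is what makes large instances tractable in practice. A minimal sketch of their construction, assuming Euclidean city coordinates and a simple O(n² log n) sort (production solvers use k-d trees or Delaunay neighbors instead):

```python
import math

def candidate_lists(points, k=5):
    """For each city, return the indices of its k nearest other cities,
    sorted by increasing Euclidean distance."""
    n = len(points)
    cand = []
    for i in range(n):
        others = sorted((j for j in range(n) if j != i),
                        key=lambda j: math.dist(points[i], points[j]))
        cand.append(others[:k])
    return cand
```

During the search, a chain-extension edge (t2, t3) is then only considered when t3 appears in t2's candidate list, shrinking the inner loop from all n cities to a small constant.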
In the worst case the variable-depth search can require exponential exploration, echoing the NP-hardness analyses of Richard Karp and of Garey and Johnson, while practical run time depends on the candidate-list sizes and stopping criteria adopted in software from Bell Labs and MIT. Performance comparisons with approximation algorithms such as the Christofides algorithm, and with exact branch-and-cut methods implemented by teams at the Zuse Institute Berlin, show that Lin–Kernighan-based solvers often achieve near-optimal solutions in manageable time on the large instances encountered by practitioners at FedEx and UPS. The heuristic's empirical efficacy motivated complexity-theoretic discussions by researchers at Princeton University and Harvard University on the landscape of combinatorial optimization.
The heuristic has been applied in domains researched at Daimler, Toyota Motor Corporation, and General Electric for vehicle routing and logistics; used by Siemens and ABB in industrial scheduling; and integrated into geographic information systems developed by Esri and Google Maps. It has influenced solutions in computational biology explored at the Broad Institute and the European Bioinformatics Institute for sequencing problems, and in circuit design workflows at Intel and AMD. In academia it has been applied to network design and robotics path planning in work at the University of California, San Diego and the University of Illinois Urbana-Champaign.
Empirical assessments were organized through initiatives such as the DIMACS Implementation Challenges and benchmark suites like TSPLIB, curated by groups at the University of Waterloo and MIT. Comparative studies by teams at INRIA, ETH Zurich, and the University of Edinburgh have measured solution quality and runtime against Concorde (TSP solver), the simulated annealing metaheuristic published by Kirkpatrick, Gelatt, and Vecchi, and exact solvers developed at Rutgers University and the Georgia Institute of Technology. Results consistently show that Lin–Kernighan implementations, including those by Keld Helsgaun and open-source projects hosted on GitHub, deliver state-of-the-art solutions on the large benchmark instances used by industry partners such as Amazon and logistics groups at UPS.
Category:Heuristics