| P versus NP problem | |
|---|---|
| Name | P versus NP problem |
| Field | Computational complexity theory |
| Conjectured by | Stephen Cook, Leonid Levin |
| Year | 1971 |
| Equivalent to | Polynomial-time solvability of the Boolean satisfiability problem |
| Consequences | Cryptography, Operations research, Artificial intelligence |
The P versus NP problem is a major unsolved question in theoretical computer science and one of the seven Millennium Prize Problems selected by the Clay Mathematics Institute. Informally, it asks whether every problem whose solution can be verified quickly by a computer can also be solved quickly. A resolution would have profound consequences for mathematics, cryptography, and our understanding of computation.
The class P contains all decision problems that can be solved by a deterministic Turing machine in polynomial time. The class NP contains problems for which a proposed solution, called a certificate, can be verified in polynomial time. The P versus NP problem asks whether these two classes are identical, that is, whether P = NP. A classic example of an NP-complete problem, one that is in NP and is at least as hard as every problem in NP, is the Boolean satisfiability problem.
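The gap between verifying and solving can be made concrete with the Boolean satisfiability problem. The sketch below (a minimal illustration on a hypothetical three-variable formula, not a practical SAT solver) checks a proposed assignment in time linear in the size of the formula, while the only obvious way to *find* a satisfying assignment is to try all 2^n candidates:

```python
from itertools import product

# A CNF formula as clauses of integer literals: i means variable i,
# -i means its negation. Hypothetical instance:
# (x1 OR NOT x2) AND (NOT x1 OR x3) AND (x2 OR NOT x3)
FORMULA = [[1, -2], [-1, 3], [2, -3]]

def verify(formula, assignment):
    """Polynomial-time check: does the assignment satisfy every clause?"""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in formula
    )

def brute_force(formula, n):
    """Exponential-time search over all 2**n candidate assignments."""
    for bits in product([False, True], repeat=n):
        candidate = {i + 1: b for i, b in enumerate(bits)}
        if verify(formula, candidate):
            return candidate
    return None

solution = brute_force(FORMULA, 3)
```

`verify` runs in time polynomial in the formula size for any certificate, which is exactly what membership in NP requires; `brute_force` illustrates why no efficient *solving* algorithm is known.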
Proving P = NP constructively would imply that many difficult problems, from protein folding to logistics planning, have efficient solutions, revolutionizing fields like operations research, artificial intelligence, and biology. Conversely, a proof that P ≠ NP would confirm that for many practical problems no efficient algorithm exists, providing a fundamental justification for heuristic methods and approximation algorithms. The security of much of modern cryptography, including systems like RSA, depends on problems in NP, such as integer factorization, having no efficient algorithm; P ≠ NP is therefore a necessary, though not by itself sufficient, condition for the security of these systems.
Formally, a language L is in P if there exists a deterministic Turing machine M and a polynomial p such that M decides L in time O(p(n)). A language L is in NP if there exists a nondeterministic Turing machine N and a polynomial q such that N accepts L in time O(q(n)). Equivalently, L is in NP if there exists a polynomial-time verifier Turing machine V and a polynomial r such that a string x is in L if and only if some certificate y with |y| ≤ r(|x|) exists for which V accepts the pair (x, y). The concept of polynomial-time reduction is used to define NP-hardness.
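The verifier-based definition can be sketched for the subset sum problem, where an instance is a list of numbers and a target, and a certificate is simply a list of indices of a claimed subset (the function names here are illustrative, not from any standard library). The certificate's length is at most the number of input items, so the polynomial bound |y| ≤ r(|x|) holds automatically:

```python
def verify_subset_sum(nums, target, certificate):
    """A polynomial-time verifier V for subset sum.

    certificate: a list of distinct indices into nums claiming
    a subset that sums to target.
    """
    if len(set(certificate)) != len(certificate):
        return False  # indices must be distinct
    if not all(0 <= i < len(nums) for i in certificate):
        return False  # indices must be in range
    return sum(nums[i] for i in certificate) == target

# The instance (nums, target) is in the language exactly when
# SOME certificate is accepted; e.g. indices [2, 4] pick 4 + 5 = 9.
accepted = verify_subset_sum([3, 34, 4, 12, 5, 2], 9, [2, 4])
```

Each check runs in time polynomial in the input size, matching the role of V in the definition above; what is hard is finding an accepted certificate, not checking one.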
The problem was formally articulated in 1971 by Stephen Cook in his seminal paper "The Complexity of Theorem-Proving Procedures" and independently by Leonid Levin in the Soviet Union. Cook introduced the notion of NP-completeness by proving the Cook–Levin theorem, which states that the Boolean satisfiability problem is NP-complete. This was soon followed by Richard Karp's 1972 paper "Reducibility Among Combinatorial Problems," which identified 21 other NP-complete problems. Despite intense effort by researchers like Michael Sipser, László Babai, and Scott Aaronson, and the offering of a $1 million prize by the Clay Mathematics Institute, the problem remains open.
Research has yielded important insights into the structure of complexity classes. The question P = NP is equivalent to the existence of a polynomial-time algorithm for any single NP-complete problem, such as the traveling salesman problem or the subset sum problem. The circuit complexity approach has produced barrier results, notably the natural proofs theorem of Alexander Razborov and Steven Rudich, which shows that a broad class of combinatorial proof techniques cannot separate P from NP under standard cryptographic assumptions. If P = NP, the polynomial hierarchy, introduced in work by Albert Meyer and Larry Stockmeyer, would collapse entirely. Furthermore, results in descriptive complexity by scholars like Ronald Fagin have shown that NP corresponds precisely to the class of problems expressible in existential second-order logic.
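Polynomial-time reductions, the glue behind these equivalence results, can be sketched with the classic reduction from Independent Set to Vertex Cover: a graph G with n vertices has an independent set of size k if and only if it has a vertex cover of size n − k, because a set is independent exactly when its complement covers every edge. A minimal sketch (the helper names are illustrative):

```python
def reduce_indset_to_vertexcover(n_vertices, edges, k):
    """Map an Independent-Set instance (G, k) to a Vertex-Cover
    instance (G, n - k); the mapping itself takes polynomial time."""
    return n_vertices, edges, n_vertices - k

def is_independent_set(edges, subset):
    """No edge may have both endpoints inside the subset."""
    return not any(u in subset and v in subset for u, v in edges)

def is_vertex_cover(edges, cover):
    """Every edge must have at least one endpoint in the cover."""
    return all(u in cover or v in cover for u, v in edges)

# On the path graph 0-1-2-3, {0, 2} is independent and its
# complement {1, 3} is a vertex cover, as the reduction predicts.
edges = [(0, 1), (1, 2), (2, 3)]
independent = {0, 2}
complement = {1, 3}
```

Since the mapping is computable in polynomial time and preserves yes/no answers, an efficient algorithm for Vertex Cover would yield one for Independent Set, which is the sense in which NP-complete problems stand or fall together.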
The P versus NP problem has permeated popular culture, often symbolizing the ultimate intellectual challenge. It was featured in an episode of the CBS series *Elementary*, in which a character sought to solve it. The problem is a central plot device in the 2012 film *Travelling Salesman*. References also appear in the webcomic *xkcd* by Randall Munroe and in the book *The Millennium Problems* by Keith Devlin. The concept features frequently in media discussions of artificial intelligence, such as articles in *Wired*.
Category:Computational complexity theory Category:Unsolved problems in computer science Category:Millennium Prize Problems