LLMpedia: The first transparent, open encyclopedia generated by LLMs

AKS primality test

Generated by DeepSeek V3.2
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Expansion Funnel Raw 62 → Dedup 0 → NER 0 → Enqueued 0
AKS primality test
Name: AKS primality test
Class: Primality test
Data: Integer
Time: Õ(n⁶)
Space: Õ(n³)
Authors: Manindra Agrawal, Neeraj Kayal, Nitin Saxena
Year: 2002 (preprint); 2004 (journal publication)
Journal: Annals of Mathematics

AKS primality test. The AKS primality test is a deterministic algorithm for determining whether a given integer is a prime number. Devised by the computer scientists Manindra Agrawal, Neeraj Kayal, and Nitin Saxena, it was announced in the 2002 preprint "PRIMES is in P" and published in the Annals of Mathematics in 2004. It was the first primality test proven to be simultaneously general, deterministic, unconditional, and polynomial-time. This breakthrough resolved a long-standing open problem in computational number theory and theoretical computer science, demonstrating that primality testing is in the complexity class P.

Overview

The algorithm operates by checking a specific condition based on modular arithmetic in polynomial rings. Unlike probabilistic tests such as the Miller–Rabin primality test or the Solovay–Strassen primality test, the AKS test always returns a provably correct answer without relying on unproven hypotheses such as the generalized Riemann hypothesis. Its core mathematical insight is a generalization of Fermat's little theorem to polynomials: an integer n > 1 is prime if and only if the congruence (x + a)^n ≡ x^n + a (mod n) holds in the polynomial ring Z[x] for every integer a coprime to n. The AKS test makes this characterization efficient by checking the congruence modulo a carefully chosen polynomial x^r − 1.
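The polynomial criterion can be verified directly for small n by expanding (x + a)^n coefficient by coefficient: the interior binomial coefficients must all vanish modulo n, and the constant term must reduce to a. The sketch below (the function name is ours, not from the paper) is exponential-time and purely illustrative; it is the unoptimized characterization, not the AKS algorithm itself:

```python
from math import comb

def poly_congruence_holds(n: int, a: int = 1) -> bool:
    """Check (x + a)^n ≡ x^n + a (mod n) by expanding the binomial.

    The coefficient of x^k in (x + a)^n is C(n, k) * a^(n - k); for the
    congruence to hold, every interior coefficient (0 < k < n) must be
    0 mod n, and the constant term a^n must equal a mod n.
    Exponential in n -- illustration only.
    """
    for k in range(1, n):
        if comb(n, k) * pow(a, n - k, n) % n != 0:
            return False
    return pow(a, n, n) == a % n

print([m for m in range(2, 20) if poly_congruence_holds(m)])
# → [2, 3, 5, 7, 11, 13, 17, 19]
```

With a = 1 this reduces to the classical fact that n is prime exactly when n divides every binomial coefficient C(n, k) for 0 < k < n.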

History and development

The quest for an efficient, general-purpose deterministic primality test was a major goal throughout the late 20th century. Prior to 2002, algorithms such as the Adleman–Pomerance–Rumely primality test and elliptic curve primality proving were deterministic or practically fast, but none was proven to run in polynomial time for all inputs. The development took place at the Indian Institute of Technology Kanpur, where Agrawal, then a professor, supervised the undergraduate research of Kayal and Saxena. Their manuscript, famously posted online in August 2002, attracted immediate worldwide attention among algorithm designers and number theorists and was rapidly verified by experts in both communities.

Algorithm

Given an input integer n > 1, the algorithm proceeds through a series of deterministic steps:
1. If n = a^b for integers a > 1 and b > 1, output composite.
2. Find the smallest r such that the multiplicative order of n modulo r exceeds (log₂ n)².
3. If 1 < gcd(a, n) < n for some a ≤ r, output composite.
4. If n ≤ r, output prime.
5. For each a from 1 to ⌊√φ(r) · log₂ n⌋, where φ is Euler's totient function, check the congruence (x + a)^n ≡ x^n + a in the ring (Z/nZ)[x]/(x^r − 1); if any check fails, output composite.
6. Output prime.
Unlike earlier methods such as the Sieve of Eratosthenes, this procedure involves no exhaustive search over potential divisors.
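The steps above can be sketched in Python as follows. This is a minimal, unoptimized illustration (all helper names are ours): it uses an integer square root for ⌊√φ(r)⌋ and schoolbook polynomial arithmetic in (Z/nZ)[x]/(x^r − 1), so it is only practical for small inputs:

```python
from math import gcd, log2, isqrt, floor

def is_perfect_power(n: int) -> bool:
    # Step 1: does n = a^b for some a > 1, b > 1?
    for b in range(2, n.bit_length() + 1):
        a = round(n ** (1.0 / b))
        if any(c > 1 and c ** b == n for c in (a - 1, a, a + 1)):
            return True
    return False

def multiplicative_order(n: int, r: int) -> int:
    # Order of n in (Z/rZ)*, or 0 if gcd(n, r) > 1.
    if gcd(n, r) != 1:
        return 0
    k, x = 1, n % r
    while x != 1:
        x, k = x * n % r, k + 1
    return k

def totient(r: int) -> int:
    # Euler's phi via trial-division factorization.
    result, m, p = r, r, 2
    while p * p <= m:
        if m % p == 0:
            while m % p == 0:
                m //= p
            result -= result // p
        p += 1
    if m > 1:
        result -= result // m
    return result

def polymul(p, q, r, n):
    # Product in (Z/nZ)[x] / (x^r - 1): exponents wrap around mod r.
    res = [0] * r
    for i, pi in enumerate(p):
        if pi:
            for j, qj in enumerate(q):
                res[(i + j) % r] = (res[(i + j) % r] + pi * qj) % n
    return res

def polypow(base, e, r, n):
    # Square-and-multiply exponentiation in (Z/nZ)[x] / (x^r - 1).
    result = [1] + [0] * (r - 1)
    while e:
        if e & 1:
            result = polymul(result, base, r, n)
        base = polymul(base, base, r, n)
        e >>= 1
    return result

def aks_is_prime(n: int) -> bool:
    if n < 2:
        return False
    if is_perfect_power(n):                        # step 1
        return False
    bound = log2(n) ** 2                           # step 2: smallest r with
    r = 2                                          # ord_r(n) > (log2 n)^2
    while multiplicative_order(n, r) <= bound:
        r += 1
    for a in range(2, min(r, n - 1) + 1):          # step 3: small-factor check
        if 1 < gcd(a, n) < n:
            return False
    if n <= r:                                     # step 4
        return True
    limit = floor(isqrt(totient(r)) * log2(n))     # step 5 bound
    x_pow_n = [0] * r
    x_pow_n[n % r] = 1                             # x^n reduced mod x^r - 1
    for a in range(1, limit + 1):
        lhs = polypow([a % n, 1] + [0] * (r - 2), n, r, n)
        rhs = x_pow_n.copy()
        rhs[0] = (rhs[0] + a) % n
        if lhs != rhs:
            return False
    return True                                    # step 6

print([m for m in range(2, 40) if aks_is_prime(m)])
```

Production systems do not use this procedure; they rely on fast probabilistic tests, with AKS serving as the theoretical guarantee that a deterministic polynomial-time alternative exists.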

Proof of correctness

The proof establishes that the algorithm's conditions are satisfied if and only if n is prime. The central step demonstrates that if the polynomial congruences hold for the selected parameter r and all tested values of a, then n must be a prime power; the initial perfect-power check then rules out proper powers, and the gcd checks eliminate the remaining composites. The argument analyzes a group of residues generated by the polynomials x + a in a finite field, bounding its size from above and below to derive a contradiction when n is composite. Subsequent simplifications by researchers such as Carl Pomerance and Hendrik Lenstra streamlined the argument and removed the original manuscript's reliance on deeper analytic results about the distribution of primes.

Complexity and efficiency

The original analysis proved the algorithm runs in Õ(n¹²) time, where n is the input size in bits and Õ suppresses polylogarithmic factors. Subsequent improvements, notably the variant of Hendrik Lenstra and Carl Pomerance, together with faster algorithms for polynomial multiplication, reduced the bound to Õ(n⁶). While this is polynomial, it is vastly slower in practice than the near-linear heuristic performance of the Miller–Rabin primality test or the sophisticated elliptic curve primality proving method. The test's primary significance therefore remains theoretical, settling the complexity-class membership of primality testing.
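A back-of-envelope count illustrates the gap. The figures below ignore the polylogarithmic factors hidden in Õ and assume roughly Õ(n²) bit operations per Miller–Rabin round with fast multiplication, so they are order-of-magnitude only:

```python
bits = 2048                        # a typical RSA prime-candidate size
aks_ops = bits ** 6                # Õ(n^6) AKS bound, constants ignored
mr_round_ops = bits ** 2           # rough cost of one Miller–Rabin round
print(f"AKS:          ~{aks_ops:.1e} bit operations")
print(f"Miller-Rabin: ~{mr_round_ops:.1e} bit operations per round")
print(f"ratio:        ~{aks_ops / mr_round_ops:.1e}")
```

Even after dozens of Miller–Rabin rounds, the probabilistic approach remains many orders of magnitude cheaper, which is why AKS is not used for cryptographic key generation.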

Significance and impact

The discovery was a landmark event in theoretical computer science, earning its authors the 2006 Gödel Prize and the 2006 Fulkerson Prize and carrying profound implications for discrete mathematics and algorithm design. The result raised the profile of the Indian Institute of Technology Kanpur as a center for research in computational complexity and inspired new avenues in the study of pseudoprimes and derandomization. Although practical systems such as RSA continue to use faster probabilistic tests for key generation, the AKS test stands as a definitive answer to the problem of recognizing primes, a question studied since antiquity by figures from Euclid through Pierre de Fermat and Leonhard Euler.

Category:Primality tests Category:Deterministic algorithms Category:Computational number theory Category:Theoretical computer science