| Goldbach conjecture | |
|---|---|
| Name | Goldbach conjecture |
| Caption | Christian Goldbach |
| Field | Number theory |
| Proposed | 1742 |
| Conjectured by | Christian Goldbach |
| Status | Open problem |
The Goldbach conjecture is a central unsolved problem in number theory asserting that every even integer greater than 2 is the sum of two prime numbers. The conjecture has motivated work across mathematics, including analytic number theory, additive combinatorics, and computational number theory, and connects with results of Leonhard Euler, G. H. Hardy, John Littlewood, and Ivan Vinogradov. Approaches draw on the prime number theorem, sieve theory, and Fourier analysis, together with extensive large-scale computation.
The conjecture states that every even integer n ≥ 4 can be expressed as p + q, where p and q are primes. The "strong" (binary) form is the statement above; the "weak" (ternary) form asserts that every odd integer greater than 5 is the sum of three primes. The statement ties the primes and the even numbers to distribution results exemplified by the prime number theorem and the Riemann zeta function. Equivalent formulations and quantitative refinements involve representation-counting functions and related sums studied in additive number theory and probabilistic number theory.
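As a hedged illustration of the strong form, the following Python sketch finds a prime pair for a given even n. The helper names (`is_prime`, `goldbach_pair`) are invented for this example, and trial division is used only because the inputs are small.

```python
def is_prime(k: int) -> bool:
    """Trial-division primality test; adequate for small k."""
    if k < 2:
        return False
    if k % 2 == 0:
        return k == 2
    d = 3
    while d * d <= k:
        if k % d == 0:
            return False
        d += 2
    return True

def goldbach_pair(n: int) -> tuple[int, int]:
    """Return primes (p, q) with p + q = n, for even n >= 4."""
    if n < 4 or n % 2:
        raise ValueError("n must be an even integer >= 4")
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return p, n - p
    raise AssertionError(f"counterexample to Goldbach at {n}!")

print(goldbach_pair(28))   # (5, 23): the pair with the smallest prime
print(goldbach_pair(100))  # (3, 97)
```

Scanning p upward returns the decomposition with the smallest prime; any pair found confirms the conjecture for that n.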
The conjecture originated in a 1742 letter from Christian Goldbach to Leonhard Euler proposing that every integer greater than 2 is a sum of three primes; Euler reformulated this as the two-prime version for even numbers. Subsequent centuries saw contributions from mathematicians associated with institutions such as the Royal Society, the Prussian Academy of Sciences, and the University of Göttingen. Notable milestones include the analytic techniques of Dirichlet and Riemann, the circle method developed by G. H. Hardy and John Littlewood, and modern advances by figures such as Ivan Vinogradov, R. C. Vaughan, D. A. Goldston, and Yitang Zhang, with support from national funding agencies and research institutes.
Vinogradov proved in 1937 that every sufficiently large odd number is the sum of three primes (Vinogradov's theorem), using the Hardy–Littlewood circle method. Conditional results assuming hypotheses such as the Generalized Riemann Hypothesis yield density versions and bounds on exceptional sets; work by J.-M. Deshouillers, Henryk Iwaniec, and R. C. Vaughan refined these. The weak Goldbach problem was resolved by Harald Helfgott in 2013, combining refinements of the circle method with computational verification up to large bounds. Results on almost-primes, using modern sieve theory in the tradition of Atle Selberg and Enrico Bombieri, establish representations in which primes are replaced by numbers with few prime factors. Chen Jingrun proved that every sufficiently large even integer is the sum of a prime and a product of at most two primes (Chen's theorem), relying on deep sieve methods.
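The reduction between the two forms can be made concrete: for odd n > 5, the even number n − 3 ≥ 4 falls under the strong form, so one prime pair for n − 3 yields three primes for n. The sketch below (helper names invented for this example) makes this explicit; it assumes a pair for n − 3 exists, as computational verification confirms for every small n tried here.

```python
def is_prime(k: int) -> bool:
    """Trial-division primality test; adequate for small k."""
    if k < 2:
        return False
    if k % 2 == 0:
        return k == 2
    d = 3
    while d * d <= k:
        if k % d == 0:
            return False
        d += 2
    return True

def three_primes(n: int) -> tuple[int, int, int]:
    """Return primes (3, p, q) with 3 + p + q = n, for odd n > 5."""
    if n <= 5 or n % 2 == 0:
        raise ValueError("n must be an odd integer > 5")
    m = n - 3  # even and >= 4, so the strong form applies
    for p in range(2, m // 2 + 1):
        if is_prime(p) and is_prime(m - p):
            return 3, p, m - p
    raise AssertionError(f"no decomposition found for {n}")

print(three_primes(27))  # (3, 5, 19)
```

This shows why the weak form follows from the strong one for odd n > 5; the converse implication does not hold, which is why Helfgott's result leaves the strong conjecture open.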
Extensive verification has checked the conjecture up to very large bounds using clusters and supercomputers at companies, national laboratories, and universities. Projects employ optimized primality tests, including the Miller–Rabin primality test and its deterministic variants, together with sieving algorithms in the tradition of A. O. L. Atkin and D. J. Bernstein and implementations accelerated by the fast Fourier transform. Distributed-computing efforts have engaged communities similar to the Great Internet Mersenne Prime Search and built on libraries such as the GNU Multiple Precision Arithmetic Library and IEEE arithmetic standards. Direct computation has confirmed that every even integer up to at least 4 × 10^18 splits as a sum of two primes, bounding the size of any possible counterexample from below but not constituting a proof in the classical sense.
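A minimal sketch of this style of range verification, in plain Python rather than the optimized implementations described above: a Miller–Rabin test made deterministic by a fixed witness set (the first twelve primes, known to be exact for all 64-bit integers), driving a scan of one even range. Function names are illustrative.

```python
def miller_rabin(n: int) -> bool:
    """Deterministic Miller–Rabin for n < 2**64 using fixed witnesses."""
    if n < 2:
        return False
    small = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in small:
        if n % p == 0:
            return n == p
    # Write n - 1 = 2**s * d with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in small:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a witnesses compositeness
    return True

def verify_goldbach(lo: int, hi: int) -> bool:
    """Check every even n in [lo, hi] has a prime pair p + (n - p)."""
    for n in range(lo + lo % 2, hi + 1, 2):
        if not any(miller_rabin(p) and miller_rabin(n - p)
                   for p in range(2, n // 2 + 1)):
            return False
    return True

print(verify_goldbach(4, 2000))  # True: no counterexample in this range
```

Production verifications replace the naive inner loop with precomputed prime tables and sieved candidate lists; the logic per even n is the same.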
Major analytic approaches rely on the Hardy–Littlewood circle method, estimates for trigonometric sums due to Hermann Weyl, and bounds for exponential sums pioneered by Vinogradov and refined by later analysts, including Jean Bourgain and Terence Tao. Sieve-theoretic frameworks follow the lines of the Selberg sieve, the large sieve, and modern combinatorial sieves inspired by the additive combinatorics of Ben Green and Terence Tao. Connections to the Riemann zeta function, the Riemann Hypothesis, and the Generalized Riemann Hypothesis provide conditional routes in which zero-free regions and zero-density estimates constrain the distribution of primes. Probabilistic heuristics going back to Harald Cramér, together with the Hardy–Littlewood prime k-tuple conjectures, predict the expected number of representations. Computational and algorithmic number theory, supported by collaborative infrastructure, has been essential for pushing verification limits.
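The heuristic predictions can be probed numerically. The sketch below (function names invented for this example) counts unordered prime pairs r(n) with p + q = n via a Sieve of Eratosthenes; even n divisible by 3 carry a singular-series factor of (3 − 1)/(3 − 2) = 2 in the Hardy–Littlewood heuristic and show visibly richer representation counts, the striping seen in plots of the "Goldbach comet".

```python
from math import isqrt

def primes_upto(limit: int) -> set[int]:
    """Sieve of Eratosthenes, returned as a set for O(1) membership."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, isqrt(limit) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytes(len(sieve[i * i :: i]))
    return {i for i, flag in enumerate(sieve) if flag}

PRIMES = primes_upto(10_000)

def r(n: int) -> int:
    """Number of unordered prime pairs {p, q} with p + q = n."""
    return sum(1 for p in PRIMES if p <= n - p and (n - p) in PRIMES)

# Neighbouring even numbers: the multiple of 3 has markedly more pairs.
for n in (998, 1000, 1002):
    print(n, r(n))
```

This is only a consistency check against the heuristic, not evidence toward a proof; the counts fluctuate exactly as the singular series predicts.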
The conjecture is related to the twin prime conjecture, Polignac's conjecture, and the family of Hardy–Littlewood conjectures on prime k-tuples. A proof would influence additive combinatorics, cryptography (via assumptions about prime gaps), and the structural understanding of the distribution of primes, in the spirit of breakthroughs such as the Green–Tao theorem on arithmetic progressions. Conditional implications connect to the Generalized Riemann Hypothesis and to potential refinements of the error terms in the prime number theorem. Partial analogues exist for primes in arithmetic progressions, linked to Dirichlet's theorem, and for algebraic integers in settings studied at institutions such as the Institute for Advanced Study and the University of Cambridge.