| Pseudorandomness | |
|---|---|
| Name | Pseudorandomness |
| Field | Computer science, Mathematics, Cryptography |
| Introduced | 20th century |
| Notable figures | Alan Turing, John von Neumann, Claude Shannon, Donald Knuth, Andrew Yao, Oded Goldreich, Silvio Micali, Mihir Bellare, Shafi Goldwasser, Ronald Rivest, Adi Shamir, Leonard Adleman, Whitfield Diffie, Martin Hellman |
Pseudorandomness refers to deterministic processes or sequences that exhibit statistical properties similar to those produced by truly random sources. It arises at the intersection of Alan Turing-era computation theory, John von Neumann's numerical methods, and Claude Shannon's information theory, and underpins modern cryptographic protocols and randomized algorithms developed by researchers such as Andrew Yao and Oded Goldreich.
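As a minimal illustration of the idea (a sketch, not a construction endorsed by any source named here), a linear congruential generator is a deterministic recurrence whose output may look irregular while being fully reproducible from its seed; the constants below are illustrative only, not a vetted parameter choice.

```python
# Minimal linear congruential generator (LCG): the deterministic recurrence
# x_{n+1} = (a * x_n + c) mod m produces "random-looking" output that is
# entirely determined by the seed. Constants are illustrative, not cryptographic.
class LCG:
    def __init__(self, seed: int, a: int = 6364136223846793005,
                 c: int = 1442695040888963407, m: int = 2**64):
        self.state = seed % m
        self.a, self.c, self.m = a, c, m

    def next(self) -> int:
        # The same seed always reproduces the same sequence: determinism.
        self.state = (self.a * self.state + self.c) % self.m
        return self.state

gen = LCG(seed=42)
print([gen.next() % 100 for _ in range(5)])  # irregular-looking, yet reproducible
```

Generators of this kind are adequate for simulations but are predictable to cryptographic distinguishers, which motivates the stronger notions below.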
The concept formalizes when a generator or object is indistinguishable from a random counterpart by efficient distinguishers, modeled using Turing-machine computation and the complexity classes studied by Stephen Cook and Leonid Levin; it connects to the unpredictability definitions of Shafi Goldwasser and Silvio Micali, to Andrew Yao's next-bit test, and to entropy notions developed by Claude Shannon and Andrey Kolmogorov. Canonical notions include pseudorandom sequences, pseudorandom functions, and pseudorandom permutations, studied by researchers including Mihir Bellare, Ronald Rivest, Adi Shamir, Oded Goldreich, and Silvio Micali. Other influential figures and institutions include Donald Knuth, Whitfield Diffie, Martin Hellman, Leonard Adleman, the National Institute of Standards and Technology, the European Organization for Nuclear Research, the Massachusetts Institute of Technology, Stanford University, the University of California, Berkeley, Princeton University, Harvard University, the University of Cambridge, the University of Oxford, the Weizmann Institute of Science, the Technion – Israel Institute of Technology, the École Normale Supérieure, and the Institut des Hautes Études Scientifiques.
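In standard notation (sketched here conventionally rather than quoted from a particular source), a deterministic function $G$ stretching $n$-bit seeds to $m(n) > n$ output bits is a pseudorandom generator if every probabilistic polynomial-time distinguisher $D$ has only negligible advantage in telling its output from uniform randomness:

$$
\left|\; \Pr_{s \leftarrow \{0,1\}^{n}}\!\left[D\bigl(G(s)\bigr) = 1\right] \;-\; \Pr_{r \leftarrow \{0,1\}^{m(n)}}\!\left[D(r) = 1\right] \;\right| \;\le\; \mathrm{negl}(n).
$$

Yao's equivalence between this indistinguishability notion and next-bit unpredictability is what links the two families of definitions mentioned above.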
Mathematical foundations draw on the complexity theory of Stephen Cook and Richard Karp, combinatorics in the tradition of Juraj Hromkovič, and probabilistic-method techniques advanced by Paul Erdős, Alfréd Rényi, and Noga Alon. Foundational hardness assumptions relate to problems such as integer factorization, studied by Ronald Rivest, Adi Shamir, and Leonard Adleman; the discrete logarithm problem underlying the work of Whitfield Diffie and Martin Hellman; lattice problems linked to Miklós Ajtai and Chris Peikert; and coding-theoretic hardness in the tradition of Vladimir Levenshtein. Information-theoretic metrics originate with Claude Shannon, while algorithmic and computational notions of entropy were pursued by Andrey Kolmogorov and later expanded by Oded Goldreich and Salil Vadhan. Relevant complexity classes and reductions draw on Michael Sipser, Leslie Valiant, and recursion theory inspired by Kurt Gödel, and pseudorandomness ties into the derandomization programs of Noam Nisan and David Zuckerman.
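To illustrate how a hardness assumption can drive a generator, the following is a toy sketch in the spirit of the Blum–Blum–Shub construction, whose security rests on the presumed difficulty of factoring; the construction is named here only as an example of a factoring-based design, and the tiny primes and seed are purely illustrative.

```python
# Toy Blum-Blum-Shub-style generator: x_{i+1} = x_i^2 mod N, output the low bit.
# Security would rest on the hardness of factoring N = p*q; the primes below are
# absurdly small and for illustration only (a real instance needs large, secret
# primes with p ≡ q ≡ 3 mod 4).
from math import gcd

p, q = 499, 547            # toy primes, both congruent to 3 mod 4
N = p * q

def bbs_bits(seed: int, count: int) -> list:
    assert gcd(seed, N) == 1, "seed must be coprime to N"
    x = (seed * seed) % N      # start from a quadratic residue
    bits = []
    for _ in range(count):
        x = (x * x) % N        # repeated squaring modulo N
        bits.append(x & 1)     # emit the least-significant bit
    return bits

print(bbs_bits(seed=2023, count=16))
```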
Pseudorandom generators (PRGs) and pseudorandom functions (PRFs) were formalized through constructions and proofs by Andrew Yao, Oded Goldreich, Silvio Micali, Shafi Goldwasser, Mihir Bellare, and Ronald Rivest. Constructions often assume the hardness of number-theoretic problems studied by Peter Shor and Daniel Shanks, and of lattice problems investigated by Miklós Ajtai, Oded Goldreich, and their collaborators. Practical PRFs include block-cipher-based designs built on Feistel networks, named after Horst Feistel, and standardized through the National Institute of Standards and Technology; theoretical PRGs engage with derandomization results from Noam Nisan, David Zuckerman, László Babai, Avi Wigderson, Russell Impagliazzo, and Mihir Bellare.
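As a hedged practical example of the PRF idea, HMAC with SHA-256 (whose security analysis involves Mihir Bellare and coauthors) is commonly modeled as a pseudorandom function when keyed with a secret key; the key and input labels below are placeholders for illustration.

```python
# PRF-style usage: for a fixed secret key, HMAC-SHA256 maps arbitrary inputs to
# 256-bit outputs that are deterministic for the key holder but unpredictable
# to anyone without the key (under standard assumptions about the hash).
import hmac
import hashlib
import secrets

key = secrets.token_bytes(32)          # secret PRF key (placeholder)

def prf(key: bytes, message: bytes) -> bytes:
    # F_key(message): same input -> same output, given the same key.
    return hmac.new(key, message, hashlib.sha256).digest()

print(prf(key, b"counter-0").hex())
print(prf(key, b"counter-1").hex())
```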
Measures include statistical tests and indistinguishability against adversaries in complexity classes such as NP and BPP, as studied by Stephen Cook and Michael Sipser; practical test suites originate from standards bodies such as the National Institute of Standards and Technology and from academic toolkits influenced by Donald Knuth's randomness tests. Key analytic tools come from Fourier analysis on Boolean functions, concentration inequalities in the probabilistic-method tradition of Paul Erdős and his collaborators, and spectral techniques from graph theory in the tradition of Noga Alon. Notable test paradigms include Andrew Yao's next-bit test, algebraic linear-complexity measures, and the unpredictability notions of Shafi Goldwasser and Silvio Micali.
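A minimal sketch of one such statistical test, modeled loosely on the frequency (monobit) test from the NIST SP 800-22 suite: it only checks the balance of zeros and ones, so passing it is necessary but far from sufficient evidence of randomness.

```python
# Frequency (monobit) test sketch: map bits 0/1 to -1/+1, sum them, and convert
# the normalized deviation into a p-value; very small p-values indicate bias.
import math
import random

def monobit_p_value(bits: list) -> float:
    n = len(bits)
    s = sum(2 * b - 1 for b in bits)           # 0 -> -1, 1 -> +1
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))     # two-sided tail probability

random.seed(0)
sample = [random.getrandbits(1) for _ in range(10_000)]
print(f"p-value: {monobit_p_value(sample):.3f}")  # typically well above 0.01
```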
Pseudorandomness powers encryption schemes stemming from Ronald Rivest, Adi Shamir, and Leonard Adleman's public-key work; key exchange protocols from Whitfield Diffie and Martin Hellman; and practical systems developed in National Security Agency-affiliated research and in academic labs at MIT, Stanford University, and the University of California, Berkeley. Randomized algorithms that benefit include primality tests influenced by Gary Miller and by Manindra Agrawal's AKS work, hashing schemes by Donald Knuth and Robert Sedgewick, Monte Carlo methods used in computational physics at CERN, and streaming algorithms inspired by Michael Mitzenmacher and Edo Liberty. Pseudorandom constructions are essential for zero-knowledge proofs from Shafi Goldwasser and Silvio Micali, multiparty computation research by Oded Goldreich and Moni Naor, and blockchain protocols in research communities following Satoshi Nakamoto's work.
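As an illustrative sketch of a randomized algorithm that consumes (pseudo)random bits, the Miller–Rabin probabilistic primality test in the lineage of Gary Miller's work draws random bases; the round count below is an arbitrary illustrative parameter that trades speed against error probability.

```python
# Miller-Rabin probabilistic primality test: each randomly chosen base either
# proves n composite or leaves it "probably prime"; more rounds shrink the
# error probability.
import random

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)         # (pseudo)random base
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                        # witness found: composite
    return True                                 # probably prime

print(is_probable_prime(2**61 - 1))             # a known Mersenne prime -> True
```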
Techniques include hardness amplification from Russell Impagliazzo and Avi Wigderson, extractors by Noam Nisan and David Zuckerman, and reductions in the style of Stephen Cook and Richard Karp. Complexity-theoretic barriers and results relate to derandomization programs by László Babai, Noam Nisan, and Avi Wigderson, and to hardness assumptions studied by Peter Shor and Miklós Ajtai. Algebraic methods employ finite field theory in the historical lineage of Évariste Galois and coding-theory constructions advanced by Robert Gallager, Peter Elias, and Elwyn Berlekamp. Recent progress connects to quantum-resistant constructions, motivated by the quantum algorithms of Peter Shor and Lov Grover and advanced by lattice-based cryptography proponents such as Chris Peikert.
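As a simple precursor to the extractors mentioned above, John von Neumann's classic debiasing trick turns independent but biased bits into unbiased ones by keeping only unequal pairs; the 80% bias in the sketch below is assumed purely for demonstration.

```python
# Von Neumann debiasing: for independent bits with a fixed bias, the pairs
# (0,1) and (1,0) are equally likely, so emitting the first bit of each unequal
# pair yields unbiased output (at the cost of discarding many input bits).
import random

def von_neumann_extract(bits: list) -> list:
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)          # 01 -> 0, 10 -> 1
    return out

random.seed(1)
biased = [1 if random.random() < 0.8 else 0 for _ in range(1000)]  # ~80% ones
unbiased = von_neumann_extract(biased)
print(sum(biased) / len(biased), sum(unbiased) / max(1, len(unbiased)))
```

Modern extractors in the Nisan–Zuckerman line handle far weaker sources than the independent, identically biased bits this pairwise trick requires.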
Implementations range from software libraries at Google and Microsoft Research to hardware random number generators in devices from Intel Corporation and ARM Holdings, with standardization by the National Institute of Standards and Technology and the Internet Engineering Task Force. Limitations arise from side-channel vulnerabilities studied by Paul Kocher and Dan Boneh, from seed-management failures documented in incidents involving the Linux and OpenSSL projects, and from theoretical constraints such as impossibility results tied to work by Michael Sipser and Juraj Hromkovič. Ongoing research spans post-quantum resilience pursued through National Institute of Standards and Technology standardization efforts and by industry groups at the European Telecommunications Standards Institute.
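A hedged sketch of the seed-management point: drawing seeds and keys from the operating system's entropy pool (here via Python's secrets module) avoids the predictable-seed failures alluded to above; the names and sizes below are illustrative.

```python
# Seed management sketch: a time-based seed is guessable, while the OS CSPRNG
# (exposed through the secrets module) is the conventional source for seeds
# and keys. Even a well-seeded general-purpose PRNG remains unsuitable for keys.
import random
import secrets
import time

weak_seed = int(time.time())            # guessable: nearby timestamps are easy to enumerate
strong_seed = secrets.randbits(256)     # drawn from the operating system's entropy pool

rng = random.Random(strong_seed)        # fine for simulations, not for cryptography
session_key = secrets.token_bytes(32)   # cryptographic material comes from the CSPRNG
print(len(session_key), rng.random())
```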