| RSA (cryptosystem) | |
|---|---|
| Name | RSA |
| Developer | Ron Rivest, Adi Shamir, Leonard Adleman |
| Introduced | 1977 |
| Type | Public-key cryptosystem |
| Based on | Integer factorization problem |
| Related | Diffie–Hellman key exchange, ElGamal encryption, Paillier cryptosystem |
RSA is a public-key cryptosystem introduced in 1977 by Ron Rivest, Adi Shamir, and Leonard Adleman. It provides asymmetric encryption and digital signatures using number-theoretic operations on large integers, supplying confidentiality and authentication in many Internet standards and protocols. RSA's security rests on the difficulty of integer factorization, and the system has influenced research in computational number theory and cryptanalysis as well as the work of standards bodies such as the Internet Engineering Task Force (IETF) and the National Institute of Standards and Technology (NIST).
RSA was announced in a 1977 paper by Rivest, Shamir, and Adleman, then at the Massachusetts Institute of Technology. It followed the publication of Diffie–Hellman key exchange by Whitfield Diffie and Martin Hellman (1976) and earlier, later-declassified work on public-key ideas by James H. Ellis, Clifford Cocks, and Malcolm J. Williamson at the Government Communications Headquarters (GCHQ). RSA quickly gained attention in academia and industry, leading to the founding of RSA Data Security (later RSA Security) and shaping standards such as Secure Sockets Layer and Transport Layer Security. In the 1990s, U.S. export-control policy on cryptography, administered in part by the Department of State, sparked high-profile disputes affecting vendors such as Netscape Communications Corporation and advocacy organizations such as the Electronic Frontier Foundation.
RSA is grounded in number theory, particularly properties of prime numbers, modular arithmetic, and Euler's theorem. Key components include two large distinct primes p and q (typically generated with probabilistic primality tests such as Miller–Rabin and Baillie–PSW), their product n = pq, and Euler's totient φ(n) = (p−1)(q−1). The public exponent e is chosen coprime to φ(n), and the private exponent d satisfies d ≡ e^{−1} (mod φ(n)). Correctness follows from Euler's theorem and the Chinese remainder theorem (CRT), and implementations exploit the CRT for faster private-key operations. The underlying hardness assumption is the difficulty of the integer factorization problem, attacked by algorithms such as the General Number Field Sieve and studied by researchers including John Pollard and Carl Pomerance.
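The key relations above can be checked with a toy worked example (the small primes 61 and 53 are illustrative textbook values; real keys use primes of 1024 bits or more):

```python
# Toy illustration of the RSA key relations: assumed small primes for readability.
p, q = 61, 53
n = p * q                   # modulus n = pq = 3233
phi = (p - 1) * (q - 1)     # Euler's totient φ(n) = (p-1)(q-1) = 3120
e = 17                      # public exponent, coprime to φ(n)
d = pow(e, -1, phi)         # private exponent d ≡ e^{-1} (mod φ(n)); Python 3.8+

m = 65                      # sample message, 0 <= m < n
c = pow(m, e, n)            # "encrypt": c ≡ m^e (mod n)
assert pow(c, d, n) == m    # Euler's theorem guarantees recovery of m
```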
Key generation begins with selecting primes p and q using probabilistic primality tests developed by Gary Miller and Michael O. Rabin (the Miller–Rabin test); the deterministic Agrawal–Kayal–Saxena (AKS) test also exists but is rarely used in practice because it is much slower. Compute n = pq and φ(n), pick e coprime to φ(n) (commonly 65537, a Fermat prime whose low Hamming weight makes modular exponentiation fast), and compute d as the modular inverse of e via the extended Euclidean algorithm. Encryption of a (padded) message m computes c ≡ m^e (mod n), while decryption computes m ≡ c^d (mod n), often accelerated with the CRT using p and q. Real-world use requires the padding and formatting standards of PKCS#1, shaped by IETF and ISO/IEC committees.
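The steps above can be sketched in Python as textbook RSA without padding (function names are illustrative assumptions, and raw RSA like this must never be used in production):

```python
import secrets

def is_probable_prime(n, rounds=40):
    """Miller–Rabin probabilistic primality test (sketch)."""
    if n < 2:
        return False
    for small in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % small == 0:
            return n == small
    d, r = n - 1, 0
    while d % 2 == 0:           # write n - 1 = 2^r * d with d odd
        d //= 2
        r += 1
    for _ in range(rounds):
        a = secrets.randbelow(n - 3) + 2     # random base in [2, n-2]
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False        # a witnesses that n is composite
    return True

def random_prime(bits):
    while True:
        cand = secrets.randbits(bits) | (1 << (bits - 1)) | 1  # odd, full length
        if is_probable_prime(cand):
            return cand

def generate_keypair(bits=512):  # 512-bit primes -> 1024-bit n; toy size only
    e = 65537
    while True:
        p, q = random_prime(bits), random_prime(bits)
        phi = (p - 1) * (q - 1)
        if p != q and phi % e != 0:   # ensure gcd(e, φ(n)) = 1 (e is prime)
            break
    d = pow(e, -1, phi)               # extended-Euclid inverse, built in
    return (p * q, e), (p * q, d)     # (public key, private key)

(n, e), (_, d) = generate_keypair()
m = 42
c = pow(m, e, n)                      # encrypt: c ≡ m^e (mod n)
assert pow(c, d, n) == m              # decrypt: m ≡ c^d (mod n)
```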
Security relies primarily on the intractability of factoring n into p and q; advances in factoring algorithms such as the Quadratic Sieve and the General Number Field Sieve directly affect key-size recommendations from NIST and ENISA. Cryptanalytic attacks include direct integer factorization, low-exponent attacks (e.g., Håstad's broadcast attack), adaptive chosen-ciphertext attacks exemplified by Bleichenbacher's attack on PKCS#1 v1.5, and side-channel attacks on implementations (timing, power analysis) explored by researchers such as Paul Kocher and Daniel J. Bernstein. Quantum algorithms, notably Shor's algorithm proposed by Peter Shor, threaten RSA by enabling polynomial-time factorization on a sufficiently large universal quantum computer, prompting post-quantum migration efforts such as NIST's Post-Quantum Cryptography standardization project.
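The low-exponent pitfall behind Håstad-style attacks can be illustrated with unpadded e = 3: if the message is small enough that m^3 < n, the modular reduction never happens and an attacker recovers m with an integer cube root (the modulus and message below are illustrative toy values):

```python
def icbrt(x):
    """Integer cube root by binary search: smallest v with v**3 >= x."""
    lo, hi = 0, 1 << ((x.bit_length() + 2) // 3 + 1)
    while lo < hi:
        mid = (lo + hi) // 2
        if mid ** 3 < x:
            lo = mid + 1
        else:
            hi = mid
    return lo

p = 2**89 - 1        # Mersenne primes, used here only to build a toy modulus
q = 2**127 - 1
n = p * q            # ~216-bit modulus
e = 3
m = 123456789        # small message: m**3 < n, so c is just m**3 over the integers
c = pow(m, e, n)
assert icbrt(c) == m  # attacker recovers m without knowing the private key
```

This is why padding schemes such as OAEP randomize and expand the message before exponentiation.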
Implementations appear in libraries and products such as OpenSSL, LibreSSL, and GnuTLS, and in hardware security modules from vendors including Thales Group and Entrust. Performance considerations drive choices of key size (2048-bit and 4096-bit are common), public exponent (small values like 65537 for speed), and the use of the CRT to accelerate decryption. Software optimizations employ fast multiplication algorithms (Karatsuba, Toom–Cook, and Schönhage–Strassen) and assembly-level implementations of RSA primitives on processors from Intel and ARM. Compliance and interoperability are governed by standards such as PKCS#1 and X.509 and by protocols such as HTTPS and S/MIME.
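The CRT decryption speedup mentioned above can be sketched as follows: two half-size exponentiations mod p and mod q replace one full-size exponentiation mod n, then the results are recombined (the helper name is illustrative; the toy key reuses textbook values):

```python
def crt_decrypt(c, p, q, d):
    """RSA decryption via the Chinese remainder theorem (sketch)."""
    dp, dq = d % (p - 1), d % (q - 1)   # reduced private exponents
    q_inv = pow(q, -1, p)               # CRT coefficient, precomputable
    mp = pow(c % p, dp, p)              # m mod p
    mq = pow(c % q, dq, q)              # m mod q
    h = (q_inv * (mp - mq)) % p         # Garner-style recombination
    return mq + h * q                   # unique m in [0, pq)

# Toy check against plain decryption with the textbook key p=61, q=53, e=17.
p, q = 61, 53
n, d = p * q, 2753
c = pow(65, 17, n)
assert crt_decrypt(c, p, q, d) == pow(c, d, n) == 65
```

With real key sizes the two half-size exponentiations give roughly a fourfold speedup, which is why PKCS#1 private keys carry dp, dq, and q_inv alongside d.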
Numerous variants and extensions adapt RSA to different goals: the RSASSA-PSS signature scheme, standardized to provide provable security against chosen-message attacks; the deterministic EMSA-PKCS1-v1_5 encoding used in legacy signatures; multi-prime RSA, which uses more than two primes to speed decryption; and threshold RSA, which distributes key shares and is studied in the threshold cryptography literature (e.g., Victor Shoup's practical threshold signatures). Hybrid schemes combine RSA with symmetric ciphers such as AES in IETF-standardized protocols to balance key distribution and bulk encryption performance. Post-quantum research, driven by groups at NIST and universities such as MIT and the University of Waterloo, investigates alternatives to RSA such as lattice-based and code-based cryptosystems.