| Homomorphic encryption | |
|---|---|
| Name | Homomorphic encryption |
| Introduced | 2009 (fully homomorphic concept) |
| Field | Cryptography |
| Related | Lattice-based cryptography, RSA (cryptosystem), Paillier cryptosystem |
Homomorphic encryption is a class of cryptographic techniques that enables computation on encrypted data without revealing the plaintext to the computing party. It builds on decades of work in public-key cryptography, lattice-based constructions, and computational hardness assumptions, allowing operations such as addition and multiplication to be performed directly on ciphertexts so that the decrypted result matches the same operation applied to the underlying plaintexts. Development of these techniques has involved major research programs and institutions including IBM, Microsoft Research, MIT, Stanford University, and the University of California, Berkeley.
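As a concrete illustration of the ciphertext-operation property, textbook (unpadded) RSA is multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the plaintexts. A minimal sketch with toy parameters chosen here for illustration only (insecure, no padding):

```python
# Toy demonstration of the multiplicative homomorphism of textbook RSA.
# Parameters are tiny and the scheme is unpadded: illustration only, not secure.

p, q = 61, 53                       # small example primes
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+ modular inverse)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

m1, m2 = 7, 12
c1, c2 = encrypt(m1), encrypt(m2)

# Multiplying ciphertexts multiplies the underlying plaintexts (mod n):
c_prod = (c1 * c2) % n
assert decrypt(c_prod) == (m1 * m2) % n   # 84
```

Because the homomorphism holds only for multiplication, unpadded RSA is "partially" homomorphic; a scheme supporting both addition and multiplication on ciphertexts is required for arbitrary computation.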
Early conceptual groundwork for computing on concealed data drew on ideas from pioneers such as Claude Shannon and later on practical public-key schemes such as RSA (cryptosystem) and Diffie–Hellman key exchange. Work in the 1970s and 1980s at places like Bell Labs and MIT on secure multiparty computation and privacy-preserving protocols influenced subsequent research presented at IACR conferences and workshops. A breakthrough came in 2009, when Craig Gentry published the first construction of a fully homomorphic encryption scheme, enabling arbitrary computation on encrypted data; this catalyzed activity at IBM Research and academic labs at Harvard University and Princeton University, and led to rapid developments across projects at Microsoft Research and the National Institute of Standards and Technology. Follow-on advances engaged cryptographers associated with ETH Zurich, the University of Toronto, Columbia University, the University of Washington, and European centers such as École Polytechnique Fédérale de Lausanne.
Foundational concepts draw from primitives and results in lattice-based cryptography, ring theory, and complexity assumptions linked to problems like the Shortest Vector Problem and the Learning With Errors problem. Core definitions formalize properties such as semantic security in the style of Goldwasser–Micali and security reductions to hardness results considered by researchers at Princeton University and Cornell University. Schemes are classified by their algebraic capabilities (e.g., supporting addition or multiplication) and by formal models developed in publications from IACR proceedings and texts associated with Stanford University Press. Definitions of ciphertexts, keys, and evaluation functions reference constructions introduced in papers from labs including IBM, Microsoft Research, and University of California, Berkeley.
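The Learning With Errors problem mentioned above underlies most modern fully homomorphic constructions. A toy Regev-style bit encryption, with illustrative parameters far too small to be secure, shows the key mechanism: componentwise ciphertext addition adds the plaintext bits (mod 2), while the noise terms accumulate:

```python
import random

# Toy LWE-based bit encryption (Regev-style). Tiny dimension and modulus:
# illustration only, not secure.
n, q = 8, 257                                  # lattice dimension, modulus
s = [random.randrange(q) for _ in range(n)]    # secret key vector

def encrypt(bit):
    a = [random.randrange(q) for _ in range(n)]
    e = random.randrange(-2, 3)                # small noise term
    b = (sum(ai * si for ai, si in zip(a, s)) + e + bit * (q // 2)) % q
    return (a, b)

def decrypt(ct):
    a, b = ct
    v = (b - sum(ai * si for ai, si in zip(a, s))) % q
    # Noise leaves v near 0 (bit 0) or near q//2 (bit 1):
    return 1 if q // 4 <= v < 3 * q // 4 else 0

def add(ct1, ct2):
    # Componentwise ciphertext addition adds the plaintext bits mod 2,
    # at the cost of summing the noise terms.
    a1, b1 = ct1
    a2, b2 = ct2
    return ([(x + y) % q for x, y in zip(a1, a2)], (b1 + b2) % q)

c0, c1 = encrypt(0), encrypt(1)
assert decrypt(add(c0, c1)) == 1   # 0 XOR 1
assert decrypt(add(c1, c1)) == 0   # 1 XOR 1; noise doubles but stays small
```

Noise growth is the central obstacle: each homomorphic operation increases the error, and once it exceeds roughly q/4 decryption fails. Fully homomorphic schemes manage this with techniques such as Gentry's bootstrapping.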
Multiple families of schemes emerged from academic and industrial teams at IBM Research, Microsoft Research, MIT, and UC Berkeley. Partially homomorphic schemes support a single operation: unpadded RSA (cryptosystem) is multiplicatively homomorphic, while the additively homomorphic Paillier cryptosystem was developed by researchers with ties to École Polytechnique and Université catholique de Louvain. Somewhat homomorphic schemes, which support limited combinations of additions and multiplications, were advanced in publications from Stanford University and Harvard University, while leveled and fully homomorphic constructions were proposed and optimized by groups at IBM Research and Microsoft Research. Specific implementations and optimizations have come from teams at Duality Technologies, Zama, Enveil, and Google research groups, as well as open-source libraries maintained by collaborations including MIT and ETH Zurich.
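The Paillier cryptosystem's additive homomorphism can be sketched directly: multiplying two ciphertexts modulo n² yields an encryption of the sum of the plaintexts, and exponentiation by a constant scales the plaintext. Toy parameters, illustration only:

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic). Tiny primes: insecure.
p, q = 61, 53
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)        # Carmichael function lambda(n)
g = n + 1                           # standard choice of generator
mu = pow(lam, -1, n)                # valid since L(g^lam mod n^2) = lam mod n

def L(u):
    return (u - 1) // n

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

m1, m2 = 123, 456
c1, c2 = encrypt(m1), encrypt(m2)

# Multiplying Paillier ciphertexts adds the plaintexts modulo n:
assert decrypt((c1 * c2) % n2) == (m1 + m2) % n

# Raising a ciphertext to a constant multiplies the plaintext by it:
assert decrypt(pow(c1, 5, n2)) == (5 * m1) % n
```

Because only addition (and scalar multiplication) of plaintexts is supported, Paillier is partially homomorphic; it nonetheless suffices for applications such as private sums and electronic voting tallies.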
Security analyses rely on reductions to worst-case hardness assumptions studied by scholars from Princeton University, Columbia University, and UC Berkeley. Central mathematical foundations include lattice problems linked to the Learning With Errors problem and number-theoretic assumptions examined by researchers at Stanford University and Harvard University. Proof techniques build on Goldwasser–Micali-style semantic security arguments and are peer-reviewed at venues such as CRYPTO and EUROCRYPT, where contributors from ETH Zurich, Cornell University, and the University of Waterloo publish. Cryptanalytic work from groups at NIST and academic centers such as the University of Cambridge informs parameter selection and threat models, while side-channel analyses have been performed by teams at the University of California, Santa Barbara and Nanyang Technological University.
Implementations have been produced by industry labs and academic consortia including IBM Research, Microsoft Research, Google, Duality Technologies, and open-source projects associated with MIT and Zama. Performance engineering draws on optimizations from high-performance computing teams at Argonne National Laboratory and compiler research at Carnegie Mellon University. Benchmarks and toolkits reported at meetings hosted by IACR and NIST compare throughput, latency, and memory trade-offs; contributors include scientists from Intel Corporation, NVIDIA, ARM Holdings, and universities such as University of Illinois Urbana–Champaign and University of Toronto. Hardware acceleration proposals cite work from Intel labs and collaborations with Lawrence Livermore National Laboratory and Sandia National Laboratories.
Use cases span data analytics, privacy-preserving machine learning, secure outsourcing, and regulated data processing, with pilot projects at IBM, Microsoft, Google, Amazon Web Services, and startups such as Enveil and Duality Technologies. Domain-specific deployments have been explored in healthcare research at Mayo Clinic, Johns Hopkins University, and Massachusetts General Hospital; in finance experiments at institutions including JPMorgan Chase and Goldman Sachs; and in governmental pilots involving agencies such as the National Aeronautics and Space Administration and the Department of Homeland Security. Collaborative projects at CERN and climate modeling groups at NOAA and NASA investigate secure computation on sensitive scientific datasets. Standards and interoperability work involves bodies such as NIST and industry consortia including IETF and IEEE working groups.
Main challenges include computational overhead, parameter selection, and integration with legal and compliance frameworks involving regulators such as the European Commission and standards bodies like NIST. Research directions pursued at MIT, Stanford University, ETH Zurich, and the University of California, Berkeley focus on reducing noise growth, improving key management in deployments by organizations such as Microsoft and IBM, and hybrid architectures that combine secure enclaves such as Intel SGX with homomorphic techniques. Future work includes hardware-software co-design advocated by teams at Lawrence Berkeley National Laboratory and algorithmic improvements proposed at conferences like CRYPTO and EUROCRYPT to broaden practical adoption across sectors, exemplified by pilots at Mayo Clinic and JPMorgan Chase.