LLMpedia: the first transparent, open encyclopedia generated by LLMs

MPC

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Expansion funnel: Raw 61 → Dedup 0 → NER 0 → Enqueued 0
MPC
Name: MPC
Type: Cryptographic protocol suite
Founded: 1980s
Fields: Cryptography, Computer Science, Information Security
Notable works: Yao's Garbled Circuits, Shamir's Secret Sharing, GMW Protocol

MPC

MPC (secure multiparty computation) is a cryptographic paradigm enabling multiple parties to jointly compute a function over their private inputs while keeping those inputs confidential. The field intersects with work by researchers at Bell Labs, MIT, Harvard University, Microsoft Research, and IBM Research, and builds on foundational results including Yao's Millionaires' Problem, Shamir's secret sharing, and the GMW protocol. Reported practical deployments involve collaborations among DARPA, the EU Horizon 2020 programme, the World Bank, Facebook, and Goldman Sachs.

Definition and terminology

Multiparty computation denotes protocols in which participants such as organizations, companies, banks, hospitals, and research labs jointly evaluate a computation without revealing their private data to one another. Key terminology originates in papers by Andrew Yao, Adi Shamir, Oded Goldreich, and Silvio Micali: honest-but-curious (passive) adversaries, who follow the protocol but try to infer extra information from what they observe, as modeled in early work at IBM Research; and malicious (active) adversaries, who may deviate from the protocol arbitrarily, as treated in protocols by teams at ETH Zurich and RSA Laboratories. Other standard terms derive from the Universal Composability framework developed by groups at MIT and the Weizmann Institute, and from cryptographic primitives such as Shamir's secret sharing and threshold schemes rooted in Claude Shannon-style information-theoretic security.
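The core idea above — jointly evaluating a computation without revealing private data — can be illustrated with a minimal additive secret-sharing sketch. This is an illustrative toy (the modulus, party count, and function names are this sketch's own choices, not anything from a specific MPC framework): each party splits its input into random shares summing to the input mod p, so any proper subset of shares reveals nothing, yet the shares can be locally added and combined to disclose only the total.

```python
# Toy additive secret sharing over a prime field (semi-honest setting).
# Illustrative sketch only: no networking, no malicious-adversary checks.
import secrets

P = 2**61 - 1  # Mersenne prime chosen here as the field modulus

def share(value, n_parties):
    """Split `value` into n additive shares that sum to value mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def secure_sum(inputs):
    """Each party shares its input; party j locally adds the j-th share
    of every input, and only the combined partial sums are revealed."""
    n = len(inputs)
    all_shares = [share(x, n) for x in inputs]
    partials = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]
    return sum(partials) % P

# Three parties learn their total salary without revealing individual values.
salaries = [61_000, 74_500, 58_250]
assert secure_sum(salaries) == sum(salaries) % P
```

Any single share (or any set of fewer than all n shares) is uniformly random, which is what makes the passive-adversary guarantee information-theoretic rather than computational.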

History and development

Origins trace to Yao's Millionaires' Problem and the seminal 1980s papers by Andrew Yao and contemporaries at Bell Labs and Stanford University. The 1980s and 1990s saw the emergence of protocols such as the GMW protocol and Shamir's secret sharing, with security formulations advanced by scholars at RSA Laboratories, the Technion, and Princeton University. In the 2000s, efficiency improvements integrated techniques from homomorphic encryption research at Microsoft Research and IBM Research, while applied deployments were motivated by initiatives from DARPA and the European Commission. In recent years the field has drawn on Google's privacy research, Facebook's data collaboration projects, and open-source work from academic groups at ETH Zurich and the University of California, Berkeley.

Mathematical foundations and algorithms

Foundations draw on algebraic structures and complexity-theoretic assumptions studied at Princeton University, Stanford University, Harvard University, and the University of Cambridge. Core primitives include secret sharing schemes such as Shamir's, based on polynomial interpolation over finite fields, and Yao's garbled circuits, with composition theorems formalized in the Universal Composability framework by researchers at MIT and the Weizmann Institute. Protocol families rely on hardness assumptions such as factoring and the discrete logarithm problem, analyzed by teams at RSA Laboratories and École Polytechnique Fédérale de Lausanne; lattice-based alternatives draw on research at NIST and Google for post-quantum resilience. Algorithms include oblivious transfer protocols derived from work at IBM Research and Microsoft Research, zero-knowledge proofs popularized by scholars at Zcash-affiliated labs and Princeton University, and preprocessing models influenced by contributions from the École normale supérieure and the University of Tartu.
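Shamir's scheme mentioned above can be sketched concretely. In a (t, n) scheme the secret is the constant term of a random degree t-1 polynomial over a prime field; any t evaluation points reconstruct it by Lagrange interpolation at x = 0, while t-1 points reveal nothing. The prime and function names below are this sketch's own illustrative choices.

```python
# Toy Shamir (t, n) secret sharing over GF(p).
import secrets

P = 2**127 - 1  # Mersenne prime chosen as an illustrative field modulus

def make_shares(secret, t, n):
    """Shares are points (x, f(x)) of a random degree t-1 polynomial
    whose constant term is the secret."""
    coeffs = [secret % P] + [secrets.randbelow(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 from any t distinct shares."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P          # product of (0 - xj)
                den = den * (xi - xj) % P      # product of (xi - xj)
        # field division via Fermat's little theorem: den^(P-2) = den^-1 mod P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

shares = make_shares(123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 of 5 shares suffice
assert reconstruct(shares[1:4]) == 123456789
```

The threshold property follows from polynomial interpolation: t points determine a degree t-1 polynomial uniquely, while t-1 points are consistent with every possible constant term, giving information-theoretic secrecy.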

Applications and use cases

Use cases span finance, healthcare, and governance, with pilots involving institutions such as Goldman Sachs, JPMorgan Chase, the World Bank, the Centers for Disease Control and Prevention, and the National Institutes of Health. Privacy-preserving analytics enable consortia such as those formed by Pfizer and academic hospitals, while secure auctions and benchmarking draw interest from NASDAQ and the European Central Bank. Data-sharing initiatives involving UNICEF and UN agencies explore secure statistics; identity and credential systems leverage standards work at the IETF and implementations tested by Mozilla and Google. Machine learning collaborations reference models trained across datasets from Stanford Medicine, the Mayo Clinic, and consortia connected to Horizon 2020 projects.

Implementations and software

Open-source and commercial stacks have emerged from academic labs and corporate R&D. Prominent projects include libraries associated with Microsoft Research and the OpenMined community, frameworks developed at ETH Zurich and the University of California, Berkeley, and enterprise products from Zama-affiliated teams and R3 in financial consortia. Toolkits implement primitives such as garbled circuits, secret sharing, and oblivious transfer; language extensions and compilers have been produced by groups at Carnegie Mellon University and École Polytechnique. Benchmarking testbeds reference platforms from NIST, with deployment scenarios evaluated in pilot programs by DARPA and European Commission grant recipients.
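The garbled-circuit primitive these toolkits implement can be illustrated at the scale of a single AND gate. This is a stripped-down teaching sketch, not any library's actual construction: real frameworks use AES-based garbling with point-and-permute and other optimizations, whereas here each wire value gets a random label, each truth-table row is "encrypted" by XOR with a hash of the two input labels, and the evaluator recognizes the correct row by an all-zero tag.

```python
# Toy Yao-style garbled AND gate. Illustrative only: production garbling
# schemes use fixed-key AES, point-and-permute, half-gates, etc.
import hashlib
import secrets

def H(key_a, key_b):
    """32-byte pad derived from the two input-wire labels."""
    return hashlib.sha256(key_a + key_b).digest()

def garble_and():
    # One random 16-byte label per wire value (0 and 1) for wires a, b, o(ut).
    lab = {w: (secrets.token_bytes(16), secrets.token_bytes(16)) for w in "abo"}
    table = []
    for va in (0, 1):
        for vb in (0, 1):
            out_label = lab["o"][va & vb]
            pad = H(lab["a"][va], lab["b"][vb])
            # Ciphertext = (output label || 16-byte zero tag) XOR pad.
            ct = bytes(x ^ y for x, y in zip(out_label + b"\x00" * 16, pad))
            table.append(ct)
    secrets.SystemRandom().shuffle(table)  # hide which row is which
    return lab, table

def evaluate(table, la, lb):
    """Holding one label per input wire, decrypt the unique row whose
    plaintext carries the all-zero tag; the output label's meaning stays hidden."""
    pad = H(la, lb)
    for ct in table:
        pt = bytes(x ^ y for x, y in zip(ct, pad))
        if pt[16:] == b"\x00" * 16:
            return pt[:16]
    raise ValueError("no row decrypted")

lab, table = garble_and()
assert evaluate(table, lab["a"][1], lab["b"][1]) == lab["o"][1]  # 1 AND 1
assert evaluate(table, lab["a"][0], lab["b"][1]) == lab["o"][0]  # 0 AND 1
```

In a full protocol the garbler sends the table plus its own input label, the evaluator obtains its input label via oblivious transfer, and gates are chained by feeding output labels into later tables.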

Security, privacy, and limitations

Security proofs rely on models formalized by researchers at MIT, the Weizmann Institute, and Stanford University; adversarial models include active adversaries studied at ETH Zurich and passive adversaries addressed in early work at IBM Research. Limitations stem from communication complexity identified in studies at the University of Cambridge and from computation overhead noted by teams at Microsoft Research and Google. Side channels examined by scholars at the University of California, Berkeley and Princeton University challenge practical confidentiality, while legal and regulatory constraints from bodies such as the European Commission and the U.S. Department of Health and Human Services affect cross-border deployments. Quantum threats motivate post-quantum MPC work coordinated through NIST and university cryptography groups.

Future directions and research challenges

Research agendas involve scaling protocols to thousands of parties, interoperability with blockchain systems studied by ConsenSys and Hyperledger, and integration with differential privacy frameworks advanced at Apple and Google. Open challenges include reducing latency, as investigated at ETH Zurich and Stanford University; formal verification of implementations, pursued by teams at Carnegie Mellon University; and economic incentives, analyzed by researchers at the University of Chicago and the London School of Economics. Continued collaboration among academia, industry, and policymakers at the European Commission, DARPA, and NIST is expected to shape adoption and standardization.

Category:Cryptography