| Quantum computation | |
|---|---|
| Name | Quantum computation |
| Field | Physics, Computer science |
| Invented by | Paul Benioff, Yuri Manin, Richard Feynman, David Deutsch |
| Year | 1980s |
| Institutions | IBM, Google, Microsoft, Intel, Rigetti Computing, D-Wave Systems, Xanadu, University of Oxford, Massachusetts Institute of Technology, California Institute of Technology, University of Cambridge, Stanford University, University of Waterloo, University of Tokyo, ETH Zurich |
Quantum computation is the study of computational systems that use quantum-mechanical phenomena to process information. It combines ideas from quantum mechanics, computer science, mathematics, and electrical engineering to propose models and devices that can outperform classical counterparts for specific problems. Research spans theoretical frameworks, algorithm design, device fabrication, error correction, and applications in chemistry, cryptography, and optimization.
Quantum computation arose in the 1980s when researchers at institutions such as Massachusetts Institute of Technology, California Institute of Technology, Stanford University, and University of Cambridge explored how principles from quantum mechanics could change computing. Key early contributors included Paul Benioff, Yuri Manin, Richard Feynman, and David Deutsch, who proposed foundational models and thought experiments. Development accelerated with breakthroughs such as Shor's algorithm, discovered at AT&T Bell Laboratories, and proofs of universality from groups at University of Oxford and University of Waterloo. Modern industrial players such as IBM, Google, Microsoft, Intel, D-Wave Systems, and startups including Rigetti Computing and Xanadu pursue hardware and software milestones alongside national laboratories like Lawrence Berkeley National Laboratory and Los Alamos National Laboratory.
Quantum computation uses quantum bits, or qubits, realized in physical systems studied in quantum mechanics and condensed matter physics. Theoretical foundations draw on work from John von Neumann in mathematics, Paul Dirac in quantum theory, and formal models by David Deutsch and Richard Feynman. Core concepts include quantum superposition and quantum entanglement, studied by researchers at University of Oxford and Princeton University, while complexity-theoretic classifications reference results from Peter Shor and Lov Grover with connections to Stephen Cook and Leonid Levin in computational complexity theory. Quantum circuit models, quantum Turing machines, and quantum walks invoke formalism developed across groups at Massachusetts Institute of Technology and University of Cambridge. Theoretical advances from Peter Shor, Lov Grover, Alexei Kitaev, and John Preskill inform complexity classes such as BQP, with continued contributions from researchers at Harvard University, Yale University, and University of California, Berkeley.
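The state-vector formalism behind superposition and entanglement can be illustrated directly in a few lines of linear algebra. The following is a minimal sketch assuming only NumPy (no quantum SDK): a Hadamard gate puts one qubit into an equal superposition, and a CNOT then entangles two qubits into a Bell state.

```python
import numpy as np

# Computational basis states for a single qubit.
zero = np.array([1.0, 0.0])

# Hadamard gate: maps |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
plus = H @ zero

# CNOT gate on two qubits (first qubit is the control).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Bell state: start in |00>, apply H to the first qubit, then CNOT.
state = CNOT @ np.kron(plus, zero)   # (|00> + |11>)/sqrt(2)

# Born rule: measurement probabilities are squared amplitude magnitudes.
probs = np.abs(state) ** 2
print(probs)  # only |00> and |11> carry weight, 0.5 each
```

Measuring either qubit of this state yields 0 or 1 with equal probability, but the two outcomes are perfectly correlated, which is the operational signature of entanglement.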
Quantum algorithms exploit quantum interference and entanglement. Landmark algorithms include Shor's algorithm for integer factorization and discrete logarithms, developed at AT&T Bell Laboratories and influential in raising concerns about RSA within the cryptographic community; Grover's algorithm, also from Bell Labs, provides a quadratic speedup for unstructured search. Algorithms for Hamiltonian simulation and chemistry originate in work at Harvard University, California Institute of Technology, and ETH Zurich and are used for electronic structure problems studied at Argonne National Laboratory. Quantum phase estimation and amplitude amplification techniques appear in curricula at Massachusetts Institute of Technology and Stanford University. More recent algorithmic frameworks, such as variational quantum eigensolvers and quantum approximate optimization algorithms, were proposed by teams at IBM, Harvard University, and Google for near-term devices. Complexity separations and lower bounds continue to be explored by researchers at Princeton University, University of Waterloo, and University of Cambridge.
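Grover's quadratic speedup can be demonstrated with a small classical state-vector simulation. The sketch below is illustrative (the function name `grover_search` is ours, not from any library): it alternates the sign-flipping oracle with the "reflection about the mean" diffusion operator for roughly (π/4)·√N rounds, after which nearly all amplitude concentrates on the marked item.

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Simulate Grover's algorithm searching N = 2**n_qubits items for `marked`."""
    N = 2 ** n_qubits
    # Start in the uniform superposition over all N basis states.
    state = np.full(N, 1 / np.sqrt(N))
    # Oracle: flip the sign of the marked basis state's amplitude.
    oracle = np.eye(N)
    oracle[marked, marked] = -1
    # Diffusion operator: reflect every amplitude about the mean amplitude.
    diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)
    # About (pi/4) * sqrt(N) iterations, versus O(N) classical queries.
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        state = diffusion @ (oracle @ state)
    return np.abs(state) ** 2   # probability of measuring each index

probs = grover_search(4, marked=11)
print(probs.argmax())  # the marked index dominates the distribution
```

With 16 items this uses only 3 oracle queries, while a classical search needs 8 on average; the gap grows as √N versus N.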
Qubits have been implemented using diverse physical systems studied in laboratories at IBM, Google, Microsoft, Intel, D-Wave Systems, Rigetti Computing, University of Oxford, University of California, Santa Barbara, and National Institute of Standards and Technology. Platforms include superconducting circuits developed at IBM and Google, trapped ions advanced at University of Maryland and University of Innsbruck, photonic qubits pursued by Xanadu and groups at University of Vienna, spin qubits in semiconductor devices from Intel and University of New South Wales, and topological qubits researched by teams at Microsoft Research. Other approaches involve neutral atoms studied at Harvard University and University of Chicago, nitrogen-vacancy centers investigated at University of California, Berkeley and Duke University, and quantum annealing hardware built by D-Wave Systems in collaboration with institutions such as Los Alamos National Laboratory. Cryogenic infrastructure and dilution refrigerators come from partnerships with companies like BlueFors and research centers including Lawrence Livermore National Laboratory.
Quantum error correction theory, with formative contributions from Peter Shor, Andrew Steane, Alexei Kitaev, and Daniel Gottesman, enables protection of quantum information in noisy devices studied at California Institute of Technology and Massachusetts Institute of Technology. Codes such as the surface code were developed in collaborations involving Microsoft Research and University of California, Santa Barbara, while stabilizer formalism traces to work at Princeton University and Harvard University. Fault-tolerant threshold theorems and concatenated code strategies were advanced by researchers at University of Cambridge and University of Oxford. Experimental demonstrations of error correction and logical qubits have been reported by teams at IBM, Google, University of Science and Technology of China, and Yale University. Ongoing roadmap efforts coordinate contributions from national initiatives like Quantum Economic Development Consortium and agencies such as National Science Foundation and European Commission.
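The core intuition behind error correction can be conveyed with the classical skeleton of the three-qubit bit-flip code: encode one logical bit redundantly and decode by majority vote, so any single flip is corrected. This is only a classical analogy sketch; the quantum bit-flip code measures stabilizer operators to locate errors without reading out the encoded qubits, but the resulting logical error rate scales the same way.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(bit):
    # Three-bit repetition code: logical 0 -> 000, logical 1 -> 111.
    return np.array([bit, bit, bit])

def apply_noise(codeword, p):
    # Flip each bit independently with probability p.
    flips = rng.random(3) < p
    return codeword ^ flips

def decode(codeword):
    # Majority vote: recovers the logical bit if at most one flip occurred.
    return int(codeword.sum() >= 2)

# Monte Carlo estimate of the logical error rate at physical rate p = 0.1.
trials, p = 10_000, 0.1
errors = sum(decode(apply_noise(encode(b), p)) != b
             for b in rng.integers(0, 2, trials))
print(errors / trials)  # well below p = 0.1 (theory: 3p^2 + O(p^3), about 0.028)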
Potential applications include cryptanalysis relevant to RSA and elliptic-curve cryptography, studied by security groups at National Institute of Standards and Technology and European Telecommunications Standards Institute; molecular simulation for pharmaceuticals researched at Pfizer and GlaxoSmithKline in partnership with IBM and Google; optimization for logistics problems explored by Volkswagen and Airbus; and machine learning pipelines investigated at Google and Microsoft Research. Economic and strategic implications drive national strategies at United States Department of Defense, European Commission, and National Institute of Standards and Technology, and research investment by corporations like IBM, Google, Intel, and Microsoft. Academic impact is reflected in curricula and centers at Massachusetts Institute of Technology, University of Cambridge, Stanford University, and University of Waterloo, while open-source ecosystems from IBM and Google foster community development.