| Quantum teleportation | |
|---|---|
| Name | Quantum teleportation |
| Field | Quantum information science |
| Introduced | 1993 |
| Key people | Charles H. Bennett, Gilles Brassard, Claude Crépeau, Richard Jozsa, Asher Peres, William K. Wootters |
| Institutions | IBM, University of Bristol, Yale University, University of Innsbruck, National Institute of Standards and Technology, Delft University of Technology |
| Notable awards | Nobel Prize in Physics, Dirac Prize, Wolf Prize |
Quantum teleportation is a protocol in quantum mechanics that transfers the quantum state of a discrete system from one location to another using pre-shared entanglement and classical communication. Proposed in 1993 by a team of researchers, it transmits neither matter nor energy: it reconstructs an exact replica of the original state at the destination while necessarily destroying the original, consistent with the no-cloning theorem. The protocol lies at the intersection of quantum information theory, quantum optics, and quantum computing, and it has driven both fundamental tests of quantum nonlocality and engineering efforts in quantum networks.
The 1993 protocol was published by Charles H. Bennett, Gilles Brassard, Claude Crépeau, Richard Jozsa, Asher Peres, and William K. Wootters, building on earlier work on quantum entanglement and Bell's theorem. It uses an entangled pair of particles, prepared by the sender or a third party and shared between sender and receiver, together with two bits of classical information transmitted over an ordinary channel. Landmark experimental milestones include demonstrations by groups at the University of Innsbruck, Yale University, and the University of Bristol that established teleportation of photonic, atomic, and solid-state qubits. The protocol became a cornerstone concept influencing programs at institutions such as the National Institute of Standards and Technology and Delft University of Technology, and at technology companies like IBM.
The theoretical foundation rests on entanglement as formalized by John S. Bell and on mathematical tools developed by Paul Dirac and John von Neumann. The protocol encodes an unknown state |ψ⟩ on a system held by the sender (commonly labeled Alice, following conventions popularized in the cryptography literature associated with Charles H. Bennett and Gilles Brassard) and consumes an entangled Bell pair pre-shared with the receiver (commonly labeled Bob). A joint measurement in the Bell basis projects the combined state, consuming the entanglement and yielding two classical bits. Those bits, once received, tell the receiver which of four local unitary operations, typically the Pauli operators introduced by Wolfgang Pauli, to apply to reconstruct |ψ⟩ exactly. The procedure exemplifies constraints such as the no-cloning theorem established by Wootters and Zurek, and it respects relativistic causality because the required classical communication is bounded by the speed of light, echoing concerns raised in debates involving Albert Einstein, Boris Podolsky, and Nathan Rosen.
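The steps above (pre-shared Bell pair, Bell-basis measurement, two classical bits, conditional Pauli correction) can be sketched as a plain statevector simulation. This is a minimal NumPy toy model, not any particular lab's or library's implementation; the register layout and helper names are illustrative.

```python
import numpy as np

# Single-qubit operators: identity, Pauli X and Z, Hadamard.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def kron(*mats):
    """Kronecker product of several operators."""
    out = np.eye(1, dtype=complex)
    for m in mats:
        out = np.kron(out, m)
    return out

def basis(m):
    """Computational basis vector |m> for a single qubit."""
    v = np.zeros(2, dtype=complex)
    v[m] = 1
    return v

# CNOT on a 3-qubit register with qubit 0 (most significant) as control
# and qubit 1 as target, built entry by entry.
CNOT01 = np.zeros((8, 8), dtype=complex)
for i in range(8):
    b0, b1, b2 = (i >> 2) & 1, (i >> 1) & 1, i & 1
    if b0:
        b1 ^= 1
    CNOT01[(b0 << 2) | (b1 << 1) | b2, i] = 1

def teleport_outcomes(psi):
    """Run the teleportation circuit and return Bob's corrected qubit for
    each of the four possible measurement results (m0, m1)."""
    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # (|00>+|11>)/sqrt(2)
    state = np.kron(psi, bell)        # qubit 0: |psi>; qubits 1-2: Bell pair
    state = CNOT01 @ state            # Alice: CNOT from her qubit onto her Bell half
    state = kron(H, I2, I2) @ state   # Alice: Hadamard on her qubit
    results = []
    for m0 in (0, 1):
        for m1 in (0, 1):
            # Project Alice's two qubits onto the outcome (m0, m1).
            proj = kron(np.outer(basis(m0), basis(m0)),
                        np.outer(basis(m1), basis(m1)), I2)
            branch = proj @ state
            branch /= np.linalg.norm(branch)
            # Bob's conditional state lives on qubit 2.
            bob = branch.reshape(4, 2)[(m0 << 1) | m1]
            # The two classical bits select Bob's Pauli correction Z^m0 X^m1.
            bob = np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1) @ bob
            results.append(bob)
    return results

# Every measurement branch recovers the input state at Bob's qubit.
psi = np.array([0.6, 0.8j], dtype=complex)
assert all(np.allclose(bob, psi) for bob in teleport_outcomes(psi))
```

Note that each of the four outcomes occurs with probability 1/4 regardless of the input state, which is why the two classical bits carry no information about |ψ⟩ on their own.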
Formal generalizations draw on resources and formalisms from Claude Shannon's information theory and Alexander Holevo's bound, linking teleportation fidelity to measures such as the entanglement of formation introduced in work by William K. Wootters. Extensions include continuous-variable teleportation framed using techniques from Roy J. Glauber's quantum optics, and entanglement swapping protocols introduced in later theoretical work connected to researchers at IBM and Los Alamos National Laboratory.
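For two-qubit states the entanglement of formation mentioned above admits a closed form due to Wootters, stated here as a sketch via the concurrence:

```latex
% Wootters' closed form for the entanglement of formation of a
% two-qubit density matrix rho, expressed via the concurrence C(rho).
\[
  C(\rho) = \max\{0,\ \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4\},
\]
where $\lambda_1 \ge \lambda_2 \ge \lambda_3 \ge \lambda_4$ are the square
roots of the eigenvalues of
$\rho\,(\sigma_y \otimes \sigma_y)\,\rho^{*}\,(\sigma_y \otimes \sigma_y)$,
and
\[
  E_F(\rho) = h\!\left(\frac{1 + \sqrt{1 - C(\rho)^2}}{2}\right),
  \qquad h(x) = -x\log_2 x - (1-x)\log_2(1-x).
\]
```

Here $h$ is the binary entropy; $E_F$ vanishes exactly when the concurrence does, i.e. for separable two-qubit states.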
Early optical teleportation demonstrations by teams at University of Innsbruck and Yale University used entangled photon pairs created through spontaneous parametric down-conversion, a method pioneered by groups linked to Herbert Walther and Anton Zeilinger. Later, matter-based teleportation used trapped ions at institutions like National Institute of Standards and Technology and circuits in superconducting platforms developed at Delft University of Technology and IBM's research labs. Notable experiments include long-distance free-space photonic teleportation performed by groups affiliated with University of Vienna and satellite-linked demonstrations building on collaborations with agencies such as European Space Agency and institutions like Tsinghua University. Quantum teleportation between disparate systems—photons to atoms, photons to solid-state spins—was realized in experiments at University of Bristol and University of Cambridge, verifying interface protocols necessary for heterogeneous quantum networks.
Entanglement distribution and quantum repeaters tested by teams at Los Alamos National Laboratory, Caltech, and University of Oxford advanced scalable implementations, while fidelity benchmarking used tomographic techniques related to work by Eugene Wigner and Leonard Mandel. These experimental streams informed roadmap efforts by national laboratories including National Institute of Standards and Technology and multinational collaborations tied to European Research Council funding.
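As a toy illustration of the fidelity benchmarking mentioned above (not any specific lab's pipeline), the following sketch compares a reconstructed density matrix against the known pure input state; the depolarizing noise model and the value p = 0.9 are illustrative assumptions. For qubit teleportation, exceeding the classical measure-and-resend bound of F = 2/3 is the usual success criterion.

```python
import numpy as np

def fidelity_pure_target(rho, psi):
    """Fidelity F = <psi| rho |psi> of a reconstructed state rho against
    a known pure target |psi> (the general Uhlmann fidelity reduces to
    this form when one of the two states is pure)."""
    return float(np.real(np.conj(psi) @ rho @ psi))

# Known input state |+> = (|0> + |1>)/sqrt(2).
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Toy tomography output: a depolarized copy of the target,
# rho = p |psi><psi| + (1 - p) I/2, with illustrative p = 0.9.
p = 0.9
rho = p * np.outer(psi, psi.conj()) + (1 - p) * np.eye(2) / 2

F = fidelity_pure_target(rho, psi)
# For this model F = p + (1 - p)/2 = 0.95, above the classical bound 2/3.
assert F > 2 / 3
```

Real benchmarking additionally averages this fidelity over input states and accounts for measurement and reconstruction errors in the tomography itself.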
Quantum teleportation underpins proposed architectures for quantum repeaters, long-distance quantum communication systems, and modular quantum computing where distributed processors exchange quantum states. It impacts cryptographic protocols originating in work by Gilles Brassard and collaborators, and it integrates with quantum error correction schemes influenced by Peter Shor and Andrew Steane. Teleportation is central to concepts for quantum internet initiatives advocated by institutions like University of Oxford and MIT and funded in part by bodies such as National Science Foundation and European Commission programs. Philosophically and foundationally, teleportation experiments continue to inform debates linked to Albert Einstein's critiques and later interpretations by scholars such as Niels Bohr and John Bell.
Practical deployment faces limits set by decoherence described in studies connected to Lev Landau and Max Born, finite entanglement generation rates constrained by experimental platforms at University of Innsbruck and Yale University, and losses in photonic channels addressed by optical engineering groups at Caltech and University of Cambridge. Scalability confronts error thresholds relevant to theories by Peter Shor and experimental capabilities at companies like IBM and research centers including Delft University of Technology. Security analyses must reconcile classical-channel vulnerabilities overseen by standards from institutions like National Institute of Standards and Technology and cryptographers influenced by Claude Shannon. Future progress depends on advances in entanglement distribution, quantum memory performance found in studies at NIST and University of Vienna, and integration across platforms championed by consortia involving European Space Agency, European Research Council, and national funding agencies.