| Discrete memoryless channel | |
|---|---|
| Name | Discrete memoryless channel |
| Field | Information theory |
| Introduced | 1948 |
| Introduced by | Claude Shannon |
| Applications | Telecommunication, Cryptography, Coding theory |
Discrete memoryless channel
A discrete memoryless channel is a probabilistic model in information theory describing a channel with a finite input alphabet and a finite output alphabet in which each channel use is statistically independent of all previous uses. It formalizes noisy transmission as a conditional probability matrix relating symbols at the sender and receiver, and it underpins Claude Shannon's noisy-channel coding theorem, which links the early signal-theory work of Harry Nyquist and Norbert Wiener to modern telecommunication systems.
A discrete memoryless channel (DMC) is defined by a finite input alphabet X and a finite output alphabet Y together with transition probabilities P(y|x) for x in X and y in Y; because the channel is memoryless, each output depends only on the current input, not on past inputs or outputs. The canonical representation is a stochastic matrix whose rows, indexed by input symbols, each sum to one. Foundational contributors include Claude Shannon, Robert Fano, Richard Hamming, and David A. Huffman, much of whose work was carried out at Bell Labs and MIT.
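The definition above can be sketched directly in code: a minimal simulation of a DMC, assuming a hypothetical 2-input, 2-output transition matrix and using NumPy for sampling.

```python
import numpy as np

# A DMC is specified by a row-stochastic matrix P, where P[x, y] = P(y | x).
# The numbers below are illustrative, not from any particular system.
P = np.array([
    [0.9, 0.1],   # P(y | x = 0)
    [0.2, 0.8],   # P(y | x = 1)
])

assert np.allclose(P.sum(axis=1), 1.0), "each row must be a probability distribution"

def transmit(inputs, P, rng):
    """Pass a sequence of input symbols through the DMC.

    Memorylessness: each output is drawn independently from P[x, :],
    with no dependence on earlier inputs or outputs.
    """
    return np.array([rng.choice(P.shape[1], p=P[x]) for x in inputs])

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=10)
y = transmit(x, P, rng)
print(x, y)
```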
Channel capacity for a DMC is the supremum of rates at which reliable communication is achievable; it equals the maximum of the mutual information I(X;Y) over all input distributions. This is Claude Shannon's 1948 noisy-channel coding theorem, proven using random coding and typicality arguments. Achievability and converse proofs draw on techniques developed by, among others, Robert Fano, Robert Gallager, and Imre Csiszár. Practical codes approaching capacity include Robert Gallager's low-density parity-check (LDPC) codes, Claude Berrou's turbo codes, and Erdal Arikan's polar codes; these information-theoretic limits shape standards from 3GPP, IEEE 802.11, and ITU-T.
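The maximization of I(X;Y) over input distributions can be computed numerically. A minimal sketch of the Blahut-Arimoto alternating optimization follows (the function name and iteration count are illustrative choices); it is checked against the binary symmetric channel, whose capacity 1 - H(p) is known in closed form.

```python
import numpy as np

def blahut_arimoto(W, iters=500):
    """Estimate the capacity (bits/use) of a DMC with W[x, y] = P(y | x)
    using the Blahut-Arimoto alternating optimization."""
    nx, _ = W.shape
    r = np.full(nx, 1.0 / nx)               # current guess for the input distribution
    for _ in range(iters):
        q = r[:, None] * W                  # unnormalized posterior
        q /= q.sum(axis=0, keepdims=True)   # q[x, y] = P(x | y)
        logq = np.where(W > 0, np.log(np.maximum(q, 1e-300)), 0.0)
        r = np.exp((W * logq).sum(axis=1))  # r(x) proportional to exp(sum_y W log q)
        r /= r.sum()
    p_y = r @ W                             # induced output distribution
    term = np.where(W > 0,
                    np.log2(np.maximum(W / np.maximum(p_y, 1e-300), 1e-300)),
                    0.0)
    return (r[:, None] * W * term).sum()    # I(X; Y) in bits

# Binary symmetric channel with crossover probability 0.1:
p = 0.1
W = np.array([[1 - p, p], [p, 1 - p]])
C = blahut_arimoto(W)
print(round(C, 4))  # matches the closed form 1 - H(0.1) ≈ 0.5310
```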
Canonical DMC instances include the binary symmetric channel (BSC), the binary erasure channel (BEC), and the Z-channel, archetypes analyzed by Robert Fano, Peter Elias, and Thomas M. Cover, among others. The BSC flips each bit independently with crossover probability p and has capacity 1 - H(p), where H is the binary entropy function. The BEC erases each symbol independently with probability e and has capacity 1 - e; its tractability made it a key testbed for analyzing iterative decoders on Tanner graphs, including Robert G. Gallager's LDPC codes. The Z-channel corrupts only one of the two input symbols and models asymmetric noise. Such discrete models underlie error-control coding in deep-space missions and in consumer storage and communication devices.
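The closed-form capacities just mentioned are easy to evaluate; a short sketch, with `binary_entropy`, `bsc_capacity`, and `bec_capacity` as hypothetical helper names:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), the binary entropy function (bits)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

def bec_capacity(e):
    """Capacity of the binary erasure channel with erasure probability e."""
    return 1.0 - e

print(bsc_capacity(0.11))   # ≈ 0.5 bits/use
print(bec_capacity(0.25))   # 0.75 bits/use
```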
Error-probability analysis for DMCs employs union bounds, typical-set decoding, maximum-likelihood (ML) decoding, and list decoding, with key contributions by Peter Elias, Robert Gallager, G. David Forney, and Imre Csiszár. ML decoding selects the codeword that maximizes the likelihood of the received sequence; successive-cancellation decoding for polar codes was introduced by Erdal Arikan and has since been refined for hardware implementation. Bounds on error exponents and the reliability function trace back to Shannon and Gallager and inform the design targets of standards bodies such as 3GPP and IEEE.
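Because a DMC is memoryless, the likelihood of a received word factors into per-symbol transition probabilities, so ML decoding reduces to maximizing a sum of per-symbol log-likelihoods. A minimal sketch for a 3-bit repetition code over a BSC (the codebook and parameter values are illustrative):

```python
import math

# Hypothetical 3-bit repetition code over a BSC with crossover probability p.
p = 0.1
CODEBOOK = {0: (0, 0, 0), 1: (1, 1, 1)}

def log_likelihood(codeword, received, p):
    """log P(received | codeword): log-probabilities add under memorylessness."""
    return sum(math.log(1 - p) if c == r else math.log(p)
               for c, r in zip(codeword, received))

def ml_decode(received, p):
    """Return the message whose codeword maximizes P(received | codeword)."""
    return max(CODEBOOK, key=lambda m: log_likelihood(CODEBOOK[m], received, p))

print(ml_decode((0, 1, 0), p))  # majority of zeros -> decodes to 0
print(ml_decode((1, 1, 0), p))  # majority of ones  -> decodes to 1
```

For a BSC with p < 1/2, this ML rule coincides with minimum-Hamming-distance decoding, which for a repetition code is a majority vote.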
Extensions of the DMC include channels with memory (finite-state channels), continuous-output channels such as the additive white Gaussian noise (AWGN) channel analyzed by Claude Shannon, and multiuser generalizations including the multiple-access channel (MAC), the broadcast channel, and the relay channel, developed in the tradition of Thomas M. Cover and his collaborators. Quantum generalizations lead to quantum channels, with capacity concepts due to A. S. Holevo and connections to Peter Shor's work on quantum error correction. Network information theory, with contributors such as Thomas M. Cover, Aaron D. Wyner, Abbas El Gamal, and Sergio Verdú, extends DMC foundations to complex systems such as cellular networks standardized by 3GPP and satellite links.
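For the continuous-output AWGN channel mentioned above, capacity has the closed form C = (1/2) log2(1 + SNR) bits per real channel use. A one-line sketch (the function name is an illustrative choice):

```python
import math

def awgn_capacity(snr):
    """Capacity in bits per real channel use of the AWGN channel,
    C = (1/2) * log2(1 + SNR), with SNR = signal power / noise power."""
    return 0.5 * math.log2(1.0 + snr)

print(awgn_capacity(1.0))   # 0.5 bits/use at 0 dB SNR
print(awgn_capacity(15.0))  # 2.0 bits/use
```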