LLMpedia
The first transparent, open encyclopedia generated by LLMs

Nyquist–Shannon sampling theorem

Generated by DeepSeek V3.2
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Parent: Bell Labs Hop 3
Expansion Funnel: Raw 54 → Dedup 31 → NER 7 → Enqueued 6
1. Extracted: 54
2. After dedup: 31
3. After NER: 7
Rejected: 24 (not NE: 24)
4. Enqueued: 6
Similarity rejected: 1
Nyquist–Shannon sampling theorem
Name: Nyquist–Shannon sampling theorem
Field: Signal processing, information theory
Conjectured by: Harry Nyquist
Proved by: Claude Shannon
Year proved: 1949

Nyquist–Shannon sampling theorem. The Nyquist–Shannon sampling theorem is a fundamental principle in the fields of signal processing and information theory. It establishes the conditions under which a continuous-time signal can be perfectly reconstructed from a sequence of its discrete samples. The theorem is critical for the conversion between analog and digital domains, underpinning modern technologies in telecommunications, audio engineering, and medical imaging.

Statement of the theorem

The theorem states that a bandlimited signal containing no frequency components higher than a certain bandwidth *B* hertz can be completely reconstructed from its samples, provided the samples are taken at a uniform rate exceeding 2*B* samples per second. This minimum sampling rate, known as the Nyquist rate, is twice the highest frequency present in the signal. When this condition is met, reconstruction is performed by passing the samples through an ideal low-pass filter, which in the time domain corresponds to interpolation with the sinc function. The precise formulation was rigorously presented by Claude Shannon in his seminal 1949 paper, building upon earlier work by Harry Nyquist at Bell Labs.
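The rate condition above can be sketched in a few lines of code. This is a minimal illustration, not from the article; the function names `nyquist_rate` and `is_sufficient` are invented for the example.

```python
def nyquist_rate(bandwidth_hz: float) -> float:
    """Minimum sampling rate (samples/s) for a signal bandlimited to `bandwidth_hz`."""
    return 2.0 * bandwidth_hz

def is_sufficient(sample_rate_hz: float, bandwidth_hz: float) -> bool:
    """True if the proposed rate strictly exceeds the Nyquist rate, as the theorem requires."""
    return sample_rate_hz > nyquist_rate(bandwidth_hz)

# Audio bandlimited to 20 kHz: the CD rate of 44.1 kHz exceeds the 40 kHz Nyquist rate.
print(nyquist_rate(20_000))           # 40000.0
print(is_sufficient(44_100, 20_000))  # True
print(is_sufficient(40_000, 20_000))  # False: exactly 2B is not strictly greater
```

Note that sampling at exactly 2*B* is excluded: a sinusoid at frequency *B* sampled at 2*B* can land entirely on its zero crossings and vanish from the samples.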

Aliasing

When a signal is sampled at a rate lower than the Nyquist rate, a distortion known as aliasing occurs. In this condition, high-frequency components of the original signal are misrepresented as lower frequencies in the reconstructed signal, leading to irreversible information loss. This phenomenon is visually demonstrated by the wagon-wheel effect in film, where spoked wheels appear to rotate backwards. To prevent aliasing, practical systems employ an anti-aliasing filter before the sampling stage, which is typically a low-pass filter designed to attenuate frequencies above the Nyquist limit. The study of aliasing is essential in applications like digital audio and the design of analog-to-digital converter circuits.
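Aliasing can be demonstrated numerically. In this hypothetical example (the frequencies are chosen for illustration), a 7 Hz cosine sampled at 10 Hz, below its 14 Hz Nyquist rate, produces exactly the same samples as a 3 Hz cosine, because 7 Hz folds down to |7 − 10| = 3 Hz:

```python
import numpy as np

fs = 10.0           # sampling rate in Hz (below the 14 Hz Nyquist rate of the 7 Hz tone)
n = np.arange(100)  # sample indices
t = n / fs          # sample times in seconds

samples_7hz = np.cos(2 * np.pi * 7.0 * t)
samples_3hz = np.cos(2 * np.pi * 3.0 * t)

# The two sampled sequences are numerically identical: once sampled, the
# 7 Hz tone is indistinguishable from its 3 Hz alias.
print(np.allclose(samples_7hz, samples_3hz))  # True
```

This is why the information loss is irreversible: no post-processing of the samples can tell the two frequencies apart, which is what motivates the analog anti-aliasing filter placed before the sampler.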

Derivation

The derivation often begins by considering a signal *x(t)* with a Fourier transform *X(f)* that is zero for all frequencies |*f*| > *B*. The sampling process is modeled as multiplication by a Dirac comb, a periodic train of Dirac delta functions spaced at the sampling interval *T*. This multiplication in the time domain corresponds to convolution with another Dirac comb in the frequency domain, resulting in periodic copies of the original spectrum centered at multiples of the sampling frequency. If the sampling frequency is greater than 2*B*, these spectral copies do not overlap. Perfect reconstruction is then achieved by multiplying by an ideal rectangular (brick-wall) filter in the frequency domain, which corresponds to convolution with a sinc function in the time domain, a process formalized by the Whittaker–Shannon interpolation formula.
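The Whittaker–Shannon interpolation formula, x(t) = Σₙ x(nT) · sinc((t − nT)/T), can be evaluated directly. The sketch below (a finite truncation of the infinite sum, so only approximate near the edges of the sampled span) reconstructs a 3 Hz sine from samples taken at 10 Hz:

```python
import numpy as np

def shannon_interpolate(samples, T, t):
    """Whittaker-Shannon interpolation: x(t) = sum_n samples[n] * sinc((t - n*T)/T).

    `samples[n]` holds x(n*T); np.sinc computes sin(pi*x)/(pi*x), matching the formula.
    """
    n = np.arange(len(samples))
    # Outer subtraction builds a (len(t), len(samples)) matrix of sinc weights.
    return np.sinc((t[:, None] - n * T) / T) @ samples

# A 3 Hz sine (B = 3) sampled at fs = 10 > 2B samples per second.
fs, B = 10.0, 3.0
T = 1.0 / fs
n = np.arange(1000)
x_n = np.sin(2 * np.pi * B * n * T)

# Reconstruct at points between the samples, well inside the sampled span
# so that truncating the infinite sum introduces only a small error.
t = np.linspace(10.0, 20.0, 7)
x_rec = shannon_interpolate(x_n, T, t)
x_true = np.sin(2 * np.pi * B * t)
print(np.max(np.abs(x_rec - x_true)))  # small truncation error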

Shannon's original proof

Claude Shannon provided a concise and elegant proof in his 1949 paper, *Communication in the Presence of Noise*, published in the Proceedings of the IRE; he had stated the result a year earlier in *A Mathematical Theory of Communication* in the Bell System Technical Journal. His proof leveraged the properties of the Fourier series and the observation that a bandlimited signal is completely determined by its samples taken at the Nyquist rate. Shannon framed the problem within the broader context of information theory, linking the sampling theorem to the channel capacity of a noisy channel. This work built directly on the foundational contributions of Harry Nyquist, who in 1928 determined the maximum signaling rate over a telegraph channel of given bandwidth, the origin of the concept now known as the Nyquist rate.
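The Fourier-series argument mentioned above can be sketched as follows (standard textbook notation, not Shannon's exact formulation). If *X(f)* is supported on [−*B*, *B*], it can be expanded in a Fourier series on that interval:

```latex
X(f) = \sum_{n=-\infty}^{\infty} c_n \, e^{-i \pi n f / B},
\qquad
c_n = \frac{1}{2B} \int_{-B}^{B} X(f)\, e^{i \pi n f / B}\, df
    = \frac{1}{2B}\, x\!\left(\frac{n}{2B}\right),
```

where the last equality follows because the integral is the inverse Fourier transform of *X* evaluated at *t* = *n*/2*B*. The spectrum, and hence the signal, is therefore completely determined by the samples *x*(*n*/2*B*) taken at the Nyquist rate.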

Applications

The theorem is the cornerstone of modern digital technology. In audio engineering, it dictates sampling rate standards such as 44.1 kHz for compact disc digital audio and 48 kHz for professional audio and video production. It is fundamental to the operation of software-defined radio and the design of digital signal processor chips. In medical diagnostics, it enables the accurate digital representation of signals from magnetic resonance imaging and computed tomography scanners. The principles are also applied in geophysics for seismic data acquisition and in astronomy for processing signals from radio telescopes like the Very Large Array.

Historical background

The origins of the sampling concept can be traced to the work of E. T. Whittaker and his son J. M. Whittaker on cardinal series interpolation. The practical necessity for the theorem arose from developments in pulse-code modulation, pioneered by Alec Reeves at International Telephone and Telegraph. Vladimir Kotelnikov independently presented a similar theory in the Soviet Union in 1933. However, it was the synthesis by Claude Shannon at Bell Labs, informed by Harry Nyquist's earlier research on telegraphy, that established the theorem's central role in information theory. This history was later clarified through the scholarship of individuals like Hans Dieter Lüke, highlighting the contributions of multiple researchers across different nations.

Category:Signal processing Category:Theorems in information theory Category:Telecommunication theory