| Hartley law | |
|---|---|
| Name | Hartley law |
| Field | Signal processing |
| Introduced | 1928 |
| Inventor | Ralph V. Hartley |
| Related | Nyquist–Shannon sampling theorem, Fourier transform, information theory |
Hartley law
Hartley law is a foundational relation in signal processing and information theory, proposed by Ralph V. Hartley in his 1928 paper "Transmission of Information", that links signal bandwidth, transmission time, and the amount of information a channel can carry. It preceded and influenced later results such as the Nyquist–Shannon sampling theorem and Claude Shannon's channel-capacity theorem, growing out of work at Bell Laboratories within the Bell System. The law has been invoked in engineering practice from Bell System telephony to modern telecommunications standards developed by bodies such as the International Telecommunication Union and the IEEE.
Hartley law asserts that the amount of information that can be transmitted over a channel is proportional to the product of its bandwidth and the transmission time, given a fixed number of distinguishable signal levels. Hartley framed the result in terms of measurable quantities in telegraphy and telephony, building on Harry Nyquist's contemporaneous analysis of telegraph signalling speed, and expressed the information measure logarithmically in the number of distinguishable amplitude levels; Claude Shannon later generalized the statement to noisy channels. Hartley's analysis appeared in the Bell System Technical Journal and informed transmission engineering at AT&T, providing groundwork for later theoretical treatments of channel capacity.
The law originates in research by Ralph V. Hartley at Western Electric and, after its research arm became Bell Telephone Laboratories in 1925, at Bell Labs, alongside contemporaneous work by Harry Nyquist. Hartley's 1928 paper responded to the Bell System's industrial challenge of increasing channel efficiency for telegraph, telephone, and early radio systems such as those operated by the BBC and the Marconi Company. Its intellectual lineage draws on the mathematical tools of Joseph Fourier and conceptual antecedents in the transmission-line work of Oliver Heaviside. After the Second World War, the law became a touchstone for information-theory research in both industrial laboratories and academic programs.
Hartley expressed the transmittable information H as proportional to bandwidth B and time T. A signal of bandwidth B observed for time T supplies about 2B·T independent samples (degrees of freedom); if each sample can take one of M distinguishable amplitude levels, then H = 2B·T·log M, where the base of the logarithm fixes the unit: base 2 gives bits, while base 10 gives hartleys, the unit later named for him. Formal derivations invoke the Fourier transform and modal-counting arguments for linear time-invariant channels, paralleling Lord Rayleigh's mode counting and, in spirit, the later stochastic-process treatments of Norbert Wiener and Andrey Kolmogorov. A simple change of logarithm base relates Hartley's measure to Shannon's entropy as presented in standard information-theory texts such as Cover and Thomas.
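The count above is a one-line computation; the following is a minimal sketch in Python (the function name and the 3.1 kHz voice-channel example are illustrative, not from the source):

```python
import math

def hartley_information(bandwidth_hz: float, duration_s: float,
                        levels: int, base: float = 2.0) -> float:
    """Hartley's measure of transmittable information.

    A signal of bandwidth B observed for time T supplies about 2*B*T
    independent samples; with M distinguishable amplitude levels per
    sample, the information is H = 2*B*T*log(M).  base=2 gives bits,
    base=10 gives hartleys.
    """
    if levels < 2:
        raise ValueError("need at least 2 distinguishable levels")
    return 2 * bandwidth_hz * duration_s * math.log(levels, base)

# Example: a 3.1 kHz voice channel used for 1 second with 4 levels
# carries 2 * 3100 * 1 * log2(4) = 12400 bits.
print(hartley_information(3100.0, 1.0, 4))  # 12400.0
```

Doubling the bandwidth, the duration, or the logarithm of the level count each scales the total linearly, which is the proportionality C ∝ B·T stated above.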
Engineers applied Hartley law to optimize channel utilization in systems such as frequency-division multiplexing on AT&T Long Lines, the BBC's early broadcasting networks, and satellite links developed by NASA and firms such as Hughes Aircraft Company. In digital systems, Hartley's proportionality informs capacity planning for orthogonal frequency-division multiplexing schemes standardized by 3GPP and the IEEE 802.11 family, and it underlies practical design heuristics at companies such as Intel and Qualcomm. Typical uses include estimating the maximum symbol count of a telephone channel defined under International Telecommunication Union recommendations, planning spectral occupancy within Federal Communications Commission allocations, and estimating throughput in experimental systems at research centers such as MIT Lincoln Laboratory and Bell Labs.
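Such planning calculations often run the relation in reverse: given a target bit rate and an allotted bandwidth, how many distinguishable levels must each symbol carry? A hedged sketch, inverting R = 2·B·log₂(M) (the helper name and the 9600 bit/s figure are illustrative assumptions):

```python
import math

def levels_needed(bit_rate_bps: float, bandwidth_hz: float) -> int:
    """Distinguishable amplitude levels needed to hit a target bit rate.

    Inverting Hartley's R = 2*B*log2(M) gives M = 2**(R / (2*B));
    round up, since the level count must be an integer.
    """
    return math.ceil(2 ** (bit_rate_bps / (2 * bandwidth_hz)))

# Illustrative: to push 9600 bit/s through a 3.1 kHz telephone channel,
# each symbol must distinguish 2**(9600/6200) ≈ 2.93 levels, so 3.
print(levels_needed(9600, 3100))  # 3
```

The rounding step is why real modem designs jump between discrete constellation sizes rather than scaling smoothly with bandwidth.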
Critics note Hartley law's idealizations: it assumes a noiseless channel, treating noise only implicitly through the coarse discretization of amplitude levels, a limitation resolved by Claude Shannon's noisy-channel capacity theorem and by the stochastic analyses of Norbert Wiener and Andrey Kolmogorov. Practical deviations arise in channels with multipath propagation (studied in ITU-R reports), in nonstationary interference environments examined by researchers at ETH Zurich and the University of Cambridge, and at coding limits addressed by error-correcting codes developed at Bell Labs and MIT. In regulatory and deployment contexts, such as spectrum policy at the Federal Communications Commission and coexistence regimes addressed by the European Telecommunications Standards Institute, Hartley's rule is a useful heuristic but insufficient for designing systems where noise figure and coding gain, from Reed–Solomon or turbo codes, dominate performance trade-offs.
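The relationship between Hartley's noiseless count and Shannon's noisy-channel limit can be illustrated numerically: if noise is what limits the number of distinguishable levels, the standard identification M ≈ √(1 + SNR) makes the two formulas coincide. A sketch under that assumption (the bandwidth and SNR figures are illustrative):

```python
import math

def hartley_rate(bandwidth_hz: float, levels: float) -> float:
    """Hartley: R = 2*B*log2(M) for M distinguishable levels, no noise model."""
    return 2 * bandwidth_hz * math.log2(levels)

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley: C = B*log2(1 + S/N) for an AWGN channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# If noise sets the level count, M ≈ sqrt(1 + SNR), and Hartley's rate
# reduces to Shannon's formula.
B, snr = 3100.0, 1000.0              # a voice channel at roughly 30 dB SNR
m = math.sqrt(1 + snr)
print(hartley_rate(B, m))            # matches shannon_capacity(B, snr)
print(shannon_capacity(B, snr))
```

This is why the noisy-channel capacity formula is often called the Shannon–Hartley theorem: Shannon's result supplies the level count that Hartley's framework left as a free parameter.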