LLMpedia: The first transparent, open encyclopedia generated by LLMs

Turbo codes

Generated by DeepSeek V3.2
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Parent: information theory (hop 4)
Expansion Funnel: Raw 59 → Dedup 0 → NER 0 → Enqueued 0
Turbo codes
Name: Turbo codes
Classification: Forward error correction
Inventors: Claude Berrou, Alain Glavieux, Punya Thitimajshima
Year: 1993
Related to: Low-density parity-check code, Convolutional code, Iterative decoding

Turbo codes are a class of high-performance forward error correction codes that revolutionized digital communications by approaching the theoretical limit of channel capacity established by Claude Shannon's noisy-channel coding theorem. First introduced in 1993 by Claude Berrou, Alain Glavieux, and Punya Thitimajshima at the IEEE International Conference on Communications, their novel structure and iterative decoding algorithm enabled unprecedented performance over noisy channels. This breakthrough directly influenced the development of modern standards such as 3GPP Long Term Evolution and deep-space communication protocols.

Introduction

The seminal paper presented at the 1993 IEEE International Conference on Communications in Geneva demonstrated that turbo codes could perform within a fraction of a decibel of the Shannon limit. This result, initially met with skepticism, was soon validated by researchers at institutions like École Nationale Supérieure des Télécommunications de Bretagne and Jet Propulsion Laboratory. The core innovation lay in the parallel concatenation of two or more simple convolutional encoders separated by an interleaver, coupled with an iterative feedback decoding process reminiscent of a turbocharger, hence the name. Their emergence coincided with and accelerated advances in mobile telephony, particularly within the European Telecommunications Standards Institute framework for UMTS.
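The Shannon limit mentioned above can be made concrete. For a code of rate R (information bits per real channel use) on the unconstrained-input AWGN channel, reliable communication requires Eb/N0 of at least (2^(2R) − 1)/(2R). A minimal sketch of this calculation:

```python
import math

def shannon_limit_ebn0_db(rate):
    """Minimum Eb/N0 in dB for reliable transmission at code rate `rate`
    (bits per real dimension) over the unconstrained-input AWGN channel."""
    ebn0 = (2 ** (2 * rate) - 1) / (2 * rate)
    return 10 * math.log10(ebn0)
```

For rate 1/2 this gives 0 dB, and as the rate approaches zero the limit approaches the ultimate −1.59 dB (10·log10(ln 2)); the original turbo code result of near-error-free decoding within about a decibel of 0 dB at rate 1/2 is measured against exactly this benchmark. Note that constraining the input to BPSK raises the rate-1/2 limit slightly above 0 dB.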

Encoding process

A classic turbo encoder employs two identical recursive systematic convolutional encoders. The first encoder processes the original input bit sequence directly, while the second encoder processes a pseudo-randomly permuted version of the input sequence created by a large block interleaver. The output typically consists of the systematic bits along with the parity bits generated by both constituent encoders, resulting in a code rate that can be adjusted through puncturing. This parallel concatenation, a key departure from traditional serial concatenated convolutional codes, creates a composite code whose pseudo-random structure yields a thin distance spectrum (few low-weight codewords), the property underlying its strong performance at low signal-to-noise ratios.
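The structure described above can be sketched as follows. This is a toy illustration, assuming the common memory-2 constituent code with octal generators (1, 5/7); the tiny interleaver here is arbitrary, whereas real systems use large, carefully designed permutations:

```python
def rsc_parity(bits):
    """Rate-1/2 recursive systematic convolutional encoder with
    feedback polynomial 1 + D + D^2 (octal 7) and feedforward
    polynomial 1 + D^2 (octal 5). Returns only the parity stream;
    the systematic stream is the input itself."""
    m1 = m2 = 0
    parity = []
    for u in bits:
        fb = u ^ m1 ^ m2          # recursive feedback bit
        parity.append(fb ^ m2)    # feedforward tap output
        m1, m2 = fb, m1           # shift the register
    return parity

def turbo_encode(bits, perm):
    """Parallel concatenation: systematic bits plus two parity streams,
    the second computed on an interleaved copy of the input (rate 1/3)."""
    parity1 = rsc_parity(bits)
    parity2 = rsc_parity([bits[j] for j in perm])
    return bits, parity1, parity2

def puncture(parity1, parity2):
    """Transmit parity bits alternately to raise the rate from 1/3 to 1/2."""
    return [p1 if k % 2 == 0 else p2
            for k, (p1, p2) in enumerate(zip(parity1, parity2))]
```

Transmitting the systematic stream with both full parity streams gives rate 1/3; the puncturing pattern shown (alternating parity bits) is the classic way to reach rate 1/2 without changing the encoders.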

Decoding algorithm

Decoding is performed using an iterative, probabilistic algorithm where two soft-input soft-output decoders, usually based on the BCJR algorithm or the Soft Output Viterbi Algorithm, exchange extrinsic information. Each decoder, corresponding to one constituent encoder, processes the received systematic and parity bits along with *a priori* information, generating refined *a posteriori* probabilities. The extrinsic information from one decoder, after passing through the interleaver or deinterleaver, serves as the *a priori* input for the other decoder in the next iteration. This feedback loop, managed by a scheduler, allows the decoders to converge on a reliable estimate of the transmitted bits, dramatically reducing the bit error rate with each pass.
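The extrinsic-information exchange described above can be condensed into a small sketch. This is a probability-domain BCJR for the (7,5) constituent code with an unterminated trellis and a toy interleaver, not a production decoder: real implementations work in the log domain (Log-MAP or Max-Log-MAP), terminate the trellis, and use much larger blocks. The LLR convention here is L = log P(b=0)/P(b=1):

```python
import math

# Branch list for the (7,5) RSC trellis: state s = 2*m1 + m2,
# entries are (from_state, input_bit, parity_bit, to_state).
BRANCHES = []
for s in range(4):
    m1, m2 = s >> 1, s & 1
    for u in (0, 1):
        fb = u ^ m1 ^ m2                  # feedback 1 + D + D^2
        BRANCHES.append((s, u, fb ^ m2, 2 * fb + m1))

def bcjr(l_sys, l_par, l_apr):
    """Soft-input soft-output MAP decoding of one constituent code.
    Returns a-posteriori LLRs for the information bits."""
    n = len(l_sys)
    def gamma(k, u, p):                   # branch metric, bipolar map x = 1 - 2b
        return math.exp(0.5 * ((1 - 2 * u) * (l_sys[k] + l_apr[k])
                               + (1 - 2 * p) * l_par[k]))
    alpha = [[1.0, 0.0, 0.0, 0.0]]        # forward recursion, start in state 0
    for k in range(n):
        nxt = [0.0] * 4
        for s, u, p, t in BRANCHES:
            nxt[t] += alpha[k][s] * gamma(k, u, p)
        z = sum(nxt)
        alpha.append([a / z for a in nxt])
    beta = [[0.25] * 4 for _ in range(n + 1)]   # backward recursion, no termination
    for k in range(n - 1, -1, -1):
        cur = [0.0] * 4
        for s, u, p, t in BRANCHES:
            cur[s] += beta[k + 1][t] * gamma(k, u, p)
        z = sum(cur)
        beta[k] = [b / z for b in cur]
    llr = []
    for k in range(n):                    # combine into a-posteriori LLRs
        num = den = 1e-300
        for s, u, p, t in BRANCHES:
            m = alpha[k][s] * gamma(k, u, p) * beta[k + 1][t]
            if u == 0:
                num += m
            else:
                den += m
        llr.append(math.log(num / den))
    return llr

def turbo_decode(l_sys, l_par1, l_par2, perm, iters=5):
    """Iterative exchange of extrinsic LLRs between two BCJR decoders."""
    n = len(l_sys)
    inv = [0] * n
    for k, j in enumerate(perm):
        inv[j] = k
    l_sys_i = [l_sys[j] for j in perm]    # interleaved systematic observations
    le1 = [0.0] * n
    le2 = [0.0] * n                       # extrinsic from decoder 2, deinterleaved
    for _ in range(iters):
        app1 = bcjr(l_sys, l_par1, le2)
        le1 = [app1[k] - l_sys[k] - le2[k] for k in range(n)]   # strip inputs
        le1_i = [le1[j] for j in perm]                          # interleave
        app2 = bcjr(l_sys_i, l_par2, le1_i)
        le2_i = [app2[k] - l_sys_i[k] - le1_i[k] for k in range(n)]
        le2 = [le2_i[inv[j]] for j in range(n)]                 # deinterleave
    app = [l_sys[k] + le1[k] + le2[k] for k in range(n)]
    return [0 if L > 0 else 1 for L in app]
```

The key bookkeeping is the subtraction step: each decoder passes on only what it learned beyond its inputs (the extrinsic information), which prevents a decoder's own beliefs from being fed back to it as independent evidence.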

Performance and applications

Turbo codes provided a monumental leap in power efficiency for satellite and wireless systems, enabling reliable data transmission at very low signal-to-noise ratios. They were adopted as a mandatory channel coding scheme for the data channels in the 3GPP UMTS standard and the CDMA2000 standard, and were also incorporated into protocols for deep-space communications by the Consultative Committee for Space Data Systems. Their performance in additive white Gaussian noise (AWGN) channels and fading channels made them instrumental in the evolution of mobile broadband access. Comparative studies with contemporary codes like Reed–Solomon codes and later LDPC codes were extensively published in journals like IEEE Transactions on Information Theory.

Variants and extensions

Numerous variants have been developed to address specific constraints or improve performance. Turbo trellis-coded modulation integrates coding with modulation schemes like quadrature amplitude modulation for bandwidth efficiency. Serial concatenated convolutional codes offer an alternative topology, while duobinary turbo codes improve performance at high code rates and were selected for Digital Video Broadcasting standards such as DVB-RCS and DVB-RCT. The principles of iterative decoding directly inspired the rediscovery and optimization of LDPC codes by researchers like David J. C. MacKay. Further research explored applications in concatenated error correction code structures, distributed source coding, and iterative detection and decoding for MIMO systems.

Category:Error detection and correction Category:Coding theory