7 Turbo Codes

Berrou, Glavieux and Thitimajshima [1] introduced in 1993 a novel and apparently revolutionary error-control coding technique, which they called turbo coding. This technique consists essentially of a parallel concatenation of two binary convolutional codes, decoded by an iterative decoding algorithm. Turbo codes achieve an excellent bit error rate (BER) performance by making use of three main components. They are constructed using two systematic convolutional encoders that are IIR FSSMs, usually known as recursive systematic convolutional (RSC) encoders, which are concatenated in parallel. In this parallel concatenation, a random interleaver plays a very important role as the randomizing constituent of the coding technique. The scheme is decoded by means of an iterative decoder that brings the resulting BER performance close to the Shannon limit.

In the original structure of a turbo code, two recursive convolutional encoders are arranged in parallel concatenation, so that each input element is encoded twice, but the input to the second encoder first passes through a random interleaver [2, 3]. This interleaving procedure is designed to make the two encoder output sequences statistically independent of each other. The systematic encoders are binary FSSMs of IIR type, as introduced in Chapter 6, and usually have code rate Rc = 1/2. As a result of the systematic form of the coding scheme, and the double encoding of each input bit, the overall code rate is Rc = 1/3. To improve the rate, another useful technique normally included in a turbo coding scheme is puncturing of the convolutional encoder outputs, also introduced in Chapter 6.

The decoding algorithm for the turbo coding scheme involves the two corresponding convolutional decoders iteratively exchanging soft-decision information, so that information can be passed from one decoder to the other.
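The encoder structure described above can be sketched in a few lines of Python. This is a minimal illustration, not the construction of any particular standard: the memory-2 RSC component with feedback polynomial 1 + D + D^2 and feedforward polynomial 1 + D^2 is an assumed example choice, and the alternating puncturing pattern is likewise one common option among many.

```python
import random

def rsc_parity(msg):
    # Parity stream of a memory-2 recursive systematic convolutional
    # (IIR) encoder; assumed example polynomials: feedback 1 + D + D^2,
    # feedforward 1 + D^2.  The systematic output is the message itself.
    s0 = s1 = 0
    out = []
    for b in msg:
        a = b ^ s0 ^ s1          # recursive feedback bit
        out.append(a ^ s1)       # feedforward combination
        s0, s1 = a, s0
    return out

def turbo_encode(msg, perm):
    # Rate-1/3 parallel concatenation: the systematic bits plus the
    # parity streams of the two RSC encoders, where encoder 2 sees the
    # message in interleaved order.
    p1 = rsc_parity(msg)
    p2 = rsc_parity([msg[i] for i in perm])
    return msg, p1, p2

def puncture(p1, p2):
    # Keep p1 at even positions and p2 at odd ones: two transmitted
    # bits per message bit overall, i.e. the rate improves to 1/2.
    return [p1[i] if i % 2 == 0 else p2[i] for i in range(len(p1))]
```

For instance, `perm = random.sample(range(N), N)` builds a random interleaver for a block of N message bits; `turbo_encode(msg, perm)` then yields the three rate-1/3 output streams, and `puncture` reduces them to rate 1/2.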
The decoders operate in a soft-input soft-output mode; that is, both the input applied to each decoder and the output it generates should be soft decisions or estimates [3]. Each decoder operates by utilizing what is called a priori information, which, together with the channel information provided by the samples of the received sequence and knowledge of the structure of the code, it uses to produce an estimate of the message bits. Each decoder is also able to produce an estimate called the extrinsic information, which is passed to the other decoder and used in the following iteration as that decoder's a priori information. Thus the first decoder generates extrinsic information that is taken by the second decoder as its a priori information.
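The iterative exchange of extrinsic information can be sketched as follows. This is a toy illustration under stated assumptions: the soft-input soft-output component decoder is written here as a brute-force MAP decoder that enumerates all 2^N messages of a short block (standing in for the efficient trellis-based algorithms used in practice), the RSC component is the same assumed memory-2 example as above, and the log-likelihood ratio (LLR) convention is L = ln P(b = 0)/P(b = 1).

```python
import itertools
import math

def rsc_parity(msg):
    # Assumed memory-2 RSC component: feedback 1 + D + D^2,
    # feedforward 1 + D^2 (same example encoder as in the text).
    s0 = s1 = 0
    out = []
    for b in msg:
        a = b ^ s0 ^ s1
        out.append(a ^ s1)
        s0, s1 = a, s0
    return out

def siso_decode(L_sys, L_par, L_apriori):
    # Brute-force MAP soft-input soft-output decoder over all 2^N
    # messages.  Inputs are channel LLRs for the systematic and parity
    # bits plus the a priori LLRs; the return value is the extrinsic
    # information only (posterior minus channel and a priori terms).
    N = len(L_sys)
    p0 = [0.0] * N   # probability mass for bit i = 0
    p1 = [0.0] * N   # probability mass for bit i = 1
    for msg in itertools.product((0, 1), repeat=N):
        par = rsc_parity(msg)
        m = sum(0.5 * (1 - 2 * b) * (L_sys[i] + L_apriori[i])
                for i, b in enumerate(msg))
        m += sum(0.5 * (1 - 2 * p) * L_par[j] for j, p in enumerate(par))
        w = math.exp(m)
        for i, b in enumerate(msg):
            if b:
                p1[i] += w
            else:
                p0[i] += w
    return [math.log(p0[i] / p1[i]) - L_sys[i] - L_apriori[i]
            for i in range(N)]

def turbo_decode(L_sys, L_p1, L_p2, perm, iters=4):
    # Iterative decoding: each decoder's extrinsic output becomes the
    # other decoder's a priori input on the next half-iteration.
    N = len(L_sys)
    Le2 = [0.0] * N                 # extrinsic from decoder 2 (natural order)
    for _ in range(iters):
        Le1 = siso_decode(L_sys, L_p1, Le2)
        sys_i = [L_sys[perm[k]] for k in range(N)]   # interleave
        Le1_i = [Le1[perm[k]] for k in range(N)]
        Le2_i = siso_decode(sys_i, L_p2, Le1_i)
        for k in range(N):                           # de-interleave
            Le2[perm[k]] = Le2_i[k]
    L_post = [L_sys[i] + Le1[i] + Le2[i] for i in range(N)]
    return [0 if L > 0 else 1 for L in L_post]
```

The key bookkeeping is that each decoder passes on only its extrinsic LLRs, not its full posterior, so that a decoder never feeds its own information back to itself; combining the channel LLRs with both extrinsic terms at the end gives the final hard decision on each message bit.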