variable $X = U(2) - U(1)$. Therefore, the Chernoff bound implies that

$$P[X \geq 0] \leq \zeta\, E[\exp(\lambda X)], \qquad 0 \leq \lambda < \lambda_1$$

where $\lambda_1$ is the upper limit of the interval over which the expected value is defined. Depending on which version of the Chernoff bound is valid, either $\zeta = 1$ or $\zeta = 1/2$. If $X$ is the sum of $l$ metric differences $m(x_{2i}) - m(x_{1i})$ that are independent, identically distributed random variables and we define

$$Z(\lambda) = E\{\exp[\lambda\,(m(x_{2i}) - m(x_{1i}))]\} \qquad (1\text{-}102)$$

then

$$P[X \geq 0] \leq \zeta \min_{0 \leq \lambda < \lambda_1} [Z(\lambda)]^{l}. \qquad (1\text{-}103)$$

This bound is often much simpler to compute than the exact probability $P[X \geq 0]$. As $l$ increases, the central-limit theorem implies that the distribution of $X = U(2) - U(1)$ approximates the Gaussian distribution. Thus, for large enough $l$, (1-95) is satisfied when $E[X] < 0$, and we can set $\zeta = 1/2$ in (1-103). For small $l$, (1-95) may be difficult to establish mathematically, but is often intuitively clear; if not, setting $\zeta = 1$ in (1-103) is always valid.
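As a concrete illustration of (1-103), suppose each metric difference is Gaussian with mean $-\mu < 0$ and variance $\sigma^2$; then $Z(\lambda) = \exp(-\lambda\mu + \lambda^2\sigma^2/2)$ is minimized at $\lambda = \mu/\sigma^2$, and the exact probability is available in closed form for comparison. The following sketch is only an illustration; the parameters mu, sigma, and l are assumed values, not quantities from the text.

import math

# Assumed illustration parameters (not from the text): each metric
# difference is Gaussian with mean -mu and variance sigma**2.
mu, sigma, l = 0.5, 1.0, 20

# Z(lambda) = exp(-lambda*mu + lambda**2 * sigma**2 / 2) is minimized at
# lambda = mu / sigma**2, giving min Z = exp(-mu**2 / (2 * sigma**2)).
# Here X is exactly Gaussian with E[X] < 0, so zeta = 1/2 is used.
zeta = 0.5
bound = zeta * math.exp(-l * mu ** 2 / (2 * sigma ** 2))

# Exact probability: X ~ N(-l*mu, l*sigma**2), so
# P[X >= 0] = Q(sqrt(l) * mu / sigma), with Q(x) = erfc(x / sqrt(2)) / 2.
exact = 0.5 * math.erfc(math.sqrt(l) * mu / (sigma * math.sqrt(2)))

print(f"Chernoff bound: {bound:.3e}, exact: {exact:.3e}")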
These results can be applied to hard-decision decoding, which can be regarded as a special case of soft-decision decoding with the following symbol metric. If symbol $i$ of a candidate binary sequence agrees with the corresponding detected symbol at the demodulator output, then $m(x_{ji}) = 1$; otherwise, $m(x_{ji}) = 0$. Therefore, $m(x_{2i}) - m(x_{1i})$ in (1-102) is equal to $+1$ with probability $p$ and $-1$ with probability $1 - p$, where $p$ is the probability of a channel-symbol error. Thus,

$$Z(\lambda) = p \exp(\lambda) + (1 - p) \exp(-\lambda)$$

for hard-decision decoding. Substituting this equation into (1-103), where the minimum occurs at $\lambda = \frac{1}{2} \ln[(1 - p)/p]$, we obtain

$$P[X \geq 0] \leq \zeta\, [4p(1 - p)]^{l/2}.$$
This upper bound is not always tight but has great generality since no specific
assumptions have been made about the modulation or coding.
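A short numerical check of this bound is possible, since with the 0/1 metric the exact probability is a binomial tail: $X \geq 0$ exactly when at least half of the $l$ differing symbols are detected erroneously. The following sketch is an illustration using the always-valid $\zeta = 1$ and an assumed $p = 0.1$, neither of which is prescribed by the text.

from math import comb

def exact_p2(l: int, p: float) -> float:
    # Exact P[X >= 0]: at least half of the l differing symbols are in
    # error (a tie, X = 0, is counted as an error).
    threshold = (l + 1) // 2
    return sum(comb(l, k) * p ** k * (1 - p) ** (l - k)
               for k in range(threshold, l + 1))

def chernoff_bound(l: int, p: float, zeta: float = 1.0) -> float:
    # Bound with the always-valid zeta = 1; zeta = 1/2 applies when
    # the condition (1-95) discussed above holds.
    return zeta * (4.0 * p * (1.0 - p)) ** (l / 2)

p = 0.1  # assumed channel-symbol error probability for illustration
for l in (4, 10, 20):
    print(l, exact_p2(l, p), chernoff_bound(l, p))

For $p = 0.1$ and $l = 10$, for example, the exact probability is about $1.6 \times 10^{-3}$ while the bound gives about $6.0 \times 10^{-3}$, consistent with the observation that the bound is general but not always tight.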
1.2 Convolutional Codes and Trellis Codes
In contrast to a block codeword, a convolutional codeword represents an entire message of indefinite length. A convolutional encoder converts an input of $k$ information bits into an output of $n$ code bits that are Boolean functions of both the current $k$ input bits and the preceding information bits. After $k$ bits are shifted into a shift register and $k$ bits are shifted out, $n$ code bits are read out. Each code bit is a Boolean function of the outputs of selected shift-register stages.
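A minimal sketch of such an encoder may clarify the mechanism. The rate-1/2, constraint-length-3 code with octal generators (7, 5) used below is a standard example chosen here for illustration; it is not a code specified in the text.

def conv_encode(bits):
    # Rate-1/2, constraint-length-3 encoder with octal generators (7, 5):
    # k = 1 information bit is shifted in and n = 2 code bits are read out
    # per step. Each code bit is a modulo-2 sum (a Boolean function) of the
    # current bit and selected stages of the shift register.
    g1, g2 = 0b111, 0b101   # generator taps
    state = 0               # the two preceding information bits
    out = []
    for b in bits:
        reg = (b << 2) | state                     # current bit + register
        out.append(bin(reg & g1).count("1") % 2)   # first code bit
        out.append(bin(reg & g2).count("1") % 2)   # second code bit
        state = reg >> 1                           # shift; oldest bit drops
    return out

print(conv_encode([1, 0, 1, 1]))  # -> [1, 1, 1, 0, 0, 0, 0, 1]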