
Coding and decoding with convolutional codes. The Viterbi Algorithm.
J.-M. Brossier, 2008.
Contents: Convolutional encoding (principles; 1st point of view: infinite-length block code; 2nd point of view: convolutions; some examples) · Finite State Machine · Channel models · The Viterbi algorithm


  1. Block codes: main ideas
Linear block codes, e.g. Hamming codes. A binary linear block code takes k information bits at its input and computes n bits. If the 2^k codewords are sufficiently well spaced in the n-dimensional space, it is possible to detect or even correct errors. In 1950, Hamming introduced the (7,4) Hamming code: it encodes 4 data bits into 7 bits by adding three parity bits. It can detect and correct single-bit errors, but can only detect double-bit errors. The code's parity-check matrix is:

        | 1 0 1 0 1 0 1 |
    H = | 0 1 1 0 0 1 1 |
        | 0 0 0 1 1 1 1 |
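Single-error correction with this H can be sketched in a few lines: column j of H is the binary representation of j, so the syndrome H r^T of a received word r, read as a number, points at the flipped position (a minimal Python sketch; the helper name and example codeword are mine):

```python
# Parity-check matrix of the (7,4) Hamming code, as on the slide.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def correct_single_error(r):
    """Syndrome decoding: s = H r^T over GF(2); a nonzero syndrome, read as a
    binary number (s[0] is the least significant bit), is the 1-indexed
    position of a single flipped bit."""
    s = [sum(H[i][j] & r[j] for j in range(7)) % 2 for i in range(3)]
    pos = s[0] + 2 * s[1] + 4 * s[2]
    if pos:
        r = r[:]
        r[pos - 1] ^= 1          # flip the erroneous bit back
    return r

codeword = [0, 1, 1, 0, 0, 1, 1]      # one valid codeword: H c^T = 0
received = codeword[:]
received[4] ^= 1                       # channel flips bit 5
print(correct_single_error(received) == codeword)  # True
```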


  5. Convolutional encoding: main ideas
In convolutional codes, each block of k bits is mapped to a block of n bits, BUT these n bits are determined not only by the present k information bits but also by previous information bits. This dependence can be captured by a finite state machine. The mapping is realized by several linear filtering operations: each convolution imposes a constraint between bits, and using several convolutions introduces the redundancy.


  11. Infinite generator matrix
A convolutional code can be described by an "infinite matrix" with a banded block structure, each row of blocks shifted right by one block position (0 denotes the k × n all-zero block):

        | G_0  G_1  ...  G_M   0    0   ... |
    G = |  0   G_0  G_1  ...  G_M   0   ... |
        |  0    0   G_0  G_1  ...  G_M  ... |
        |                ...                |

The matrix is built from the K = M + 1 sub-matrices {G_i}, i = 0..M, each of size k × n. K is known as the constraint length of the code.

  12. Infinite generator matrix
Writing the input as the infinite row vector (I_0, I_1, ···) and the output as (C_0, C_1, ···), encoding with this infinite matrix looks like a block coding:

    (C_0, C_1, ···) = (I_0, I_1, ···) G,   i.e.   C = IG

  13. Infinite generator matrix
Denote by I_j = (I_{j1} ··· I_{jk}) the j-th block of k information bits, and by C_j = (C_{j1} ··· C_{jn}) a block of n coded bits at the output. Coding an infinite sequence of blocks (length k) I = (I_0 I_1 ···) produces an infinite sequence C = (C_0 C_1 ···) of coded blocks (length n):

    C_0 = I_0 G_0
    C_1 = I_0 G_1 + I_1 G_0
    ...
    C_M = I_0 G_M + I_1 G_{M-1} + ··· + I_M G_0
    ...
    C_j = I_{j-M} G_M + ··· + I_j G_0   for j ≥ M

In block form, the coding scheme looks like a block coding: C = IG.


  16. The infinite generator matrix performs a convolution
Using the convention I_i = 0 for i < 0, the encoding structure C = IG is clearly a convolution:

    C_j = Σ_{l=0..M} I_{j-l} G_l

For an information bit sequence I of finite length, only L < ∞ blocks of k bits are nonzero at the input of the coder: I = (I_0 ··· I_{L-1}). The sequence C = (C_0 ··· C_{L-1+M}) at the coder output is then finite too. This truncated coded sequence is generated by a linear block code whose generator matrix is a kL × n(L+M) sub-matrix of G.
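The truncated encoding can be sketched directly from this convolution formula (a hypothetical Python helper; each G_l is stored as a k × n list of 0/1 rows, here with the sub-matrices of the K = 3, rate-1/2 code that appears later in the deck):

```python
def conv_encode(blocks, G_subs):
    """C_j = sum over l of I_{j-l} G_l, computed over GF(2), with I_i = 0
    outside 0 <= i < L; L input blocks of k bits give L + M output blocks."""
    M = len(G_subs) - 1
    L = len(blocks)
    k, n = len(G_subs[0]), len(G_subs[0][0])
    out = []
    for j in range(L + M):
        c = [0] * n
        for l in range(M + 1):
            if 0 <= j - l < L:
                for a in range(k):
                    for b in range(n):
                        c[b] ^= blocks[j - l][a] & G_subs[l][a][b]
        out.append(c)
    return out

# k = 1, n = 2, M = 2: G_0 = [1 1], G_1 = [0 1], G_2 = [1 1]
G = [[[1, 1]], [[0, 1]], [[1, 1]]]
print(conv_encode([[1], [0], [1], [1]], G))   # L + M = 6 output blocks
```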


  18. Shift-register-based realization
Write g^(l)_{αβ} for the elements of the matrix G_l. Expanding the convolution C_j = Σ_{l=0..M} I_{j-l} G_l makes explicit the n components C_{j1}, ···, C_{jn} of each output block C_j:

    C_j = [C_{j1}, ···, C_{jn}] = [ Σ_{l=0..M} Σ_{α=1..k} I_{j-l,α} g^(l)_{α1}, ···, Σ_{l=0..M} Σ_{α=1..k} I_{j-l,α} g^(l)_{αn} ]

If the total length of the shift registers is L, there are 2^L different internal configurations: the behavior of the convolutional coder can be captured by a machine with 2^L states.

  19. Shift-register-based realization
Each output component

    C_{jβ} = Σ_{α=1..k} Σ_{l=0..M} I_{j-l,α} g^(l)_{αβ}

depends on the present input I_j and on the M previous input blocks I_{j-1}, ···, I_{j-M}. C_{jβ} can therefore be calculated by memorizing M input values in shift registers: one shift register α ∈ {1, ···, k} for each of the k input bits. For register α, only the memory cells for which g^(l)_{αβ} = 1 are connected to adder β ∈ {1, ···, n}.


  23. Rate of a convolutional code: asymptotic rate
For each k-bit block at the input, an n-bit block is generated at the output. At the coder output, the ratio [number of information bits] / [total number of bits] is

    R = k / n

This quantity is called the rate of the code.

  24. Rate of a convolutional code: finite-length rate
For a finite-length input sequence, the truncation reduces the rate. The exact finite-length rate is

    R' = R · L / (L + M)

For L ≫ M, this rate is almost equal to the asymptotic rate R.
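For instance, with the hypothetical value L = 100 information blocks and the K = 4, rate-1/2 code of the next slides (k = 1, n = 2, M = 3):

```python
k, n = 1, 2      # rate-1/2 code
M = 3            # encoder memory, K = M + 1 = 4
L = 100          # number of information blocks actually encoded

R = k / n                      # asymptotic rate
R_finite = R * L / (L + M)     # truncation appends M extra output blocks
print(R, round(R_finite, 4))   # 0.5 0.4854
```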

  25. Shift-register-based realization: rate-1/2 encoder
1 input (k = 1), 2 outputs (n = 2). [Figure: the input bit feeds a 3-stage shift register D D D; output C_{j1} is the mod-2 sum of the taps of 1 + D + D^2 + D^3, output C_{j2} of the taps of 1 + D + D^3.]

The impulse responses are P(D) = 1 + D + D^2 + D^3 and Q(D) = 1 + D + D^3. Two convolutions are evaluated in parallel. The output of each convolution depends on one input and on the 3 values memorized in the shift register. At each step, the 2 output values depend on the input and on the internal state.
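This encoder can be sketched as a shift-register update (a minimal Python sketch; the function name is mine):

```python
def encode_rate_half_K4(bits):
    """Rate-1/2 encoder with P(D) = 1 + D + D^2 + D^3 and Q(D) = 1 + D + D^3.
    The state holds the three previous input bits, most recent first."""
    s = [0, 0, 0]                       # I_{j-1}, I_{j-2}, I_{j-3}
    out = []
    for b in bits:
        c1 = b ^ s[0] ^ s[1] ^ s[2]     # taps of P(D): 1, D, D^2, D^3
        c2 = b ^ s[0] ^ s[2]            # taps of Q(D): 1, D, D^3
        out.append((c1, c2))
        s = [b] + s[:2]                 # shift the register
    return out

# The impulse response recovers the sub-matrices G_0 .. G_3.
print(encode_rate_half_K4([1, 0, 0, 0]))  # [(1, 1), (1, 1), (1, 0), (1, 1)]
```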


  29. Shift-register-based realization: rate-1/2 encoder (parameters)
For the same encoder, with impulse responses P(D) = 1 + D + D^2 + D^3 and Q(D) = 1 + D + D^3:
Rate 1/2 (k = 1, n = 2); constraint length K = M + 1 = 4; sub-matrices G_0 = [1 1], G_1 = [1 1], G_2 = [1 0], G_3 = [1 1].


  33. Shift-register-based realization: rate-2/3 encoder
2 inputs I_j = (I_{j1}, I_{j2}), 3 outputs (C_{j1}, C_{j2}, C_{j3}). [Figure: each input bit feeds its own shift register; three mod-2 adders combine register taps to form the outputs.]

Three convolutions are evaluated in parallel. The output of each convolution depends on the two inputs and on the values memorized in the shift registers. At each step, the 3 output values depend on the inputs and on the internal state.


  37. Shift-register-based realization: rate-2/3 encoder (parameters)
Rate 2/3 (k = 2, n = 3); constraint length K = 2; sub-matrices:

    G_0 = | 1 0 1 |    G_1 = | 0 0 1 |
          | 0 1 0 |          | 0 0 1 |
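A sketch of this rate-2/3 encoder using the sub-matrices directly (K = 2 means M = 1, so only the previous input block is memorized; the helper name is mine):

```python
# Sub-matrices of the rate-2/3, K = 2 code (k = 2 rows, n = 3 columns).
G0 = [[1, 0, 1],
      [0, 1, 0]]
G1 = [[0, 0, 1],
      [0, 0, 1]]

def encode_rate_two_thirds(blocks):
    """C_j = I_j G_0 + I_{j-1} G_1 over GF(2), with I_{-1} = 0."""
    prev = (0, 0)
    out = []
    for cur in blocks:
        c = [0, 0, 0]
        for a in range(2):
            for b in range(3):
                c[b] ^= (cur[a] & G0[a][b]) ^ (prev[a] & G1[a][b])
        out.append(tuple(c))
        prev = cur
    return out

print(encode_rate_two_thirds([(1, 0), (0, 1)]))  # [(1, 0, 1), (0, 1, 1)]
```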


  41. Shift-register-based realization: rate-1/2 encoder
[Figure: the input bit I_{j,1} feeds a 2-stage shift register holding I_{j-1,1} and I_{j-2,1}; two mod-2 adders form C_{j1} and C_{j2}.]

Two convolutions are evaluated in parallel. The output of each convolution depends on one input I_{j,1} and on the 2 values memorized in the shift register, I_{j-1,1} and I_{j-2,1}. At each step, the 2 output values depend on the input and on the internal state.


  45. Shift-register-based realization: rate-1/2 encoder (parameters)
Rate 1/2 (k = 1, n = 2); constraint length K = M + 1 = 3; sub-matrices G_0 = [1 1], G_1 = [0 1], G_2 = [1 1].
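With these sub-matrices, the encoder reduces to two taps on a 2-stage register (a sketch; the function name is mine):

```python
def encode_K3(bits):
    """K = 3, rate-1/2 encoder with G_0 = [1 1], G_1 = [0 1], G_2 = [1 1]:
    C_{j1} = I_j + I_{j-2},  C_{j2} = I_j + I_{j-1} + I_{j-2}  (mod 2)."""
    s1 = s2 = 0                  # I_{j-1,1} and I_{j-2,1}
    out = []
    for b in bits:
        out.append((b ^ s2, b ^ s1 ^ s2))
        s1, s2 = b, s1           # shift the register
    return out

# The impulse response recovers G_0, G_1, G_2.
print(encode_K3([1, 0, 0]))  # [(1, 1), (0, 1), (1, 1)]
```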

  49. Shift-register-based realization: rate 1/2 encoder.

This rate 1/2 (k = 1, n = 2) code is used in the sequel to explain the Viterbi algorithm.
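The whole shift-register realization can be sketched end to end (a minimal sketch, not the author's code; the taps follow the sub-matrices G_0 = (1 1), G_1 = (0 1), G_2 = (1 1) of slide 45, and two flush zeros close the lattice as slide 66 does):

```python
def conv_encode(bits, flush=True):
    """Rate 1/2, K = 3 convolutional encoder (octal generators 5, 7)."""
    state = (0, 0)  # shift register: (I_{j-1}, I_{j-2})
    out = []
    for b in list(bits) + ([0, 0] if flush else []):
        m1, m2 = state
        c1 = b ^ m2        # taps 1 0 1: I_j + I_{j-2}           (mod 2)
        c2 = b ^ m1 ^ m2   # taps 1 1 1: I_j + I_{j-1} + I_{j-2} (mod 2)
        out.append((c1, c2))
        state = (b, m1)    # the new input shifts into the register
    return out

# conv_encode([0, 0, 1]) -> [(0, 0), (0, 0), (1, 1), (0, 1), (1, 1)]
```

Encoding 001 reproduces the sequence [00, 00, 11, 01, 11] worked out on the later slides.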

  50. Finite State Machine (outline: Transition diagram, Lattice diagram, A convolutional encoder is an FSM, Coding a sequence using the coder FSM).

A Finite State Machine consists of: a state space; an input that activates the transition from one state to another; an output that is generated during the transition.

Usual representations: transition diagram, lattice diagram.

  55. Finite State Machine: transition diagram.

[Transition diagram of a simple FSM: 00 -(0,00)-> 00, 00 -(1,11)-> 10, 10 -(0,01)-> 01, 10 -(1,10)-> 11, 01 -(0,11)-> 00, 01 -(1,00)-> 10, 11 -(0,10)-> 01, 11 -(1,01)-> 11.]

The state space is composed of 4 elements: 00, 01, 11, 10. Each state is represented by a node. The input is binary valued, so 2 arrows start at each node. Arrows are indexed by a pair of values (input, output): the input that activates the transition and the output that is generated by this transition.
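The eight arrows of the transition diagram can be written down as a lookup table (a sketch; the dictionary layout and the (next_state, output) convention are mine, the transitions themselves are read off the diagram):

```python
# (state, input) -> (next_state, output), one entry per arrow of the diagram
TRANS = {
    ("00", 0): ("00", "00"), ("00", 1): ("10", "11"),
    ("10", 0): ("01", "01"), ("10", 1): ("11", "10"),
    ("01", 0): ("00", "11"), ("01", 1): ("10", "00"),
    ("11", 0): ("01", "10"), ("11", 1): ("11", "01"),
}
```

Four states with a binary input give exactly eight entries, matching the two arrows that leave each node.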

  59. Finite State Machine: lattice diagram.

[Diagram: one slice of the lattice, showing the four states 00, 10, 01, 11 at time j and again at time j+1, connected by the eight labeled transitions of the FSM.]

A new input triggers a transition from the present state to a state at the next time step. A lattice (trellis) diagram unwraps the behavior of the FSM as a function of time.

  62. FSM of a convolutional encoder.

[Diagrams: the rate 1/2 shift-register encoder of slide 42 side by side with the transition diagram of slide 55; they describe the same machine.]

  63. Coding a sequence using the coder FSM.

Coding the bits 001 with a known automaton. Initial coder state: 00. Informative sequence: 001.

Information-bearing bits enter the coder. The first value at the coder input is I_0 = 0. According to the transition diagram, this input activates the transition indexed by (0, 00): input 0 generates output C_0 = (00). The transition is from state 00 to state 00.

  64. Coding a sequence using the coder FSM.

The second value at the coder input is I_1 = 0. According to the transition diagram, this input activates the transition indexed by (0, 00): input 0 generates output C_1 = (00). The transition is from state 00 to state 00.

  65. Coding a sequence using the coder FSM.

The last informative bit to enter the coder is 1. According to the transition diagram, this input activates the transition indexed by (1, 11): input 1 generates output C_2 = (11). The transition is from state 00 to state 10.

  66. Coding a sequence using the coder FSM.

Lattice closure: 2 zeros are fed to the coder input to reset its state. The first 0 activates the transition indexed by (0, 01), generates output C_3 = (01) and sets the state to 01. The second 0 activates the transition indexed by (0, 11), generates output C_4 = (11) and resets the state to 00.

  67. Coding a sequence using the coder FSM.

Encoded sequence: in fine, the informative sequence 001 is encoded as [C_0, C_1, C_2, C_3, C_4] = [00, 00, 11, 01, 11].
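The walk of slides 63 to 66 can be replayed mechanically with a transition table (a sketch; the table entries are read off the transition diagram, the function name is mine):

```python
# (state, input) -> (next_state, output), read off the transition diagram
TRANS = {
    ("00", 0): ("00", "00"), ("00", 1): ("10", "11"),
    ("10", 0): ("01", "01"), ("10", 1): ("11", "10"),
    ("01", 0): ("00", "11"), ("01", 1): ("10", "00"),
    ("11", 0): ("01", "10"), ("11", 1): ("11", "01"),
}

def fsm_encode(bits):
    """Encode a bit list, appending two closure zeros to reset the state."""
    state, out = "00", []
    for b in list(bits) + [0, 0]:
        state, c = TRANS[(state, b)]
        out.append(c)
    return out

# fsm_encode([0, 0, 1]) -> ["00", "00", "11", "01", "11"]
```

The result is the codeword of this slide, and the final state is back to 00 as required by the lattice closure.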

  68. Coding a sequence using the coder FSM.

Noisy received coded sequence: the coded sequence [C_0, C_1, C_2, C_3, C_4] = [00, 00, 11, 01, 11] is transmitted over a Binary Symmetric Channel. Let us assume that two errors occur, so that the received sequence is [y_0, y_1, y_2, y_3, y_4] = [10, 01, 11, 01, 11].
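Such a transmission can be simulated by flipping each bit independently with probability p (a sketch; the seeded default RNG is only for reproducibility and is my choice, not part of the slides):

```python
import random

def bsc(bits, p, rng=None):
    """Transmit bits over a BSC: flip each bit independently with prob. p."""
    rng = rng or random.Random(0)  # seeded only for reproducibility
    return [b ^ (rng.random() < p) for b in bits]
```

With p = 0 the sequence passes unchanged; with occasional flips, as in the two-error example above, the decoder must recover the codeword from a corrupted observation.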

  69. Channel models (outline: Binary Symmetric Channel, Additive White Gaussian Noise Channel).

Binary Symmetric Channel.

[Diagram of the BSC: input 0 goes to output 0 with probability 1 - p and to output 1 with probability p; input 1 goes to output 1 with probability 1 - p and to output 0 with probability p.]

Characteristics of a Binary Symmetric Channel: memoryless (the output only depends on the present input, there is no internal state); two possible inputs (0, 1) and two possible outputs (0, 1); 0 and 1 are equally affected by errors (error probability p).

  73. Branch metric of a Binary Symmetric Channel.

Calculation of the branch metric. The transition probabilities are p(1|0) = p(0|1) = p and p(1|1) = p(0|0) = 1 - p. The Hamming distance between the received value y and the coder output C_rs (generated by the transition from state r to state s) is the number of bits that differ between the two vectors.
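The Hamming branch metric is just a bit count; applied to the example of slide 68 it recovers the two channel errors (a sketch; the variable names are mine):

```python
def d_hamming(y, c):
    """Number of positions where the two bit strings differ."""
    assert len(y) == len(c)
    return sum(a != b for a, b in zip(y, c))

# Received sequence of slide 68 against the transmitted codeword:
received = ["10", "01", "11", "01", "11"]
codeword = ["00", "00", "11", "01", "11"]
total = sum(d_hamming(y, c) for y, c in zip(received, codeword))  # -> 2
```

The total distance of 2 is exactly the number of errors the channel introduced.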

  75. Branch metric of a Binary Symmetric Channel.

Calculation of the branch metric. The likelihood is written:

p(y | C_rs) = (p / (1 - p))^{d_H(y, C_rs)} (1 - p)^n

log p(y | C_rs) = d_H(y, C_rs) log(p / (1 - p)) + n log(1 - p)

The term n log(1 - p) is a constant and, for p < 1/2, log(p / (1 - p)) < 0: maximizing the likelihood is therefore equivalent to minimizing d_H(y, C_rs). The Hamming branch metric d_H(y, C_rs) between the observation and the output of the FSM is the metric adapted to the BS channel.
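The equivalence can be checked numerically: for p < 1/2, the maximum-likelihood candidate and the minimum-Hamming-distance candidate coincide (a sketch; the value p = 0.1 and the candidate set are arbitrary illustrations):

```python
import math

def branch_loglik(y, c, p):
    """log p(y | c) for a BSC with error probability p."""
    d = sum(a != b for a, b in zip(y, c))
    return d * math.log(p / (1 - p)) + len(y) * math.log(1 - p)

y = "10"                          # a received pair of bits
cands = ["00", "01", "10", "11"]  # the four possible branch outputs
by_lik = max(cands, key=lambda c: branch_loglik(y, c, 0.1))
by_dist = min(cands, key=lambda c: sum(a != b for a, b in zip(y, c)))
# both criteria select the same candidate, "10"
```

Picking the branch by likelihood or by Hamming distance gives the same answer, which is why the Viterbi algorithm can work with distances alone.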

  78. Additive White Gaussian Noise Channel.

[Diagram of the AWGN channel: the output a_k + n_k is the input a_k plus a noise sample n_k.]

Characteristics of an AWGN channel: memoryless (the output only depends on the present input, there is no internal state); two possible inputs (-1, +1) and a real-valued (or even complex-valued) output; the output is a superposition of the input and a Gaussian noise; 0 and 1 are equally affected by errors.
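The AWGN channel can be sketched in the same style as the BSC (a sketch; the mapping 0 -> -1, 1 -> +1 is my convention, chosen to match the ±1 inputs above, and the seeded RNG is only for reproducibility):

```python
import random

def awgn(bits, sigma, rng=None):
    """Antipodal mapping (0 -> -1, 1 -> +1) plus white Gaussian noise."""
    rng = rng or random.Random(0)
    return [(2 * b - 1) + rng.gauss(0.0, sigma) for b in bits]
```

Unlike the BSC, the output is real-valued: the receiver sees soft values rather than hard bits.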

  82. Branch metric of an AWGN channel.

Calculation of the branch metric. The probability density function of a Gaussian noise is given by:

p(x) = (1 / (σ √(2π))) exp(−x² / (2σ²))

The squared Euclidean distance between the analog received value y and the coder output C_rs (generated by the transition from state r to state s) is the sum of the squared differences between the two vectors.
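The squared Euclidean branch metric is a one-liner (a sketch; the example values are arbitrary):

```python
def d2_euclid(y, c):
    """Squared Euclidean branch metric: sum of squared differences."""
    return sum((a - b) ** 2 for a, b in zip(y, c))

# e.g. received samples (0.9, -1.1) against the antipodal branch (+1, -1):
# d2_euclid((0.9, -1.1), (1, -1)) is approximately 0.02
```

It plays for the AWGN channel the role the Hamming distance plays for the BSC.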

  84. Branch metric of an AWGN channel.

Calculation of the branch metric. The likelihood is written:

p(y | C_rs) = (1 / (σ √(2π)))^n exp(−‖y − C_rs‖² / (2σ²)) ∝ exp(−d_E²(y, C_rs) / (2σ²))

Since log p(y | C_rs) ∝ −d_E²(y, C_rs) / (2σ²), maximizing the likelihood is equivalent to minimizing the Euclidean distance.

  86. Branch metric of an AWGN channel.

Binary Symmetric Channel as an approximation of the Gaussian channel. Note that the BS channel is a coarse approximation of the AWGN channel (with ±1 inputs and noise of standard deviation σ) with an error probability p given by:

p = (1 / (σ √(2π))) ∫_1^{+∞} exp(−x² / (2σ²)) dx
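This integral is the Gaussian tail probability Q(1/σ), which standard libraries expose through the complementary error function (a sketch; the identity Q(x) = erfc(x / √2) / 2 is standard):

```python
import math

def bsc_p_from_sigma(sigma):
    """Gaussian tail beyond 1: p = Q(1 / sigma), via the complementary erf."""
    return 0.5 * math.erfc(1.0 / (sigma * math.sqrt(2.0)))

# sigma = 1 gives p = Q(1), roughly 0.159; as the noise grows, p tends to 1/2
```

Hard-deciding each sample at 0 turns the AWGN channel into exactly this BSC, which is why the Hamming metric is only an approximation of the soft Euclidean metric.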
