Lecture 6: Channel Coding over Continuous Channels


  1. Lecture 6: Channel Coding over Continuous Channels
     Topics: Channel Coding over Continuous Memoryless Channels · Parallel Gaussian Channel · Bandlimited Gaussian Channel
     I-Hsiang Wang, Department of Electrical Engineering, National Taiwan University
     ihwang@ntu.edu.tw · November 10, 2015

  2. We have investigated the measures of information for continuous r.v.'s:
     - The amount of uncertainty (entropy) is mostly infinite.
     - Mutual information and KL divergence are well defined.
     - Differential entropy is a useful quantity for computing and manipulating measures of information for continuous r.v.'s.
     Question: How about coding theorems? Is there a general way or framework to extend coding theorems from discrete (memoryless) sources/channels to continuous (memoryless) sources/channels?
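The first two bullets can be checked numerically. Below is a minimal sketch (an assumed example with X ~ N(0, 1), not from the slides): the entropy of a Δ-quantized continuous r.v. grows without bound as Δ → 0, while H([X]_Δ) + log₂ Δ settles at the finite differential entropy h(X) = ½ log₂(2πe) ≈ 2.047 bits.

```python
# Entropy of the Delta-quantization of X ~ N(0, 1) diverges as Delta -> 0,
# but H([X]_Delta) + log2(Delta) -> h(X), the differential entropy.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1_000_000)   # samples of X ~ N(0, 1)
h_true = 0.5 * np.log2(2 * np.pi * np.e)   # h(X) in bits, ~2.047

for delta in [1.0, 0.5, 0.1, 0.01]:
    cells = np.round(x / delta)            # quantize into cells of width delta
    _, counts = np.unique(cells, return_counts=True)
    p = counts / counts.sum()
    H = -np.sum(p * np.log2(p))            # entropy of the quantized r.v.
    print(f"delta={delta:5.2f}  H={H:7.3f}  H+log2(delta)={H + np.log2(delta):6.3f}  h(X)={h_true:.3f}")
```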

  3. Discrete Memoryless Channel: w → Encoder → x^N → p_{Y|X} → y^N → Decoder → ŵ, with capacity

         C(B) = \max_{X \,:\, E[b(X)] \le B} I(X; Y).

     Continuous Memoryless Channel: w → Encoder → x^N → f_{Y|X} → y^N → Decoder → ŵ. Does the analogous formula hold?

         C(B) = \sup_{X \,:\, E[b(X)] \le B} I(X; Y) ?
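As a sanity check on the discrete formula, the maximization under an input cost constraint can be carried out directly for a small channel. The sketch below (a hypothetical binary-input channel and cost function; none of these numbers come from the slides) grid-searches the input distribution:

```python
# Toy evaluation of C(B) = max_{X: E[b(X)] <= B} I(X; Y) for a binary-input
# DMC, by grid search over input distributions satisfying the cost budget.
import numpy as np

W = np.array([[0.9, 0.1],      # p(y | x = 0)
              [0.2, 0.8]])     # p(y | x = 1)  (hypothetical channel law)
b = np.array([0.0, 1.0])       # hypothetical cost: using symbol 1 costs 1
B = 0.3                        # average cost budget

def mi(px):
    """Mutual information I(X; Y) in bits for input pmf px over channel W."""
    pxy = px[:, None] * W
    py = pxy.sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        t = pxy * np.log2(pxy / (px[:, None] * py[None, :]))
    return np.nansum(t)

best = max((mi(np.array([1 - p, p])) for p in np.linspace(0, 1, 10001)
            if p * b[1] <= B), default=0.0)
print(f"C({B}) ≈ {best:.4f} bits/use")
```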

  4. Coding Theorems: from Discrete to Continuous (1)
     Two main techniques for extending the achievability part of coding theorems from the discrete world to the continuous world:
     1 Discretization: Discretize the source and channel input/output to create a discrete system, and then make the discretization finer and finer to prove the achievability.
     2 New typicality: Extend weak typicality for continuous r.v.'s and repeat the arguments in a similar way. In particular, replace the entropy terms in the definitions of weakly typical sequences by differential entropy terms.
     Using discretization to derive the achievability of Gaussian channel capacity follows Gallager [2] and El Gamal & Kim [6]. Cover & Thomas [1] and Yeung [5] use weak typicality for continuous r.v.'s. Moser [4] uses a threshold decoder, similar to weak typicality.

  5. Coding Theorems: from Discrete to Continuous (2)
     In this lecture, we use discretization for the achievability proof.
     Pros: No need for new tools (e.g., typicality) for continuous r.v.'s. Extends naturally to multi-terminal settings – one can focus on discrete memoryless networks.
     Cons: Technical; not much insight on how to achieve capacity. Hence, we also use a geometric argument to provide insights on how to achieve capacity.
     Disclaimer: We will not be 100% rigorous in deriving the results in this lecture. Instead, you can find rigorous treatment in the references.

  6. Outline
     1 First, we formulate the channel coding problem over continuous memoryless channels (CMC), state the coding theorem, and sketch the converse and achievability proofs.
     2 Second, we introduce the additive Gaussian noise (AGN) channel, derive the Gaussian channel capacity, and provide insights based on geometric arguments.
     3 We then explore extensions: parallel Gaussian channels, correlated (colored-noise) Gaussian channels, and continuous-time bandlimited Gaussian channels.

  7. Outline
     1 Channel Coding over Continuous Memoryless Channels
       Continuous Memoryless Channel
       Gaussian Channel Capacity
     2 Parallel Gaussian Channel
       Parallel Channel with Independent Noises
       Parallel Channel with Colored Noises
     3 Bandlimited Gaussian Channel
       Bandlimited Channel with White Gaussian Noise
       Bandlimited Channel with Colored Gaussian Noise


  9. Continuous Memoryless Channel
     w → Encoder → x^N → f_{Y|X} → y^N → Decoder → ŵ
     1 Input/output alphabets: 𝒳 = 𝒴 = ℝ.
     2 Continuous Memoryless Channel (CMC):
       - Channel law: governed by the conditional density (p.d.f.) f_{Y|X}.
       - Memoryless: Y_k − X_k − (X^{k−1}, Y^{k−1}) forms a Markov chain for each k.
     3 Average input cost constraint B: \frac{1}{N} \sum_{k=1}^{N} b(x_k) \le B, where b : ℝ → [0, ∞) is the (single-letter) cost function.
     The definitions of error probability, achievable rate, and capacity are the same as those in channel coding over DMC. A toy instance of the cost constraint follows below.
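As a toy instance (assumed here for illustration; the slides keep b generic), take the usual power cost b(x) = x². A length-N codeword then satisfies the constraint iff its empirical average power is at most B:

```python
# Checking the average input cost constraint (1/N) * sum_k b(x_k) <= B for
# the hypothetical power cost b(x) = x^2; any b: R -> [0, inf) would do.
import numpy as np

def admissible(x, B, b=lambda v: v ** 2):
    """True iff the codeword x meets the average cost constraint."""
    return np.mean(b(np.asarray(x))) <= B

rng = np.random.default_rng(1)
B = 1.0
x = rng.normal(0.0, np.sqrt(B), size=1000)  # i.i.d. N(0, B) codeword symbols
print(admissible(x, B))  # E[b(X)] = B exactly, so this holds ~half the time
```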

  10. Channel Coding Theorem
      Theorem 1 (Continuous Memoryless Channel Capacity)
      The capacity of the CMC (ℝ, f_{Y|X}, ℝ) with input cost constraint B is

          C = \sup_{X \,:\, E[b(X)] \le B} I(X; Y).    (1)

      Note: The input distribution of the r.v. X need not have a density; in other words, it could also be discrete.
      How to compute h(Y|X) when X has no density? Recall

          h(Y|X) = E_X\!\left[ -\int_{\mathrm{supp}\,Y} f(y|X) \log f(y|X) \, dy \right],

      where f(y|x) is the conditional density of Y given X.
      Converse proof: Exactly the same as that in the DMC case.
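To see the displayed formula in action, here is a hedged numerical sketch under an assumed additive Gaussian noise channel f_{Y|X}(y|x) = N(y; x, σ²) with a discrete input X uniform on {−1, +1} (my example, not the slides'): the inner integral equals ½ log₂(2πeσ²) for every x, so h(Y|X) = ½ log₂(2πeσ²) even though X has no density.

```python
# h(Y|X) = E_X[ -∫ f(y|X) log2 f(y|X) dy ] for an assumed AGN channel
# Y = X + Z, Z ~ N(0, sigma^2), with discrete X uniform on {-1, +1}.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

sigma = 1.0
px = {-1.0: 0.5, +1.0: 0.5}  # discrete input distribution (no density)

def inner(x):
    # -∫ f(y|x) log2 f(y|x) dy, integrated over x ± 8 sigma (tails negligible)
    f = lambda y: norm.pdf(y, loc=x, scale=sigma)
    val, _ = quad(lambda y: -f(y) * np.log2(f(y)), x - 8 * sigma, x + 8 * sigma)
    return val

h_YgX = sum(p * inner(x) for x, p in px.items())
print(h_YgX, 0.5 * np.log2(2 * np.pi * np.e * sigma ** 2))  # both ~2.047 bits
```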

  11. Sketch of the Achievability (1): Discretization
      The proof of achievability makes use of discretization, so that one can apply the result for the DMC with input cost:
      w → ENC → x^N → f_{Y|X} → y^N → DEC → ŵ

  12. Sketch of the Achievability (1): Discretization
      The proof of achievability makes use of discretization, so that one can apply the result for the DMC with input cost:
      w → ENC → Q_in → f_{Y|X} → Q_out → DEC → ŵ
      - Q_in: (single-letter) discretization that maps X ∈ ℝ to X_d ∈ 𝒳_d.
      - Q_out: (single-letter) discretization that maps Y ∈ ℝ to Y_d ∈ 𝒴_d.
      Note that both 𝒳_d and 𝒴_d are discrete (countable) alphabets.

  13. Sketch of the Achievability (1): Discretization
      w → [New ENC: ENC → Q_in] → x_d^N → [Equivalent DMC: f_{Y|X} → Q_out] → y_d^N → DEC → ŵ
      - Q_in: (single-letter) discretization that maps X ∈ ℝ to X_d ∈ 𝒳_d.
      - Q_out: (single-letter) discretization that maps Y ∈ ℝ to Y_d ∈ 𝒴_d.
      Note that both 𝒳_d and 𝒴_d are discrete (countable) alphabets.
      Idea: With the two discretization blocks Q_in and Q_out, one can build an equivalent DMC (𝒳_d, p_{Y_d|X_d}, 𝒴_d), as shown above; a numerical sketch of this construction follows below.
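To make the idea concrete, here is a hedged sketch (assuming an AGN channel f_{Y|X}(y|x) = N(y; x, σ²); the slides keep f_{Y|X} generic) of the equivalent DMC: Q_in restricts the input to a finite grid, Q_out bins the output into cells, and each transition probability p_{Y_d|X_d}(j|i) is the conditional density integrated over output cell j.

```python
# Building the equivalent DMC (X_d, p_{Y_d|X_d}, Y_d) from Q_in and Q_out,
# assuming the channel law f_{Y|X}(y|x) = N(y; x, sigma^2) for illustration.
import numpy as np
from scipy.stats import norm

sigma = 1.0
x_grid = np.array([-1.5, -0.5, 0.5, 1.5])  # Q_in: finite input alphabet X_d
edges = np.linspace(-6.0, 6.0, 25)         # Q_out: inner cell boundaries
edges = np.concatenate(([-np.inf], edges, [np.inf]))  # two unbounded end cells

# p_{Y_d|X_d}[i, j] = P(Y in cell j | X = x_i) = F(e_{j+1} | x_i) - F(e_j | x_i)
cdf = norm.cdf(edges[None, :], loc=x_grid[:, None], scale=sigma)
P = np.diff(cdf, axis=1)
print(P.shape, P.sum(axis=1))  # (4, 26); every row sums to 1, a valid DMC
```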

  14. Sketch of the Achievability (2): Arguments
      w → [New ENC: ENC → Q_in] → x_d^N → p_{Y_d|X_d} → y_d^N → DEC → ŵ
      1 Random codebook generation: Generate the codebook randomly based on the original (continuous) r.v. X, satisfying E[b(X)] ≤ B.
      2 Choice of discretization: Choose Q_in such that the cost constraint is not violated after discretization. Specifically, E[b(X_d)] ≤ B.
      3 Achievability in the equivalent DMC: By the achievability part of the channel coding theorem for the DMC with input constraint, any rate R < I(X_d; Y_d) is achievable.
      4 Achievability in the original CMC: Prove that as the discretization in Q_in and Q_out gets finer and finer, I(X_d; Y_d) → I(X; Y); a numerical illustration of this convergence follows below.
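Step 4 can be illustrated numerically. The sketch below (an assumed AWGN instance Y = X + Z, Z ~ N(0, 1), with a Gaussian N(0, P) input and P = 1; the slides do not fix a channel) quantizes input and output ever more finely, maps each input cell to its centroid so that E[b(X_d)] ≤ B for b(x) = x² (cf. step 2, by Jensen's inequality), and watches I(X_d; Y_d) approach I(X; Y) = ½ log₂(1 + P) = 0.5 bit:

```python
# I(X_d; Y_d) -> I(X; Y) as the discretization refines, for an assumed AWGN
# channel Y = X + Z, Z ~ N(0, 1), with Gaussian N(0, P) input, P = 1.
import numpy as np
from scipy.stats import norm

def mi_discretized(n_cells, P=1.0, sigma=1.0, span=5.0):
    # Q_in: partition [-span, span] into n_cells cells; map each cell of the
    # N(0, P) input to its centroid, which keeps E[X_d^2] <= P (cf. step 2)
    xe = np.linspace(-span, span, n_cells + 1)
    px = np.diff(norm.cdf(xe, scale=np.sqrt(P)))
    xc = P * -np.diff(norm.pdf(xe, scale=np.sqrt(P))) / px   # cell centroids
    px /= px.sum()
    # Q_out: bin the output; this yields the equivalent DMC p_{Y_d | X_d}
    ye = np.concatenate(([-np.inf], xe, [np.inf]))
    pyx = np.diff(norm.cdf(ye[None, :], loc=xc[:, None], scale=sigma), axis=1)
    pxy = px[:, None] * pyx                                  # joint pmf
    py = pxy.sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        t = pxy * np.log2(pxy / (px[:, None] * py[None, :]))
    return np.nansum(t)                                      # bits per use

for n in [2, 4, 8, 32, 128]:
    print(f"{n:4d} cells: I(X_d;Y_d) ≈ {mi_discretized(n):.4f} bits")
print(f"target I(X;Y) = {0.5 * np.log2(2.0):.4f} bits")
```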

