Computing and Communications 2. Information Theory - Gaussian Channel


  1. Computing and Communications 2. Information Theory - Gaussian Channel. Ying Cui, Department of Electronic Engineering, Shanghai Jiao Tong University, China. 2017, Autumn 1

  2. Outline • Gaussian Channel • Parallel Gaussian Channels • Bandlimited Channels 2

  3. Reference • Elements of Information Theory, T. M. Cover and J. A. Thomas, Wiley 3

  4. GAUSSIAN CHANNEL 4

  5. Gaussian Channel
• A time-discrete channel: Y_i = X_i + Z_i
  – output Y_i at time i is the sum of the input X_i and the noise Z_i
  – noise Z_i ~ N(0, N), motivated by the central limit theorem: the cumulative effect of a large number of small random effects
  – noise Z_i is assumed to be independent of the signal X_i
  – models common communication channels: wired and wireless telephone channels, satellite links
• If the noise variance is zero or the input is unconstrained, the capacity of the channel is infinite
• Average power constraint for a codeword (x_1, x_2, …, x_n): (1/n) Σ_{i=1}^n x_i^2 ≤ P 5
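The channel model above can be exercised directly; a minimal simulation sketch (the block length, power budget P, and noise variance N below are illustrative choices, not values from the slides):

```python
import math
import random

random.seed(0)
n, P, N = 10_000, 10.0, 1.0   # block length, power budget, noise variance (illustrative)

# Gaussian codeword symbols, rescaled so the average power
# constraint (1/n) sum x_i^2 <= P holds exactly with equality
x = [random.gauss(0.0, math.sqrt(P)) for _ in range(n)]
scale = math.sqrt(P * n / sum(xi * xi for xi in x))
x = [xi * scale for xi in x]

# channel: Y_i = X_i + Z_i with noise Z_i ~ N(0, N), drawn independently of X
y = [xi + random.gauss(0.0, math.sqrt(N)) for xi in x]

avg_power = sum(xi * xi for xi in x) / n   # equals P by construction
```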

  6. Information Capacity 6

  7. Definitions 7

  8. Capacity of Gaussian Channel 8
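The capacity referred to here is the classic result C = (1/2) log2(1 + P/N) bits per channel use (Cover & Thomas); a quick numeric check with illustrative values:

```python
import math

def gaussian_capacity(P: float, N: float) -> float:
    """Capacity C = (1/2) log2(1 + P/N) of the power-constrained Gaussian channel, in bits per use."""
    return 0.5 * math.log2(1.0 + P / N)

# illustrative: power budget P = 15, noise variance N = 1, i.e. SNR = 15
print(gaussian_capacity(15.0, 1.0))  # -> 2.0, since 0.5 * log2(16) = 2
```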

  9. Sphere Packing Argument
• Why it is possible to construct (2^{nC}, n) codes with a low probability of error
• Why one cannot hope to send at rates greater than C with a low probability of error 9
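The counting behind the argument: decoding spheres of radius √(nN) packed into the received sphere of radius √(n(P+N)) give at most (1 + P/N)^{n/2} = 2^{nC} distinguishable codewords. A numeric sketch (the values of P, N, and n are illustrative):

```python
import math

P, N, n = 15.0, 1.0, 100            # illustrative power budget, noise variance, block length
C = 0.5 * math.log2(1.0 + P / N)    # capacity in bits per channel use

# ratio of the volume of an n-sphere of radius sqrt(n(P+N)) to one of
# radius sqrt(nN): (n(P+N) / (nN))^(n/2) = (1 + P/N)^(n/2)
num_spheres = (1.0 + P / N) ** (n / 2)
print(math.isclose(num_spheres, 2.0 ** (n * C)))  # -> True: the packing bound matches 2^{nC}
```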

  10. PARALLEL GAUSSIAN CHANNELS 10

  11. Parallel Gaussian Channels
• k independent Gaussian channels in parallel: Y_j = X_j + Z_j, j = 1, 2, …, k, with Z_j ~ N(0, N_j)
  – output of each channel is the sum of its input and Gaussian noise
  – noise is independent from channel to channel
  – each parallel component can represent a different frequency
  – a common power constraint: E Σ_{j=1}^k X_j^2 ≤ P 11

  12. Parallel Gaussian Channels
• Information capacity: C = max I(X_1, …, X_k; Y_1, …, Y_k), where the maximum is over f(x_1, …, x_k) with Σ_j E X_j^2 ≤ P 12

  13. Information Capacity (Water-Filling)
• The objective is to distribute the total power among the various channels so as to maximize the total capacity
• Problem formulation: maximize Σ_j (1/2) log(1 + P_j/N_j) subject to Σ_j P_j = P and P_j ≥ 0
• Optimal solution: P_j = (ν − N_j)^+
  – ν is chosen s.t. Σ_j (ν − N_j)^+ = P, and (x)^+ denotes the positive part of x 13
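The water level ν has no closed form in general, so a simple bisection suffices; a minimal sketch (the three noise levels and the power budget are illustrative):

```python
def water_filling(noise, P, tol=1e-12):
    """Allocate total power P over channels with noise levels N_j as P_j = (nu - N_j)^+,
    finding the water level nu by bisection so that sum_j P_j = P."""
    lo, hi = min(noise), max(noise) + P          # nu must lie in this interval
    while hi - lo > tol:
        nu = (lo + hi) / 2.0
        if sum(max(nu - N_j, 0.0) for N_j in noise) < P:
            lo = nu                              # too little power used: raise the water level
        else:
            hi = nu                              # too much: lower it
    nu = (lo + hi) / 2.0
    return [max(nu - N_j, 0.0) for N_j in noise]

# illustrative: noise levels 1, 2, 4 and total power 5 -> water level nu = 4
alloc = water_filling([1.0, 2.0, 4.0], 5.0)
print(alloc)  # ~[3.0, 2.0, 0.0]: the noisiest channel gets no power
```

The resulting total capacity is then Σ_j (1/2) log2(1 + P_j/N_j), summed over the channels that receive positive power.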

  14. BANDLIMITED CHANNELS 14

  15. Bandlimited Channel
• A continuous-time bandlimited channel with white noise: Y(t) = (X(t) + Z(t)) * h(t)
  – X(t) is the signal waveform
  – Z(t) is the white Gaussian noise waveform
  – h(t) is the impulse response of an ideal bandpass filter, bandlimited to [−W, W]
• Nyquist–Shannon sampling theorem
  – a bandlimited function has only 2W degrees of freedom per second 15

  16. Continuous/Discrete-Time Channel
• The Nyquist rate of a signal bandlimited to [−W, W] Hz is 2W samples per second (i.e., one sample every 1/(2W) sec)
  – use this result to understand the capacity of the continuous channel in terms of the corresponding discrete-time channel obtained by sampling at the Nyquist rate
• The power P of the continuous signal is measured in watts (energy/sec), translating into an average energy of P/(2W) per sample
• Noise power is typically specified by its power spectral density (PSD), say N_0/2 watts/Hz, translating into an average noise energy of N_0/2 per sample 16
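The two conversions above amount to a couple of divisions; a small sketch with illustrative numbers (the values of W, P, and N_0 are not from the slides):

```python
W = 3000.0    # bandwidth in Hz (illustrative)
P = 1e-3      # signal power in watts (illustrative)
N0 = 1e-9     # so the noise PSD is N0/2 watts/Hz (illustrative)

signal_energy_per_sample = P / (2 * W)   # P watts spread over 2W samples/sec
noise_energy_per_sample = N0 / 2         # PSD of N0/2 watts/Hz sampled at the Nyquist rate

# the per-sample SNR simplifies to P / (N0 * W)
snr_per_sample = signal_energy_per_sample / noise_energy_per_sample
```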

  17. Capacity
• With an average input signal energy of P/(2W) per sample and an average noise energy of N_0/2 per sample, the capacity of the continuous-time channel is C = 2W · (1/2) log(1 + (P/(2W))/(N_0/2)) = W log(1 + P/(N_0 W)) bits per second
• This is the capacity of the bandlimited Gaussian channel with noise spectral density N_0/2 watts/Hz and power P watts 17
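Multiplying the per-sample capacity by 2W samples per second gives the classic C = W log2(1 + P/(N_0 W)) bits per second; a numeric sketch (the telephone-like parameters are illustrative):

```python
import math

def bandlimited_capacity(P: float, N0: float, W: float) -> float:
    """C = W log2(1 + P/(N0 W)) bits/sec: 2W samples/sec times (1/2) log2(1 + SNR) per sample."""
    snr = P / (N0 * W)              # per-sample SNR: (P/(2W)) / (N0/2)
    return W * math.log2(1.0 + snr)

# illustrative: W = 3000 Hz with a per-sample SNR of 255 (about 24 dB)
W, N0 = 3000.0, 1e-9
P = 255.0 * N0 * W                  # chosen so that P/(N0 W) = 255
print(bandlimited_capacity(P, N0, W))  # ~24000 bits/sec, since 3000 * log2(256) = 24000
```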

  18. Summary 18

  19. cuiying@sjtu.edu.cn iwct.sjtu.edu.cn/Personal/yingcui 19
