FROM ALMOST GAUSSIAN TO GAUSSIAN
Max H. M. Costa and Olivier Rioul
Unicamp and Télécom-ParisTech
MaxEnt 2014, Amboise, France, 22/09/2014
Summary
Gaussian interference channel (standard form)
Brief history
Z-interference channel
Degraded interference channel
Corner points of the capacity region
Upper bound
Lower bound
Discussion
Standard Gaussian Interference Channel
[Diagram: two senders with powers P1 and P2 transmit messages W1 and W2 over cross-interference gains a and b; the receivers produce estimates Ŵ1 and Ŵ2.]
Z-Gaussian Interference Channel
The possibilities: things we can do with interference:
1. Ignore it (treat interference as noise, IAN)
2. Avoid it (divide the signal space: TDM/FDM)
3. Partially decode both interfering signals
4. Partially decode one signal, fully decode the other
5. Fully decode both (only good for strong interference, a ≥ 1)
Brief history
Carleial (1975): very strong interference does not reduce capacity (a² ≥ 1 + P).
Sato (1981), Han and Kobayashi (1981): strong interference (a² ≥ 1): the interference channel behaves like two MACs.
Motahari and Khandani (2009), Shang, Kramer and Chen (2009), Annapureddy and Veeravalli (2009): very weak interference (2a(1 + a²P) ≤ 1): treat interference as noise (IAN).
History (continued)
Sason (2004): symmetric superposition to beat TDM; found part of the optimal choice of the power split α.
Etkin, Tse and Wang (2008): capacity to within 1 bit, with the good heuristic choice αP = 1/a².
Degraded Gaussian Interference Channel
Differential capacity
A discrete-time channel viewed as a band-limited channel.
Gaussian Broadcast Channel
Superposition coding
[Diagram: total power P split into αP and (1 − α)P, sent over noise levels N1 and N2.]
Multiple Access Channel
Degraded Interference Channel - One Extreme Point
Degraded Interference Channel - Another Extreme Point
Degraded Gaussian Interference Channel
Key variables
Let Z1 + Z2 + X2 be distributed as f. (Note: X2 is a codebook.)
Let Z1 + Z2 + Z3 be distributed as g, where Z1, Z2, Z3 are Gaussian variables.
We have: h(g) − h(f) ≤ nε1 (the almost-Gaussian hypothesis).
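One way to make the almost-Gaussian hypothesis concrete: when g is Gaussian with the same mean and variance as f, the identity D(f||g) = h(g) − h(f) holds, so a small entropy gap is exactly a small divergence from Gaussianity. A minimal numerical sketch (the mixture density f below is a hypothetical stand-in, not from the talk):

```python
import numpy as np

# Illustration: for g Gaussian with the same mean/variance as f,
# D(f||g) = h(g) - h(f).  Checked on a discretized grid; f is a
# hypothetical two-component Gaussian mixture.
x = np.linspace(-10, 10, 20001)
dx = x[1] - x[0]

f = 0.5 * np.exp(-(x - 1) ** 2 / 2) + 0.5 * np.exp(-(x + 1) ** 2 / 2)
f /= f.sum() * dx                           # normalize to a density

mu = (x * f).sum() * dx                     # mean of f
var = ((x - mu) ** 2 * f).sum() * dx        # variance of f
g = np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

h_f = -(f * np.log(f)).sum() * dx           # differential entropy of f
h_g = 0.5 * np.log(2 * np.pi * np.e * var)  # Gaussian entropy, closed form
D_fg = (f * np.log(f / g)).sum() * dx       # D(f||g)

assert abs(D_fg - (h_g - h_f)) < 1e-6
```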
Key variables (cont.)
Y1 = X1 + Z1
Y2 = X1 + Z1 + Z2 + X2
Y3 = X1 + Z1 + Z2 + Z3
X1 ~ p, Y2 ~ f•p, Y3 ~ g•p (• denotes convolution of densities)
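The notation Y2 ~ f•p rests on the fact that the density of a sum of independent variables is the convolution of their densities. A quick discrete check (the PMFs below are hypothetical stand-ins for p and f):

```python
import numpy as np

# The distribution of a sum of independent variables is the convolution
# of their distributions: compare np.convolve against brute force.
rng = np.random.default_rng(0)
p = rng.random(5); p /= p.sum()   # hypothetical PMF of X1 on {0,...,4}
f = rng.random(7); f /= f.sum()   # hypothetical PMF of Z1+Z2+X2 on {0,...,6}

conv = np.convolve(p, f)          # PMF of the sum on {0,...,10}

brute = np.zeros(len(p) + len(f) - 1)
for i, pi in enumerate(p):
    for j, fj in enumerate(f):
        brute[i + j] += pi * fj   # enumerate all (X1, noise) pairs

assert np.allclose(conv, brute)
```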
The missing inequality
We need a Fano-type inequality based on the non-disturbance criterion:
−nε ≤ h(Y3) − h(Y2) ≤ nε (with vanishing ε).
Upper bound on h(Y3) − h(Y2)
I(X1;Y2) = I(X1;Y2|X2) − I(X1;X2|Y2)
  ≥ I(X1;Y2|X2) − nε2
  ≥ H(X1) − H(X1 | X1+Z1+Z2) − nε2
  = I(X1; X1+Z1+Z2) − nε2
  ≥ I(X1;Y3) − nε2
by the data processing inequality (DPI). Therefore
h(Y3) − h(Y2) ≤ h(Y3|X1) − h(Y2|X1) + nε2 = h(g) − h(f) + nε2 ≤ nε1 + nε2.
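The DPI step above (adding the independent noise Z3 can only reduce the information about X1) can be checked in the all-Gaussian case, where mutual information has the closed form (1/2) log(1 + P/N). The power and noise values below are hypothetical:

```python
import numpy as np

# DPI sanity check in the all-Gaussian case:
# I(X1; X1+Z1+Z2+Z3) <= I(X1; X1+Z1+Z2), since Z3 only adds noise.
P1, N1, N2, N3 = 5.0, 1.0, 2.0, 3.0   # hypothetical Var(X1), Var(Z1..Z3)

def gaussian_mi(P, N):
    """I(X; X+Z) = (1/2) log(1 + P/N) in nats, X ~ N(0,P), Z ~ N(0,N)."""
    return 0.5 * np.log(1.0 + P / N)

I_mid = gaussian_mi(P1, N1 + N2)        # I(X1; X1+Z1+Z2)
I_Y3 = gaussian_mi(P1, N1 + N2 + N3)    # I(X1; Y3)

assert I_Y3 <= I_mid
```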
Lower Bound on h(Y3) − h(Y2)
h(f) = −∫ f log f,   h(g) = −∫ g log g
D(f||g) = −∫ f log g − h(f),   D(g||f) = −∫ g log f − h(g)
Smoothing by p:
h(Y2) = −∫ (f•p) log(f•p),   h(Y3) = −∫ (g•p) log(g•p)
with cross terms −∫ (f•p) log(g•p) and −∫ (g•p) log(f•p).
By the DPI:
0 ≤ D(f•p || g•p) ≤ D(f||g) ≤ nε1
0 ≤ D(g•p || f•p) ≤ D(g||f) ≤ nε1
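The divergence DPI used here says that smoothing both densities by the same kernel p (i.e. passing both through the additive channel driven by X1 ~ p) can only shrink their KL divergence. A discrete sketch with hypothetical PMFs:

```python
import numpy as np

# Divergence DPI under common smoothing: convolving f and g with the
# same kernel p cannot increase D(f||g).  Hypothetical positive PMFs.
rng = np.random.default_rng(1)
f = rng.random(8); f /= f.sum()
g = rng.random(8); g /= g.sum()
p = rng.random(4); p /= p.sum()

def kl(a, b):
    """Discrete KL divergence D(a||b) in nats (supports fully overlap)."""
    return float(np.sum(a * np.log(a / b)))

fp = np.convolve(f, p)   # f smoothed by p
gp = np.convolve(g, p)   # g smoothed by p

assert 0.0 <= kl(fp, gp) <= kl(f, g) + 1e-12
```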
Lower Bound (cont.)
Conjecture: we argue by continuity that ∫ (f•p − g•p) log(f•p) does not change sign. This implies:
h(Y3) − h(Y2) ≥ −2nε1
Rationale
0 ≤ D(g•p || f•p)
  = ∫ (g•p) log(g•p) − ∫ (g•p) log(f•p) + ∫ (f•p) log(f•p) − ∫ (f•p) log(f•p)
  = h(f•p) − h(g•p) + ∫ (f•p − g•p) log(f•p)
  ≤ D(g||f) ≤ ∫ (f − g) log f ≤ 2nε1
Equivalently,
h(Y3) − h(Y2) ≥ ∫ (f•p − g•p) log(f•p) + ∫ (g − f) log f ≥ −2nε1
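The algebraic expansion behind this chain, D(g||f) = h(f) − h(g) + ∫ (f − g) log f, is an exact identity obtained by adding and subtracting ∫ f log f. A discrete check with hypothetical PMFs:

```python
import numpy as np

# Exact identity used in the chain above, in discrete form:
# D(g||f) = h(f) - h(g) + sum((f - g) * log f).
rng = np.random.default_rng(2)
f = rng.random(16); f /= f.sum()   # hypothetical PMF
g = rng.random(16); g /= g.sum()   # hypothetical PMF

h = lambda a: float(-np.sum(a * np.log(a)))       # discrete entropy
D_gf = float(np.sum(g * np.log(g / f)))           # D(g||f)
rhs = h(f) - h(g) + float(np.sum((f - g) * np.log(f)))

assert abs(D_gf - rhs) < 1e-12
```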
Special case
Let f = g + Δf. Then expand:
0 ≤ D(f•p || g•p)
  = ∫ (f•p) log(f•p) − ∫ (f•p) log(g•p) + ∫ (g•p) log(g•p) − ∫ (g•p) log(g•p)
  = h(g•p) − h(f•p) + ∫ (g•p − f•p) log(g•p)
  = h(Y3) − h(Y2) − ∫ (Δf•p) log(g•p)
If g̃ = g − Δf = 2g − f is also a valid density, then we can prove the lower bound by symmetry with the upper bound.
Remarks
Somewhat surprisingly, h(Y2) can be greater than h(Y3).
We are close to establishing the corner points of the capacity region of the standard Gaussian interference channel.
To whisper or to shout: in order not to cause a disturbance, X1 needs to be decodable at Y2. Better to shout!
Many thanks!