Sum Rate of Gaussian Multiterminal Source Coding
Pramod Viswanath - PowerPoint PPT Presentation


  1. Sum Rate of Gaussian Multiterminal Source Coding
  Pramod Viswanath, University of Illinois, Urbana-Champaign
  March 19, 2003

  2. Gaussian Multiterminal Source Coding
  [Figure: K distributed encoders. Encoder i observes y_i = h_i^t n and sends message m_i to a joint decoder, which outputs the estimate n̂ of the N-dimensional source n.]
  • Sum of the encoders' rates is R; distortion metric is d(n, n̂)

  3. Quadratic Gaussian CEO Problem
  [Figure: K encoders. Encoder i observes the noisy version y_i = n + z_i and sends message m_i to a joint decoder, which outputs the estimate n̂.]
  • Sum of the encoders' rates is R; distortion metric is d(n, n̂)

  4. Result: Quadratic Gaussian CEO
  • Quadratic distortion metric d(n, n̂) = (n − n̂)^2
  • For a large number of encoders K,
    R(D) = (1/2) log^+ (σ_n^2 / D) + (σ_z^2 / 2) (1/D − 1/σ_n^2)
  – The second term is the loss w.r.t. cooperating encoders
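As a quick numeric companion to this formula, here is a minimal Python sketch; the variances and distortion levels are assumed example values, and rates are in nats.

```python
import numpy as np

# Minimal sketch (assumed example values): evaluate the asymptotic CEO sum
# rate R(D) = (1/2) log(sigma_n^2 / D) + (sigma_z^2 / 2)(1/D - 1/sigma_n^2)
# and compare with the rate when the encoders cooperate, (1/2) log(sigma_n^2 / D).

def ceo_sum_rate(sigma_n2, sigma_z2, D):
    """Large-K sum rate of the quadratic Gaussian CEO problem, in nats."""
    assert 0 < D <= sigma_n2, "distortion must lie in (0, sigma_n^2]"
    return 0.5 * np.log(sigma_n2 / D) + 0.5 * sigma_z2 * (1.0 / D - 1.0 / sigma_n2)

def cooperative_rate(sigma_n2, D):
    """Rate with cooperating encoders: only the first term survives."""
    return 0.5 * np.log(sigma_n2 / D)

sigma_n2, sigma_z2 = 1.0, 0.5
for D in (0.5, 0.25, 0.1):
    r, rc = ceo_sum_rate(sigma_n2, sigma_z2, D), cooperative_rate(sigma_n2, D)
    print(f"D={D}: CEO {r:.3f}, cooperative {rc:.3f}, loss {r - rc:.3f}")
```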

  5. Outline
  • Problem Formulation: tradeoff between the sum rate R and the distortion (metric d(n, n̂))
  • Main Result: characterize a class of distortion metrics for which there is no loss in sum rate compared with encoder cooperation
  – A multiple-antenna test channel

  6. Random Binning of Slepian-Wolf
  [Figure: points in the (y_1, y_2) plane, partitioned among three interleaved quantizers marked ×, ∆, O.]
  • Rate is the number of quantizers

  7. Encoding in Slepian-Wolf
  [Figure: the same constellation of quantizer points in the (y_1, y_2) plane.]
  • Each encoder picks the quantizer closest to its realization

  8. Decoding in Slepian-Wolf
  • The decoder knows the joint distribution of y_1, y_2
  • It is given the two quantizer numbers from the encoders
  • It picks the pair of points in the quantizers that best matches the joint distribution
  – For jointly Gaussian y_1, y_2, a nearest-neighbor type test
  • R_1 + R_2 = H(y_1, y_2) is sufficient for zero distortion
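To make random binning concrete, here is a toy Python sketch; it uses binary sequences, a brute-force decoder, and illustrative rates instead of the Gaussian quantizers above (all assumptions of this example, not the slides' construction).

```python
import itertools, random

# Toy Slepian-Wolf sketch (binary, brute force; illustrative only). Each
# encoder sends just a random bin index of its sequence. The decoder, knowing
# the joint distribution (y2 is y1 through a BSC with crossover p), searches
# the two indicated bins for the most probable pair, i.e. the pair with the
# fewest disagreements. The rates mirror R1 + R2 >= H(y1, y2).

rng = random.Random(0)
n, p = 10, 0.1            # block length, flip probability
R1, R2 = 8, 8             # bin-index bits each: 1.6 bits/symbol total > H(y1,y2) ~ 1.47

def bin_index(seq, n_bits, salt):
    # Fixed pseudo-random hash of a sequence into 2**n_bits bins.
    return hash((seq, salt)) & ((1 << n_bits) - 1)

y1 = tuple(rng.randint(0, 1) for _ in range(n))
y2 = tuple(b ^ (rng.random() < p) for b in y1)    # y2 = y1 through BSC(p)
b1, b2 = bin_index(y1, R1, 1), bin_index(y2, R2, 2)

# Decoder: collect each bin's members, pick the closest cross pair.
cand1 = [s for s in itertools.product((0, 1), repeat=n) if bin_index(s, R1, 1) == b1]
cand2 = [s for s in itertools.product((0, 1), repeat=n) if bin_index(s, R2, 2) == b2]
best = min(((s1, s2) for s1 in cand1 for s2 in cand2),
           key=lambda pair: sum(a != b for a, b in zip(*pair)))
print("decoded correctly:", best == (y1, y2))
```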

  9. Deterministic Broadcast Channel
  [Figure: input x produces the outputs y_1 = f(x) and y_2 = g(x).]
  • Pick a distribution on x such that y_1, y_2 have the desired joint distribution (Cover 98)
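As context for why H(y_1, y_2) is the right sum-rate target here, the capacity region of the deterministic broadcast channel is classical (the slide cites Cover 98); a hedged LaTeX restatement:

```latex
% Capacity region of the two-user deterministic broadcast channel
% (classical result, restated here for context): the union over input
% distributions p(x) of the rate pairs satisfying
\begin{align*}
  R_1 &\le H(y_1), \\
  R_2 &\le H(y_2), \\
  R_1 + R_2 &\le H(y_1, y_2).
\end{align*}
% Hence a scheme achieving sum rate H(y_1, y_2), such as the Slepian-Wolf
% construction on the next slides, is sum-rate optimal.
```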

  10–11. Slepian-Wolf Code for the Broadcast Channel
  • Encoding: implement the Slepian-Wolf decoder
  – given the two messages, find the appropriate pair y_1, y_2 in the two quantizers
  – transmit the x that generates this pair y_1, y_2
  • Decoding: implement the Slepian-Wolf encoder
  – quantize y_1, y_2 to the nearest point
  – the messages are the quantizer numbers

  12. Lossy Slepian-Wolf Source Coding
  [Figure: the same quantizer constellation, now in the (u_1, u_2) plane.]
  • Approximate y_1, y_2 by u_1, u_2

  13–14. Lossy Slepian-Wolf Source Coding
  • Encoding: find the u_i that matches the source y_i, separately for each i
  – For jointly Gaussian r.v.s, a nearest-neighbor calculation
  – Each encoder sends the number of the quantizer containing the u it picked
  • Decoding: reconstruct the u's picked by the encoders
  – reconstruction based on the joint distribution of the u's
  – Previously u_i = y_i were correlated
  – Here the u's are picked independently
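A toy Python sketch of this per-encoder step; the scalar sources, random codebooks, and bin counts are assumptions of the illustration, not the scheme's actual parameters.

```python
import numpy as np

# Toy per-encoder step (illustrative sizes): each encoder i independently
# finds the codeword u nearest to its own observation y_i and transmits only
# the index of the quantizer (bin) that contains it.

rng = np.random.default_rng(0)
codebook_size, n_bins = 64, 8
codebooks = [np.sort(rng.standard_normal(codebook_size)) for _ in range(2)]

def encode(y, codebook):
    idx = int(np.argmin(np.abs(codebook - y)))   # nearest-neighbor search
    return idx % n_bins, codebook[idx]           # send only the bin number

y = [0.37, 0.41]                                  # correlated observations
for i, yi in enumerate(y):
    bin_no, u = encode(yi, codebooks[i])
    print(f"encoder {i + 1}: picked u = {u:.3f}, sends bin {bin_no}")
```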

  15–16. Lossy Slepian-Wolf
  • We require p[u_1, …, u_K | y_1, …, y_K] = ∏_{i=1}^K p[u_i | y_i]
  • Generate n̂_1, …, n̂_K deterministically from the reconstructed u's
  • Need sum rate R_sum = I(u_1, …, u_K; y_1, …, y_K)
  • Distortion equals E[d(n, n̂)]
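For jointly Gaussian observations and the test channel u_i = y_i + q_i with independent q_i (which satisfies the product condition above), R_sum has a closed form; a minimal Python sketch with an assumed example covariance:

```python
import numpy as np

# Minimal sketch: R_sum = I(u_1,...,u_K; y_1,...,y_K) for the Gaussian test
# channel u_i = y_i + q_i, q_i ~ N(0, d_i) independent across i and of y.
# Then Cov(u | y) = diag(d), so
#   I(u; y) = (1/2) [ log det Cov(u) - log det diag(d) ]   (nats).

def lossy_sw_sum_rate(S_y, d):
    S_u = S_y + np.diag(d)                      # Cov(u) for u = y + q
    return 0.5 * (np.linalg.slogdet(S_u)[1] - np.sum(np.log(d)))

S_y = np.array([[1.0, 0.7],                     # assumed covariance of (y_1, y_2)
                [0.7, 1.0]])
print("R_sum =", lossy_sw_sum_rate(S_y, [0.2, 0.2]), "nats")
```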

  17. Marton Coding for the Broadcast Channel
  [Figure: an encoder maps the pair (u_1, u_2) to x; the channel output is y = H^t x + w with w ∼ N(0, I); Dec1 recovers û_1 and Dec2 recovers û_2.]
  • Reversed encoding and decoding operations
  • Sum rate I(u_1, …, u_K; y_1, …, y_K)
  • No use for the Markov property p[u_1, …, u_K | y_1, …, y_K] = ∏_{i=1}^K p[u_i | y_i]

  18. Achievable Rates: Costa Precoding
  [Figure: messages m_1, m_2 are carried on signatures u_1, u_2, combined into x, and sent through the channel H^t with noise w; the decoders output m̂_1, m̂_2.]
  • Users' data modulated onto spatial signatures u_1, u_2

  19. Stage 1: Costa Precoding
  [Figure: the same setup; Dec1 outputs m̂_1.]
  • Encoding for user 1, treating the signal for user 2 as interference known at the transmitter

  20. Stage 2
  [Figure: the same setup; Dec2 outputs m̂_2.]
  • Encode user 2, treating the signal for user 1 as noise
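A hedged Python sketch of the two-stage achievable rates; real-valued signals, unit-variance noise per receiver, and the example channel, signatures, and powers are all assumptions of this illustration.

```python
import numpy as np

# Two-stage Costa precoding rates (sketch). Stage 1: user 1 is encoded with
# user 2's signal pre-canceled as interference known at the transmitter.
# Stage 2: user 2 is encoded treating user 1's signal as noise.

def costa_two_stage_rates(h1, h2, u1, u2, p1, p2):
    g11 = float(h1 @ u1) ** 2                   # user 1's gain on its signature
    g21 = float(h2 @ u1) ** 2                   # leakage of user 1 into receiver 2
    g22 = float(h2 @ u2) ** 2                   # user 2's gain on its signature
    R1 = 0.5 * np.log(1 + p1 * g11)             # interference pre-canceled (Costa)
    R2 = 0.5 * np.log(1 + p2 * g22 / (1 + p1 * g21))  # user 1 seen as noise
    return R1, R2

h1, h2 = np.array([1.0, 0.3]), np.array([0.2, 1.0])   # example channel rows
u1, u2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])   # example spatial signatures
R1, R2 = costa_two_stage_rates(h1, h2, u1, u2, p1=1.0, p2=1.0)
print(f"R1 = {R1:.3f} nats, R2 = {R2:.3f} nats, sum = {R1 + R2:.3f}")
```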

  21. Adaptation to Lossy Slepian-Wolf
  [Figure: an encoder maps (u_1, u_2) to x; the decoder observes y = H^t x + z with z ∼ N(0, K_z) and outputs û_1, û_2.]
  • The joint distribution of the u's and y's depends on the noise z
  • Performance is independent of the correlation in z

  22. Noise Coloring
  • Fix a particular Costa coding scheme; this fixes the u's and x
  • Idea: choose z such that p[u_1, …, u_K | y_1, …, y_K] = ∏_{i=1}^K p[u_i | y_i] and (K_z)_{ii} = 1
  • Then we can adapt the scheme to lossy multiterminal source coding

  23–25. Markov Condition and Broadcast Channel
  • The Markov condition p[u_1, …, u_K | y_1, …, y_K] = ∏_{i=1}^K p[u_i | y_i] is of independent interest in the broadcast channel
  • By the chain rule, p[u_1, …, u_K | y_1, …, y_K] = ∏_{i=1}^K p[u_i | y_1, …, y_K, u_1, …, u_{i−1}]
  • Equivalent to: given u_1, …, u_{i−1}, the chain u_i → y_i → (y_1, …, y_{i−1}, y_{i+1}, …, y_K) holds
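The chain-rule step behind this equivalence, restated in LaTeX (no claim beyond the slide's):

```latex
% Chain rule (always true):
p[u_1, \dots, u_K \mid y_1, \dots, y_K]
  = \prod_{i=1}^{K} p\bigl[u_i \mid y_1, \dots, y_K,\, u_1, \dots, u_{i-1}\bigr].
% The required factorization \prod_{i=1}^{K} p[u_i \mid y_i] thus says that,
% conditioned on u_1, \dots, u_{i-1}, the Markov chain
u_i \;\to\; y_i \;\to\; (y_1, \dots, y_{i-1},\, y_{i+1}, \dots, y_K)
% holds: y_i alone carries everything relevant about u_i.
```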

  26. Implication
  [Figure: the same channel, with noise z ∼ N(0, K_z).]
  • Need only y_1 to decode u_1
  • Given u_1, need only y_2 to decode u_2
  • Performance of the Costa scheme equals that when the receivers cooperate

  27. Markov Condition and Noise Covariance
  • The sum capacity is also achieved by such a scheme (CS 01, YC 01, VT 02, VJG 02)
  • For every Costa scheme, there is a choice of K_z such that the Markov condition holds (Yu and Cioffi, 01)

  28–29. Sum Capacity
  [Figure: four problems linked by three relations. Top row: the broadcast channel (encoder, channel H^t, separate decoders, noise z ∼ N(0, K_z)) is bounded by the cooperating-receivers channel via Sato's bound. Reciprocity maps each to the bottom row: the multiple access channel (channel H, x_1, x_2 independent) and the cooperating-transmitters channel with constraint E[x^t K_z x] ≤ P. Convex duality links the two bottom problems.]
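A small Python sanity check of the reciprocity arrows; it verifies only the determinant identity det(I + AB) = det(I + BA) underlying them, on an assumed random channel, not the full duality.

```python
import numpy as np

# Reciprocity sanity check (illustrative): the sum rate log det(I + H P H^t)
# seen from one side equals log det(I + P^(1/2) H^t H P^(1/2)) from the other,
# by det(I + AB) = det(I + BA).

rng = np.random.default_rng(0)
H = rng.standard_normal((3, 2))           # N x K channel, H = [h_1, ..., h_K]
P = np.diag([0.8, 1.2])                   # per-user powers in the reciprocal MAC

lhs = np.linalg.slogdet(np.eye(3) + H @ P @ H.T)[1]
rhs = np.linalg.slogdet(np.eye(2) + np.sqrt(P) @ H.T @ H @ np.sqrt(P))[1]
print(np.isclose(lhs, rhs))               # True: the same value from either side
```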

  30. Gaussian Multiterminal Source Coding
  [Figure: as on slide 2. Encoder i observes y_i = h_i^t n and sends m_i to the joint decoder, which outputs n̂; here H = [h_1, …, h_K].]
  • Sum of the encoders' rates is R; distortion metric is d(n, n̂)

  31. Main Result
  • Distortion metric
    d(n, n̂) = (1/N) (n − n̂)^t [I + H diag{p_1, …, p_K} H^t] (n − n̂)
  – Here p_1, …, p_K are the powers of the users in the reciprocal MAC
  • Rate distortion function
    R(D) = sum rate of the MAC − N log D
         = sum rate of the broadcast channel − N log D
         = log det(I + H diag{p_1, …, p_K} H^t) − N log D
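A minimal Python sketch evaluating the main result as stated above; H, the powers p_i, N, and D are assumed example values.

```python
import numpy as np

# Evaluate the slide's weighted distortion metric and rate-distortion function:
#   d(n, n_hat) = (1/N) (n - n_hat)^t [I + H diag{p} H^t] (n - n_hat)
#   R(D) = log det(I + H diag{p} H^t) - N log D

rng = np.random.default_rng(1)
N, K = 3, 4
H = rng.standard_normal((N, K))           # H = [h_1, ..., h_K]
p = np.array([1.0, 0.5, 0.8, 1.2])        # users' powers in the reciprocal MAC
W = np.eye(N) + H @ np.diag(p) @ H.T      # weight matrix of the metric

def distortion(n, n_hat):
    e = n - n_hat
    return float(e @ W @ e) / N

def rate_distortion(D):
    return np.linalg.slogdet(W)[1] - N * np.log(D)

print("R(0.5) =", round(rate_distortion(0.5), 3), "nats")
```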

  32. Bells and Whistles
  • For the quadratic distortion metric d(n, n̂) = (1/N) (n − n̂)^t (n − n̂), the set of H can be characterized
  • Analogy with the CEO problem: for a large number of encoders and random H, a characterization of R(D) almost surely

  33–35. Discussion
  • A “connection” made between coding schemes for multiterminal source and channel coding
  • The connection is somewhat superficial
  – relation between source coding and the broadcast channel through a common random coding argument (PR 02, CC 02)
  – relation between source coding and the multiple access channel through a change of variable (VT 02, JVG 01)
  • The connection is suggestive
  – a codebook-level duality
