

  1. On the Role of Interaction in Network Information Theory
Young-Han Kim, University of California, San Diego
Banff Workshop on Interactive Information Theory, January

  2. Networked Information Processing System
Communication network:
- System: Internet, peer-to-peer network, sensor network, ...
- Sources: data, speech, music, images, video, sensor data
- Nodes: handsets, base stations, processors, servers, sensor nodes, ...
- Network: wired, wireless, or a hybrid of the two
- Task: communicate the sources, or compute/make decisions based on them
Young-Han Kim (UCSD), Role of Interaction in NIT, Banff, January

  3. Network Information Theory
Network information flow questions:
- What is the limit on the amount of communication needed?
- What are the coding schemes/techniques that achieve this limit?
Challenges:
- Many networks inherently allow for two-way interaction
- Most coding schemes are limited to one-way communication

  4. Objectives of the Talk
- Review coding schemes that utilize two-way interaction
- Focus on the channel coding side of the story (given yesterday's talks)
- Draw mostly from a few classical examples and open problems (El Gamal–Kim)

  5. Discrete Memoryless Channel (DMC) with Feedback
Setup: M → Encoder → X_i → p(y|x) → Y_i → Decoder → M̂, with feedback Y^{i-1} available at the encoder
Feedback does not increase the capacity of a DMC (Shannon): C_FB = max_{p(x)} I(X; Y) = C
Nonetheless, feedback can help communication in several important ways:
- Feedback can simplify coding and improve reliability (Schalkwijk–Kailath)
- Feedback can increase the capacity of channels with memory (Butman)
- Feedback can enlarge the capacity region of DM multiuser channels (Gaarder–Wolf)
Insights on the fundamental limit of two-way interactive communication

  6. Iterative Refinement
Binary erasure channel: X ∈ {0, 1}, Y ∈ {0, e, 1}; each input is received correctly with probability 1 − p and erased (Y = e) otherwise

  7. Iterative Refinement
Binary erasure channel (as above)
Basic idea:
- First send a message at a rate higher than the channel capacity (without coding)
- Then iteratively refine the receiver's knowledge about the message
Examples:
- Schalkwijk–Kailath coding scheme
- Horstein's coding scheme
- Posterior matching scheme (Shayevitz–Feder)
- Block feedback coding scheme (Weldon, Ahlswede, Ooi–Wornell)
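For the BEC, iterative refinement reduces to something very simple: send the bits uncoded, learn the erasure positions via feedback, and retransmit exactly those positions. A minimal simulation sketch (the function name and parameters are my own, not from the talk) showing that the empirical rate approaches the capacity 1 − p:

```python
import random

def bec_feedback_rate(k=100_000, p=0.3, seed=0):
    """Send k message bits uncoded over a BEC(p); using feedback,
    retransmit exactly the erased bits until all are delivered.
    Returns the empirical rate k / (total channel uses)."""
    rng = random.Random(seed)
    pending = k          # bits not yet received
    uses = 0             # total channel uses so far
    while pending:
        uses += pending
        # each transmitted bit is erased independently with probability p
        pending = sum(rng.random() < p for _ in range(pending))
    return k / uses

rate = bec_feedback_rate()
# expected channel uses per bit is 1/(1-p), so rate ≈ 1 - p = 0.7
```

Each bit takes a geometric number of channel uses with mean 1/(1 − p), so the rate concentrates around 1 − p, the BEC capacity.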

  8. Gaussian Channel with Feedback
Setup: Y_i = X_i + Z_i with additive Gaussian noise Z_i
Expected average transmitted power constraint: sum_{i=1}^n E(x_i^2(m, Y^{i-1})) ≤ nP for every m ∈ [1 : 2^{nR}]
Schalkwijk–Kailath coding scheme (Schalkwijk–Kailath, Schalkwijk): X_1 ∝ θ, X_i ∝ θ − θ̂_{i-1}(Y^{i-1})
Doubly exponentially small probability of error
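A minimal scalar sketch of the Schalkwijk–Kailath recursion (an illustrative simulation under my own naming and normalization, not the talk's exact scheme): the transmitter, which knows the receiver's estimate through feedback, sends a power-scaled version of the current estimation error, and the receiver refines its MMSE estimate; the error variance contracts by N/(P + N) per channel use.

```python
import math
import random

def sk_feedback(theta, n=40, P=1.0, N=1.0, seed=0):
    """Scalar Schalkwijk-Kailath-style recursion (illustrative sketch).
    theta: message point; n: channel uses; P: power; N: noise variance.
    Returns the receiver's final estimate of theta."""
    rng = random.Random(seed)
    theta_hat = 0.0      # receiver's running estimate (known to tx via feedback)
    var = 1.0            # current error variance (tracked by both sides)
    for _ in range(n):
        eps = theta - theta_hat              # tx learns eps through feedback
        x = math.sqrt(P / var) * eps         # scale the error to power P
        y = x + rng.gauss(0.0, math.sqrt(N)) # Gaussian channel use
        # MMSE update: estimate eps from y, then refine theta_hat
        theta_hat += math.sqrt(P * var) / (P + N) * y
        var *= N / (P + N)                   # error variance contracts
    return theta_hat
```

After n uses the residual error is on the order of (N/(P + N))^{n/2}, which is the geometric shrinkage behind the doubly exponential error probability of the full scheme.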

  9. Posterior Matching Scheme (Shayevitz–Feder)
Recall the Schalkwijk–Kailath coding scheme:
X_1 ∝ Θ ~ N(0, 1), X_i ∝ Θ − Θ̂_{i-1}(Y^{i-1}) ∝ X_{i-1} − E(X_{i-1} | Y_{i-1}), independent of Y^{i-1}
- Y_1, Y_2, ... are i.i.d.
Consider a general DMC p(y|x) with a capacity-achieving input pmf p(x):
X_1 = F_X^{-1}(F_Θ(Θ)), Θ ~ Unif[0, 1)
X_i = F_X^{-1}(F_{Θ|Y^{i-1}}(Θ | Y^{i-1})), independent of Y^{i-1}
- Y_1, Y_2, ... are i.i.d.
Generalizes repetition for the BEC, Schalkwijk–Kailath for the Gaussian channel, and Horstein for the BSC
Actual proof involves properties of iterated random functions
Question: Is there an elementary proof (say, for the BSC)?
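The matching step pushes the (uniform) posterior through the inverse CDF of the capacity-achieving input. A tiny check of just that first-step transform (my own helper `match_input`, for illustration only): with Θ ~ Unif[0, 1), X_1 = F_X^{-1}(Θ) has exactly the target input pmf, e.g. Bern(1/2) for the BSC.

```python
import random

def match_input(u, pmf):
    """F_X^{-1}(u): map u in [0, 1) through the inverse CDF of a pmf,
    given as a list of (symbol, probability) pairs."""
    acc = 0.0
    for sym, prob in pmf:
        acc += prob
        if u < acc:
            return sym
    return pmf[-1][0]   # guard against floating-point round-off

# With Theta ~ Unif[0, 1), X_1 = F_X^{-1}(Theta) has the target input pmf
rng = random.Random(1)
pmf = [(0, 0.5), (1, 0.5)]   # capacity-achieving input for the BSC
samples = [match_input(rng.random(), pmf) for _ in range(100_000)]
frac_ones = sum(samples) / len(samples)   # ≈ 0.5
```

Subsequent steps apply the same transform to the posterior F_{Θ|Y^{i-1}}, which keeps every channel input distributed like the capacity-achieving p(x) and independent of the past outputs.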

  10. Block Feedback Coding Scheme
Binary symmetric channel: Y = X ⊕ Z, Z ~ Bern(p)
Implementation of iterative refinement at the block level (Weldon):
- Initially, transmit k bits uncoded
- Learn the error sequence (via feedback), compress it using kH(p) bits, and transmit the compression index uncoded
- Communicate the error about the error (kH^2(p) bits)
- Communicate the error about the error about the error, and so on
Achievable rate: k / (k + kH(p) + kH^2(p) + kH^3(p) + ⋯) = 1 − H(p)
Extensions (Ahlswede, Ooi–Wornell)
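The achievable rate follows from summing the geometric series of overheads; a quick numeric check (function names are mine, chosen for this sketch):

```python
import math

def H(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def block_feedback_rate(p, rounds=200):
    """Rate k / (k + kH(p) + kH(p)^2 + ...) after many refinement rounds.
    The geometric series sums to 1/(1 - H(p)), giving rate 1 - H(p)."""
    h = H(p)
    total = sum(h**j for j in range(rounds))   # 1 + h + h^2 + ...
    return 1.0 / total

p = 0.11
rate, cap = block_feedback_rate(p), 1 - H(p)   # rate matches BSC capacity
```

Note this requires H(p) < 1 (i.e. p ≠ 1/2) so the series converges; the scheme then achieves the BSC capacity 1 − H(p).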

  11. Multiple Access Channel (MAC) with Feedback
Setup: Encoder 1 maps M_1 to X_{1i} and Encoder 2 maps M_2 to X_{2i}; the decoder recovers (M̂_1, M̂_2) from Y over p(y | x_1, x_2); feedback Y^{i-1} is available at both encoders
Transmission cooperation: x_{1i}(M_1, Y^{i-1}), x_{2i}(M_2, Y^{i-1})
Capacity region C is not known in general

  12. Example: Binary Erasure MAC
Y = X_1 + X_2 with X_1, X_2 ∈ {0, 1} and Y ∈ {0, 1, 2}
Capacity region without feedback: R_1 ≤ 1, R_2 ≤ 1, R_1 + R_2 ≤ 3/2
Block feedback coding scheme (Gaarder–Wolf):
- R_sym = 2/3: k uncoded transmissions + k/2 one-sided retransmissions
- R_sym = 3/4: k uncoded transmissions + k/4 two-sided retransmissions + k/16 + ⋯
- R_sym = 0.7602: k uncoded transmissions + k/(2 log 3) cooperative retransmissions
R*_sym = 0.7911 (Cover–Leung, Willems)
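The three symmetric rates above can be checked by accounting for the channel uses per user (a bookkeeping sketch; variable names are mine, and log is taken base 2 as usual in bits):

```python
import math

# Symmetric per-user rates of the Gaarder-Wolf block feedback schemes
# for the binary erasure MAC: rate = k / (total channel uses).

# One-sided retransmissions: k uncoded uses + k/2 retransmissions
r_one_sided = 1 / (1 + 1 / 2)                            # = 2/3

# Two-sided: overheads form a geometric series k(1/4 + 1/16 + ...)
r_two_sided = 1 / sum((1 / 4) ** j for j in range(60))   # -> 3/4

# Cooperative retransmissions: k/(2 log2 3) additional uses
r_coop = 1 / (1 + 1 / (2 * math.log2(3)))                # ≈ 0.7602
```

The progression 2/3 → 3/4 → 0.7602 shows each added layer of cooperation closing part of the gap to R*_sym = 0.7911.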

  13. Cover–Leung Coding Scheme
Setup (as in the MAC with feedback): Encoders 1 and 2 send X_{1i}, X_{2i} over p(y | x_1, x_2); the decoder recovers (M̂_1, M̂_2); feedback Y^{i-1} at both encoders

  14. Cover–Leung Coding Scheme
Same setup with feedback Y^{i-1} at Encoder 1 only (one-sided feedback)

  15. Cover–Leung Coding Scheme
Block j: Encoder 1 sends X_1^n(j) as a function of (M_{2,j-1}, M_{1j}), using the feedback Ỹ^n(j-1); Encoder 2 sends X_2^n(j) as a function of (M_{2,j-1}, M_{2j}); the decoder observes Y^n(j)
- Block Markov coding
- Backward decoding (Willems–van der Meulen, Zeng–Kuhlmann–Buzo)
- Willems condition: optimal when X_1 is a function of (X_2, Y)
- Not optimal for the Gaussian MAC (Ozarow)
Question: Posterior matching for the MAC?
Question: Optimality of Cover–Leung for one-sided feedback?

  16. Broadcast Channel (BC) with Feedback
Setup: the encoder maps (M_1, M_2) to X_i over p(y_1, y_2 | x); Decoder 1 recovers M̂_1 from Y_{1i} and Decoder 2 recovers M̂_2 from Y_{2i}; feedback Y_1^{i-1}, Y_2^{i-1} at the encoder
Receivers operate separately (regardless of feedback)
Physically degraded BC p(y_1 | x) p(y_2 | y_1):
- Feedback does not enlarge the capacity region (El Gamal)
How can feedback help?

  17. Dueck's Example
Z ~ Bern(1/2); the input is X = (X_0, X_1, X_2)
Y_1 = (X_0, X_1 ⊕ Z), Y_2 = (X_0, X_2 ⊕ Z)
Capacity region without feedback: {(R_1, R_2) : R_1 + R_2 ≤ 1}
Capacity region with feedback (Dueck): {(R_1, R_2) : R_1 ≤ 1, R_2 ≤ 1}

  18. Dueck's Example
With feedback, the encoder sets X_{0,i} = Z_{i-1}, where Z_i ~ Bern(1/2):
Y_{1i} = (Z_{i-1}, X_{1i} ⊕ Z_i) → recover X_{1,i-1}
Y_{2i} = (Z_{i-1}, X_{2i} ⊕ Z_i) → recover X_{2,i-1}
Capacity region without feedback: {(R_1, R_2) : R_1 + R_2 ≤ 1}
Capacity region with feedback (Dueck): {(R_1, R_2) : R_1 ≤ 1, R_2 ≤ 1}
Feedback helps by letting the encoder broadcast common channel information
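A small simulation sketch of this mechanism (the function and variable names are mine): the encoder learns Z_{i-1} via feedback and broadcasts it on the common component X_0, so each receiver cancels the previous step's noise and recovers its bit with one step of delay.

```python
import random

def dueck_feedback(bits1, bits2, seed=0):
    """Sketch of Dueck's feedback scheme: X_0 carries Z_{i-1}, so each
    receiver can XOR out the noise from its previous noisy symbol.
    Returns the two recovered bit streams (one-step delay)."""
    rng = random.Random(seed)
    z_prev = 0                      # initial state, known to all
    rx1, rx2 = [], []               # recovered bits
    mem1 = mem2 = None              # each receiver's noisy symbol from step i-1
    for i in range(len(bits1)):
        z = rng.randrange(2)        # Z_i ~ Bern(1/2)
        x0 = z_prev                 # encoder rebroadcasts previous noise
        y1 = (x0, bits1[i] ^ z)     # Y_1i = (Z_{i-1}, X_1i xor Z_i)
        y2 = (x0, bits2[i] ^ z)     # Y_2i = (Z_{i-1}, X_2i xor Z_i)
        if mem1 is not None:
            rx1.append(mem1 ^ y1[0])   # cancel Z_{i-1} using X_0
            rx2.append(mem2 ^ y2[0])
        mem1, mem2 = y1[1], y2[1]
        z_prev = z
    return rx1, rx2

b1 = [1, 0, 1, 1, 0]
b2 = [0, 0, 1, 0, 1]
r1, r2 = dueck_feedback(b1, b2)
# r1 == b1[:-1] and r2 == b2[:-1]: each stream is recovered with unit delay
```

Both private streams get through noiselessly at rate 1 bit per channel use each, matching the feedback capacity region R_1 ≤ 1, R_2 ≤ 1.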

  19. Dueck's Example
- Extension to the general BC (Shayevitz–Wigger)
- "Learn from the past, don't predict the future" (Tse)
- Gaussian BC: from the Schalkwijk–Kailath coding scheme to LQG control (Ozarow–Leung, Elia, Ardestanizadeh–Minero–Franceschetti)
Question: What's going on with the Gaussian BC? (Exactly why does feedback help?)

  20. Two-Way Channel
Setup: Node 1 (Encoder 1, Decoder 1) and Node 2 (Encoder 2, Decoder 2) share the channel p(y_1, y_2 | x_1, x_2); Node 1 sends X_{1i} to convey M_1 and decodes M̂_2 from Y_1^i; Node 2 sends X_{2i} to convey M_2 and decodes M̂_1 from Y_2^i
The first multiuser channel model (Shannon)
Capacity region C is not known in general
Main difficulties:
- The two information flows share the same channel and interfere with each other
- Each node must play the two competing roles of communicating its own message and providing feedback to help the other node
Two-way channel with common output: Y_1 = Y_2 = Y
