Lecture 7: Multiple Access Channel

I-Hsiang Wang
Department of Electrical Engineering
National Taiwan University
ihwang@ntu.edu.tw

December 10, 2014
Noiseless Graphical Network → Noisy Multi-User Network

We have shown that Shannon's paradigm can be readily extended to noiseless graphical networks. Moreover, in particular for single multicast,
- Zero-error decoding at finite blocklength is feasible.
- Low-complexity explicit construction of network codes is possible.

Caveat: there are a couple of distinct features that a single noiseless graphical multicast problem possesses that make its solution so elegant:
- Noiseless: links are modeled as noiseless, finite-capacity edges.
- Orthogonal: links can carry independent information without interfering with one another.
In other words, it well models the overlay network beyond the PHY layer.

Beyond wireline: in many scenarios the communication medium is shared among multiple users, and hence the above two features may be far from valid. As a first step, we investigate several kinds of simple single-hop multi-user noisy channels.
Comparison

              Noiseless graphical multicast   Single-hop multi-user noisy channel
              (Lecture 6)                     (Lectures 7, 8, 9)
  Topology    General                         Single-hop
  Linkage     Noiseless, orthogonal           General (Gaussian as main example)
  Traffic     Single unicast/multicast        Special cases of multiple unicast
                                              (MAC, BC, interference channel)
Key Features Missing in Noiseless Graphical Multicast

1. Superposition of signals at receiving terminals. The simplest one-hop model is the multiple access channel. (Lecture 7)

[Figure: multiple access channel. Sources 1..K feed Encoders 1..K, producing inputs $X_1, \ldots, X_K$; the channel $p(y \mid x_1, \ldots, x_K)$ delivers a single output $Y$ to one decoder at the destination.]

2. Broadcast of signals from transmitting terminals. The simplest one-hop model is the broadcast channel. (Lecture 8)

[Figure: broadcast channel. A single encoder produces $X$; the channel $p(y_1, \ldots, y_K \mid x)$ delivers $Y_1, \ldots, Y_K$ to Decoders 1..K at Destinations 1..K.]
3. Interference among independent information flows. The simplest one-hop model is the interference channel. (Lecture 9)

[Figure: interference channel. K encoder-decoder pairs (Source k → Encoder k, Decoder k → Destination k) share the channel $p\big(y_{[1:K]} \mid x_{[1:K]}\big)$.]

We shall start with the multiple access channel (MAC), which partially answers the following two kinds of questions:
1. (Traffic pattern) How do multiple transmitters trade off their rates when accessing a single receiver?
2. (Superposition) How does a single receiver decode multiple data streams when they are superimposed together?
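Both questions become concrete in the Gaussian MAC treated later in this lecture, whose standard two-user model superimposes the transmitted signals literally (the noise variance $\sigma^2$ below is generic notation, not taken from these slides):

$$Y = X_1 + X_2 + Z, \qquad Z \sim \mathcal{N}(0, \sigma^2) \ \text{independent of} \ (X_1, X_2).$$

The receiver observes a single summed waveform and must disentangle two independent data streams from it.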
Multiple Access Channel: Problem Formulation

[Figure: K-user MAC. Encoders ENC 1..K map messages $W_1, \ldots, W_K$ to inputs $X_1, \ldots, X_K$; the channel $p_{Y|X_1,\ldots,X_K}$ produces $Y$; a single decoder DEC outputs $\hat{W}_1, \ldots, \hat{W}_K$.]

1. $K$ independent messages $\{W_1, \ldots, W_K\}$, each of which is accessible by only one encoder. $W_k \sim \mathrm{Unif}\,[1 : 2^{NR_k}]$, $\forall\, k \in [1:K]$.
2. Channel: $\big(\mathcal{X}_1, \ldots, \mathcal{X}_K, \, p_{Y|X_1,\ldots,X_K}, \, \mathcal{Y}\big)$.
3. Rate tuple: $(R_1, \ldots, R_K)$.
4. A $\big(2^{NR_1}, 2^{NR_2}, \ldots, 2^{NR_K}, N\big)$ MAC channel code consists of
   - $\forall\, k \in [1:K]$, an encoding function $\mathrm{enc}_{k,N} : [1 : 2^{NR_k}] \to \mathcal{X}_k^N$ that maps message $w_k$ to a length-$N$ codeword $x_k^N$;
   - a decoding function $\mathrm{dec}_N : \mathcal{Y}^N \to \times_{k=1}^{K} [1 : 2^{NR_k}]$ that maps a channel output $y^N$ to a reconstructed message tuple $(\hat{w}_1, \ldots, \hat{w}_K)$.
5. Error probability: $P_e^{(N)} := \Pr\big\{(\hat{W}_1, \ldots, \hat{W}_K) \neq (W_1, \ldots, W_K)\big\}$.
6. A rate tuple $\mathbf{R} := (R_1, \ldots, R_K)$ is said to be achievable if there exists a sequence of $\big(2^{NR_1}, 2^{NR_2}, \ldots, 2^{NR_K}, N\big)$ MAC channel codes such that $P_e^{(N)} \to 0$ as $N \to \infty$.
7. The capacity region $\mathcal{C} := \mathrm{cl}\big\{\mathbf{R} \in [0, \infty)^K : \mathbf{R} \text{ is achievable}\big\}$.
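To make these definitions concrete, a standard textbook example (not on the original slides) is the two-user binary adder MAC, which we will reuse in the numerical sketches below:

$$\mathcal{X}_1 = \mathcal{X}_2 = \{0, 1\}, \qquad Y = X_1 + X_2 \in \mathcal{Y} = \{0, 1, 2\}.$$

The channel is deterministic, yet the output $Y = 1$ is ambiguous between $(x_1, x_2) = (0, 1)$ and $(1, 0)$, so the two users cannot both sustain 1 bit per channel use, and the trade-off captured by the capacity region is nontrivial.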
Lecture Overview

We mainly focus on the two-user case ($K = 2$) in this lecture. For MAC, the results in the two-user case can be extended to the $K$-user case in a straightforward manner.

1. First we extend the achievability of point-to-point channels to the two-user MAC, and establish a capacity inner bound.
2. Second we characterize the capacity region of the Gaussian MAC by proving that the inner bound above is tight.
3. We then use the Gaussian MAC as an example to introduce various schemes including successive interference cancellation (SIC), time-sharing, etc.
4. Finally we characterize the capacity region for the general MAC, by providing an enhanced achievability and a general converse proof.
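One structural fact worth recording before diving in (standard, though not stated on this slide): the capacity region $\mathcal{C}$ defined above is always convex, which is what makes the time-sharing ingredient in step 3 natural:

$$\mathbf{R}', \mathbf{R}'' \in \mathcal{C} \ \text{and} \ \lambda \in [0, 1] \ \Longrightarrow \ \lambda \mathbf{R}' + (1 - \lambda)\, \mathbf{R}'' \in \mathcal{C},$$

since concatenating a code achieving $\mathbf{R}'$ over the first $\lambda N$ channel uses with a code achieving $\mathbf{R}''$ over the remaining $(1 - \lambda) N$ uses achieves the convex combination.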
Outline

1. Basic Bounds and Gaussian MAC
2. General Discrete Memoryless MAC
3. Summary
Capacity Inner Bound

Let us begin with extending the achievability of the point-to-point channel. Now the decoder has to decode two independent messages, and the key lies in how to analyze the error event in the appropriate way.

Lemma 1 (Achievability)
If $(R_1, R_2) \ge 0$ satisfies the following for some $(X_1, X_2) \sim p_{X_1} \cdot p_{X_2}$, then $(R_1, R_2)$ is achievable:

$$R_1 < I(X_1; Y \mid X_2) \qquad (1)$$
$$R_2 < I(X_2; Y \mid X_1) \qquad (2)$$
$$R_1 + R_2 < I(X_1, X_2; Y) \qquad (3)$$

Remark: note that the input distribution is chosen such that $X_1 \perp\!\!\!\perp X_2$, which is reasonable since the two encoders are not cooperating.
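As a quick sanity check (my addition, not from the slides), the following sketch evaluates the three bounds of Lemma 1 numerically for the binary adder MAC introduced earlier, with independent $\mathrm{Unif}\{0,1\}$ inputs. It uses the identity $I(A; B \mid C) = H(A,C) + H(B,C) - H(C) - H(A,B,C)$ to reduce everything to joint entropies.

```python
import itertools
import numpy as np

# Binary adder MAC: Y = X1 + X2, with X1, X2 i.i.d. Unif{0,1}.
# Build the joint pmf p(x1, x2, y); the channel is deterministic.
joint = {}
for x1, x2 in itertools.product((0, 1), (0, 1)):
    joint[(x1, x2, x1 + x2)] = joint.get((x1, x2, x1 + x2), 0.0) + 0.25

def H(pmf):
    """Entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * np.log2(p) for p in pmf.values() if p > 0)

def marg(coords):
    """Marginal of `joint` on the coordinates in `coords` (0=X1, 1=X2, 2=Y)."""
    out = {}
    for triple, p in joint.items():
        key = tuple(triple[i] for i in coords)
        out[key] = out.get(key, 0.0) + p
    return out

# I(A; B | C) = H(A,C) + H(B,C) - H(C) - H(A,B,C)
I1 = H(marg((0, 1))) + H(marg((1, 2))) - H(marg((1,))) - H(joint)  # I(X1; Y | X2)
I2 = H(marg((0, 1))) + H(marg((0, 2))) - H(marg((0,))) - H(joint)  # I(X2; Y | X1)
Isum = H(marg((2,))) - (H(joint) - H(marg((0, 1))))                # I(X1,X2; Y)

print(f"R1 < {I1:.2f}, R2 < {I2:.2f}, R1 + R2 < {Isum:.2f}")
# -> R1 < 1.00, R2 < 1.00, R1 + R2 < 1.50
```

The output reads: each user alone could send 1 bit per channel use, but together they are capped at 1.5 bits per use — the superposition penalty that the inner bound quantifies.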
Proof of Achievability

pf: As in the point-to-point case, we use a random coding argument to prove the existence of a sequence of $\big(2^{NR_1}, 2^{NR_2}, N\big)$ codes such that $\lim_{N \to \infty} P_e^{(N)} = 0$ as long as (1)-(3) hold. Our proof is divided into three parts (similar to the point-to-point case): (1) random codebook generation, (2) encoding and decoding, and (3) error probability analysis.

Random codebook generation: $\forall\, k = 1, 2$, randomly and independently generate $2^{NR_k}$ sequences $x_k^N(w_k)$, $w_k \in [1 : 2^{NR_k}]$, each i.i.d. over time according to $p_{X_k}$ (that is, $X_k^N \sim \prod_{i=1}^{N} p_{X_k}(x_k[i])$).

Encoding: $\forall\, k = 1, 2$, to send message $w_k$, Encoder $k$ transmits $x_k^N(w_k)$.

Decoding: To facilitate error probability analysis, use the typicality decoder: $(\hat{w}_1, \hat{w}_2) = $ a unique $(w_1, w_2) \in [1 : 2^{NR_1}] \times [1 : 2^{NR_2}]$ such that $\big(x_1^N(w_1), x_2^N(w_2), y^N\big) \in T_\epsilon^{(N)}(X_1, X_2, Y)$.
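Here is a toy end-to-end sketch of exactly this scheme on the binary adder MAC (again my illustration, with arbitrarily chosen toy parameters; at this blocklength the decoder will occasionally fail, consistent with the result being asymptotic). It uses robust typicality: every empirical frequency must lie within a factor $(1 \pm \epsilon)$ of the true probability, and triples outside the support of $p$ must not occur.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy parameters: far too small for the asymptotics, but enough to show
# the mechanics. Rates satisfy R1, R2 < 1 and R1 + R2 < 1.5 (Lemma 1).
N, R1, R2, eps = 40, 0.2, 0.2, 0.5
M1, M2 = 2 ** round(N * R1), 2 ** round(N * R2)

# Random codebook generation: entries i.i.d. Bernoulli(1/2).
C1 = rng.integers(0, 2, size=(M1, N))
C2 = rng.integers(0, 2, size=(M2, N))

# True joint pmf p(x1, x2, y) for the binary adder MAC with uniform inputs.
p = {(a, b, a + b): 0.25 for a in (0, 1) for b in (0, 1)}

def jointly_typical(x1, x2, y):
    """Robust typicality test for a triple of sequences."""
    if not np.all(y == x1 + x2):        # zero mass outside the support of p
        return False
    for (a, b, c), prob in p.items():   # frequencies within (1 +/- eps) * p
        freq = np.mean((x1 == a) & (x2 == b) & (y == c))
        if abs(freq - prob) > eps * prob:
            return False
    return True

# Encoding and transmission of the message pair (w1, w2) = (0, 0).
y = C1[0] + C2[0]                       # noiseless adder channel

# Typicality decoding: look for a unique jointly typical message pair.
hits = [(w1, w2) for w1, w2 in itertools.product(range(M1), range(M2))
        if jointly_typical(C1[w1], C2[w2], y)]
print("decoded:", hits[0] if len(hits) == 1 else f"error ({len(hits)} candidates)")
```

With high probability this prints the transmitted pair (0, 0). Note the brute-force search over all $M_1 M_2$ pairs mirrors the decoder's definition and is exponential in $N(R_1 + R_2)$ — typicality decoding is an analysis tool, not a practical algorithm.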
Error Probability Analysis: By the symmetry of codebook generation, we can assume WLOG that the actual message tuple is $(W_1, W_2) = (1, 1)$ and focus on analyzing the "averaged-over-codebook" error probability given $(W_1, W_2) = (1, 1)$ (here $\mathcal{E}$ denotes the error event $(\hat{W}_1, \hat{W}_2) \neq (W_1, W_2)$):

$$P_{(1,1)}\{\mathcal{E}\} := \Pr\{\mathcal{E} \mid (W_1, W_2) = (1, 1)\}.$$

The key is to decompose the error event $\mathcal{E}$ into the following four cases $\mathcal{E}_a$, $\mathcal{E}^{(1)}$, $\mathcal{E}^{(2)}$, and $\mathcal{E}^{(1,2)}$, such that $\mathcal{E} = \mathcal{E}_a \cup \mathcal{E}^{(1)} \cup \mathcal{E}^{(2)} \cup \mathcal{E}^{(1,2)}$, where

$$\mathcal{E}_a := \big\{\big(X_1^N(1), X_2^N(1), Y^N\big) \notin T_\epsilon^{(N)}\big\}$$
$$\mathcal{E}^{(1)} := \big\{\big(X_1^N(w_1), X_2^N(1), Y^N\big) \in T_\epsilon^{(N)} \text{ for some } w_1 \neq 1\big\}$$
$$\mathcal{E}^{(2)} := \big\{\big(X_1^N(1), X_2^N(w_2), Y^N\big) \in T_\epsilon^{(N)} \text{ for some } w_2 \neq 1\big\}$$
$$\mathcal{E}^{(1,2)} := \big\{\big(X_1^N(w_1), X_2^N(w_2), Y^N\big) \in T_\epsilon^{(N)} \text{ for some } w_1 \neq 1, w_2 \neq 1\big\}$$
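The slide stops at the decomposition; the standard continuation (sketched here for completeness, following the usual random-coding recipe rather than quoting this lecture's exact steps) applies the union bound:

$$P_{(1,1)}\{\mathcal{E}\} \le \Pr\{\mathcal{E}_a\} + \Pr\{\mathcal{E}^{(1)}\} + \Pr\{\mathcal{E}^{(2)}\} + \Pr\{\mathcal{E}^{(1,2)}\}.$$

The first term vanishes by the law of large numbers. For the others, each wrong codeword is generated independently of the sequences it is tested against, so the joint typicality lemma plus a union bound over wrong messages gives

$$\Pr\{\mathcal{E}^{(1)}\} \le 2^{NR_1}\, 2^{-N(I(X_1; Y \mid X_2) - \delta(\epsilon))}, \qquad \Pr\{\mathcal{E}^{(1,2)}\} \le 2^{N(R_1 + R_2)}\, 2^{-N(I(X_1, X_2; Y) - \delta(\epsilon))},$$

and symmetrically for $\mathcal{E}^{(2)}$ (the first bound uses $I(X_1; X_2, Y) = I(X_1; Y \mid X_2)$, which holds because $X_1 \perp\!\!\!\perp X_2$). All three vanish as $N \to \infty$ precisely when (1)-(3) hold with slack $\delta(\epsilon)$.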