

  1. Unsourced Multiuser Sparse Regression Codes achieve the Symmetric MAC Capacity. Alexander Fengler, joint work with Peter Jung and Giuseppe Caire | Communications and Information Theory Chair, Technische Universität Berlin

  2. Multiple Access - Uplink
  Typical mMTC specifications:
  – (very) large number of potential users: K_tot ∼ ∞
  – random access with sparse activity: K_a ≪ K_tot
  – short messages of B bits
  – a central base station (BS)
  – no cooperation between users

  3. Setting: current solutions
  – Coordination by the BS: identification by a unique pilot sequence and subsequent resource allocation (can be very wasteful for short messages and a large number of inactive users)
  – Packet-based communication with contention resolution (ALOHA and co.): simple but suboptimal, and ignores the nature of the channel

  4. Setting → "Unsourced" random access (Polyanskiy 2017):
  – Each user employs the same codebook
  – The decoder recovers a list of codewords, up to permutation
  – Closer to the mMTC requirements, but still information-theoretic

  5. CommI Communications and Information Theory Chair Previous work Real AWGN channel without fading – Random coding achievability (Polyanskiy 2017) → existing schemes like TIN or ALOHA perform poorly compared to this bound – Several practical approaches – Reed-Muller code based: (Calderbank and Thompson 2018) – LDPC based: (Vem et al. 2017; Ustinova et al. 2019) – Polar code based: (Marshakov et al. 2019; Pradhan et al. 2019) Unsourced Multiuser Sparse Regression Codes achieve the Symmetric MAC Capacity | A.Fengler Page 5

  6. Previous work (cont.)
  – Concatenated scheme: NNLS + outer tree code (Amalladinne et al. 2018)

  7. This work: real AWGN channel without fading
  – Last year: sparse-regression-based unsourced random access (Fengler, Jung, and Caire 2019a), using the outer tree code of (Amalladinne et al. 2018)
  – This work: refined analysis and closed-form limits

  8. Sparse Regression Coding (Barron and Joseph 2011)
  Each user encodes an LJ-bit message m into n real symbols as follows:
  1. Choose a codebook A = (A_1 | ... | A_L) with A_l = (a_{l,1} | ... | a_{l,2^J}) ∈ R^{n × 2^J}
  2. Split m into L parts (sections): m = (m_1 | ... | m_L)
  3. Integer representation: (i_{m_1} | ... | i_{m_L}) with i_j ∈ [1 : 2^J] (PPM)
  4. Transmit: t = Σ_{l=1}^{L} a_{l, i_{m_l}}

  9. Sparse Regression Coding (cont.)
  In matrix form: t = Ax with x = (x_1 | ... | x_L)^⊤, where x_l ∈ {0,1}^{2^J} indicates the chosen column in section l. The columns of A are normalized such that E‖t‖_2^2 = nP.
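To make steps 1-4 concrete, here is a minimal numerical sketch of the encoder. It assumes an i.i.d. Gaussian codebook and illustrative sizes; the values of L, J, n, P and the name sparc_encode are placeholders for this sketch, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the slides)
L, J, n, P = 16, 8, 1500, 1.0
# Entries N(0, P/L) so that each column has norm^2 ≈ nP/L and E||t||^2 ≈ nP
A = rng.normal(0.0, np.sqrt(P / L), size=(n, L * 2**J))

def sparc_encode(msg_bits, A, L, J):
    """Steps 1-4 of the slide: map an L*J-bit message to t = sum_l a_{l, i_{m_l}}."""
    x = np.zeros(A.shape[1])                              # index vector x = (x_1 | ... | x_L)
    for l in range(L):
        section = msg_bits[l * J:(l + 1) * J]
        i_ml = int("".join(str(b) for b in section), 2)   # integer representation of m_l
        x[l * 2**J + i_ml] = 1.0
    return A @ x, x                                       # matrix form t = Ax

msg = rng.integers(0, 2, size=L * J)                      # a random L*J-bit message
t, x = sparc_encode(msg, A, L, J)
```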

  10. Channel Model
  Let K_a active users transmit their messages in this way over an AWGN adder MAC:
  y = Σ_{k=1}^{K_a} A x^{(k)} + z = A ( Σ_{k=1}^{K_a} x^{(k)} ) + z    (1)

  11. Channel Model (cont.)
  – Inner channel (sparse recovery problem): s → As + z
  – Outer channel (binary-input MAC): (x^{(1)}, ..., x^{(K_a)}) → Σ_k x^{(k)}

  12. Channel Model (cont.)
  → a MAC in the sparse domain, e.g. (Cohen, Heller, and Viterbi 1971)
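A short sanity check of eq. (1), again assuming an i.i.d. Gaussian codebook: the received signal equals A applied to the sum of the users' index vectors. All sizes are illustrative and the per-section indices are drawn uniformly, ignoring any outer code.

```python
import numpy as np

rng = np.random.default_rng(1)

L, J, Ka, n = 16, 8, 25, 1500        # assumed sizes for the sketch
N = L * 2**J
A = rng.normal(0.0, np.sqrt(1.0 / L), size=(n, N))

X = np.zeros((Ka, N))
for k in range(Ka):
    idx = rng.integers(0, 2**J, size=L) + np.arange(L) * 2**J   # one column per section
    X[k, idx] = 1.0

z = rng.normal(0.0, 1.0, size=n)                     # unit-variance AWGN
y = sum(A @ X[k] for k in range(Ka)) + z             # left-hand side of eq. (1)
assert np.allclose(y, A @ X.sum(axis=0) + z)         # equals A * (sum_k x^(k)) + z
```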

  13. Outer Channel
  Assume that the inner decoder recovers the support with symbol-wise error probabilities p_fa = P(0 → 1) and p_md = P(1 → 0). This leads to an OR-MAC model for the support.

  14. Outer Channel (cont.)
  Assuming uniform i.i.d. messages, the output entropy is well approximated (for the typical case K_a ≪ 2^J) by
  H(ŝ) = 2^J · H_2( (1 − p_0)(1 − p_md − p_fa) + p_fa )    (2)
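Eq. (2) is easy to evaluate numerically. The following sketch uses assumed values for J, K_a, p_md and p_fa purely for illustration, with p_0 = (1 − 2^{−J})^{K_a} as defined later on the inner-channel slide.

```python
import numpy as np

def H2(p):
    """Binary entropy in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Illustrative values (assumptions, not from the slides)
J, Ka = 12, 100
p_md = 1e-2
p_fa = 1e-3 * Ka / 2**J                      # false alarms kept below the sparsity level

p0 = (1 - 2.0**(-J))**Ka                     # probability that a given position is unused
q1 = (1 - p0) * (1 - p_md - p_fa) + p_fa     # P(a position of s_hat equals 1)
print(2**J * H2(q1))                         # output entropy H(s_hat) from eq. (2)
```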

  15. Outer Channel: asymptotics
  – Assume K_a, J → ∞ with J = α log_2 K_a for some α > 1,
  – and that p_fa ≤ c K_a / 2^J for some constant c, i.e. the false-alarm rate does not asymptotically dominate the sparsity (otherwise the achievable rates go to zero). Then
  lim_{K_a, J → ∞} I(x_1, ..., x_{K_a}; ŝ) / (J K_a) = (1 − p_md)(1 − 1/α)    (3)
  – For p_md = 0 this is achievable by the tree code of (Amalladinne et al. 2018) at exponential complexity, or up to a constant with polynomial complexity.
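As a rough illustration of eq. (3), the limiting per-user fraction of the J·K_a message bits that survives the outer OR-MAC can be tabulated for a few (α, p_md) pairs; the values below are assumptions chosen only for the example.

```python
# Limiting normalized mutual information from eq. (3), for assumed (alpha, p_md) pairs
for alpha in (1.5, 2.0, 3.0):
    for p_md in (0.0, 0.05):
        limit = (1 - p_md) * (1 - 1 / alpha)
        print(f"alpha={alpha}, p_md={p_md}: I/(J*Ka) -> {limit:.3f}")
```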

  16. Inner Channel
  1. Approximate Message Passing (AMP) (Donoho, Maleki, and Montanari 2009):
  s^{t+1} = η_t( s^t + A^⊤ z^t )    (4)
  z^{t+1} = y − A s^{t+1} + (2^J S / n) · z^t · ⟨ η'_t( s^t + A^⊤ z^t ) ⟩    (5)
  where η_t(r) is applied componentwise and is given by (Fengler, Jung, and Caire 2019b)
  η_t(r_i) = [ 1 + (p_0 / (1 − p_0)) exp( −(√P̂ / τ_t^2)(r_i − √P̂ / 2) ) ]^{−1}    (7)
  with τ_t^2 = ‖z^t‖^2 / n and p_0 = (1 − 2^{−J})^{K_a}.
  2. Symbol-by-symbol MAP (SBS-MAP):
  ŝ_i^{MAP} = arg max_{s_i ∈ {0, ..., K_a}} P(s_i | y),    i = 1, ..., L·2^J    (8)
  Both can be analysed asymptotically via the replica-symmetric (RS) potential.
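Below is a minimal AMP sketch following eqs. (4)-(7), under simplifying assumptions: an i.i.d. Gaussian A with roughly unit-norm columns, the transmit amplitude √P̂ written explicitly, collisions between users ignored in the denoiser (each position of s treated as binary), and the Onsager term written as a sum of the denoiser derivative, which is equivalent to the averaged form on the slide. The slide's η_t returns an activity probability; the sketch scales it by √P̂ to obtain the posterior-mean estimate used in the residual update. All parameter values and names are illustrative, not from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup for the sketch
L, J, Ka, n, T = 16, 8, 25, 2000, 30
N = L * 2**J
P_hat = n * 1.0 / L                               # per-section energy for total power P = 1
A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, N))

s_true = np.zeros(N)
for _ in range(Ka):                               # each user picks one column per section
    s_true[rng.integers(0, 2**J, size=L) + np.arange(L) * 2**J] += 1.0
y = A @ (np.sqrt(P_hat) * s_true) + rng.normal(0.0, 1.0, size=n)

p0 = (1.0 - 2.0**(-J))**Ka                        # P(position unused)

def eta(r, tau2):
    """Posterior activity probability, eq. (7), treating each s_i as binary."""
    expo = np.clip(-(np.sqrt(P_hat) / tau2) * (r - np.sqrt(P_hat) / 2.0), -40.0, 40.0)
    return 1.0 / (1.0 + (p0 / (1.0 - p0)) * np.exp(expo))  # clipped to avoid overflow

# AMP iterations, eqs. (4)-(5)
s_hat, z = np.zeros(N), y.copy()
for t in range(T):
    tau2 = np.linalg.norm(z)**2 / n               # tau_t^2 = ||z^t||^2 / n
    r = s_hat + A.T @ z
    q = eta(r, tau2)
    s_new = np.sqrt(P_hat) * q                    # posterior-mean estimate of sqrt(P_hat)*s
    onsager = (z / n) * ((P_hat / tau2) * q * (1.0 - q)).sum()   # sum of eta'(r_i)
    z = y - A @ s_new + onsager
    s_hat = s_new

support = s_hat > np.sqrt(P_hat) / 2              # crude hard decision on the support
```

The final hard decision is only a stand-in here; in the scheme described in the talk, the section-wise AMP outputs would feed the outer tree decoder of (Amalladinne et al. 2018).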

