Approximate Message Passing for Unsourced Access with Coded Compressed Sensing

1. Approximate Message Passing for Unsourced Access with Coded Compressed Sensing
Vamsi K. Amalladinne, Asit Kumar Pradhan, Cynthia Rush†, J.-F. Chamberland, Krishna R. Narayanan
Electrical and Computer Engineering @ Texas A&M University; †Statistics @ Columbia University
ISIT 2020, June 6, 2020
This material is based upon work supported, in part, by NSF under Grants CCF-1619085 & CCF-1849883, and by Qualcomm Technologies, Inc., through their University Relations Program.

2. Uncoordinated and Unsourced MAC
Model without personalized feedback:
◮ All devices employ the same encoder: $y = \sum_{i \in S} s_i + z$, where the codeword $s_i = f(w_i)$ depends only on the message
◮ No explicit knowledge of device identities
◮ Decoder need only return an unordered list of messages
Y. Polyanskiy. A Perspective on Massive Random-Access. ISIT, 2017

3. UMAC – Compressed Sensing Interpretation
[Figure: information bits are mapped to a message index; the columns of the matrix are the possible signals over time]
◮ Bit sequence $w_i \in \{0,1\}^B$ converted to index $s_i \in [0, 2^B - 1]$
◮ Stack codewords into an $N \times 2^B$ sensing matrix, with $B \approx 128$
◮ Message index determines the transmitted codeword
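To make the indexing step concrete, here is a minimal Python sketch that maps a bit sequence to its message index and the corresponding 1-sparse indicator vector. The toy value B = 4 and the variable names are illustrative, not from the talk:

```python
import numpy as np

B = 4                        # toy message length; the talk works with B ~ 128
w = np.array([1, 0, 1, 1])   # information bits w_i

# Interpret the bit string as an integer index in [0, 2^B - 1]
index = int("".join(map(str, w)), 2)

# The message is represented by a 1-sparse indicator vector of length 2^B;
# the transmitted codeword is the matching column of the sensing matrix
s = np.zeros(2 ** B)
s[index] = 1.0
```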

4. UMAC – Compressed Sensing with Multiple Messages
[Figure: collection of message indices within a conceptual MAC framework]
◮ Devices share the same codebook (sensing matrix)
◮ Received signal is the sum of K columns plus noise

5. UMAC – Exact CS Analogy
[Figure: received signal = sampling matrix ($n \times 2^B$) times a K-sparse message vector with non-negative integer entries]
◮ $y = As + z$ with $\|s\|_0 = K$
◮ Dimensionality of the CS problem is huge
◮ Computational complexity of conventional CS solvers: $O(\mathrm{poly}(2^B))$
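As a sanity check on the model, the following small simulation builds $y = As + z$ under assumed parameters (a Gaussian sensing matrix and toy dimensions); none of these choices are prescribed by the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
n, B, K = 256, 10, 5           # toy dimensions; the real problem has B ~ 128

# Sensing matrix whose columns are the possible codewords (assumed Gaussian)
A = rng.standard_normal((n, 2 ** B)) / np.sqrt(n)

# K active users pick message indices; a collision would push an entry of s
# above 1, which is why s has non-negative integer entries rather than bits
indices = rng.integers(0, 2 ** B, size=K)
s = np.zeros(2 ** B)
np.add.at(s, indices, 1.0)

z = 0.1 * rng.standard_normal(n)   # additive noise
y = A @ s + z                      # received signal, with ||s||_0 <= K
```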

6. CCS: Divide and Conquer
[Figure: information bits pass through FEC and CS encoding ($w_k \to v_k$); the receiver applies a CS algorithm followed by a tree decoder to recover $\hat{W}$ from the noisy sum]
CCS components:
◮ Encoding: partition the problem into distinct CS instances
◮ Get lists of fragments, one list for every slot
◮ Stitch fragments together using the tree decoder

7. Coded Compressive Sensing – Device Perspective
[Figure: B information bits plus P parity bits; parity allocation couples the messages sent over slots 1 through L]
◮ Collection of L CS matrices and 1-sparse vectors
◮ Each CS-generated signal is sent in a specific time slot
V. K. Amalladinne, A. Vem, D. Soma, K. R. Narayanan, J.-F. Chamberland. Coupled Compressive Sensing Scheme for Unsourced Multiple Access. ICASSP 2018

8. Coded Compressive Sensing – Multiple Access
[Figure: slots 1 through L, each producing a list of decoded sub-packets]
◮ L instances of the CS problem, each solved with non-negative LS
◮ Produces L lists of K decoded sub-packets (with parity)
◮ Must piece sub-packets together using the tree decoder

9. Coded Compressive Sensing – Stitching Process
[Figure: lists 1 through L feed the tree decoder]
Tree decoding principles:
◮ Every parity is a linear combination of bits in preceding blocks
◮ Late parity bits offer better performance
◮ Early parity bits decrease decoding complexity
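A toy illustration of the first principle: parity bits for a slot are mod-2 linear combinations of information bits from earlier blocks, and the tree decoder keeps a candidate stitching only if the recomputed parity matches. The random generator matrix G and the sizes below are hypothetical, not the allocation used in the talk:

```python
import numpy as np

rng = np.random.default_rng(1)

prev_bits = rng.integers(0, 2, size=12)   # information bits from earlier blocks
num_parity = 4                            # parity bits carried in the current slot

# Hypothetical random generator matrix over GF(2)
G = rng.integers(0, 2, size=(num_parity, prev_bits.size))
parity = (G @ prev_bits) % 2

# Tree decoder check: recompute the parity for a candidate path through the
# lists and retain the candidate only if it matches the received parity
candidate = prev_bits.copy()
is_consistent = np.array_equal((G @ candidate) % 2, parity)   # True here
```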

10. Extending CCS Framework
◮ Alexander Fengler, Peter Jung, Giuseppe Caire on arXiv
◮ Connection between CCS indexing and sparse regression codes
◮ Circumvent slotting under CCS and dispersion effects

11. UMAC – CCS Unified CS Analogy
[Figure: received signal = sampling matrix ($n \times L \cdot 2^{(B+P)/L}$) times an L-sparse message vector]
◮ Initial non-linear indexing step
◮ Index vector is block sparse
◮ Connection to sparse regression codes
C. Rush, A. Greig, R. Venkataramanan. Capacity-Achieving Sparse Superposition Codes via Approximate Message Passing Decoding. IEEE Trans. Inf. Theory, 2017

12. CCS-AMP
[Figure: same unified analogy, an $n \times L \cdot 2^{(B+P)/L}$ sampling matrix and an L-sparse message vector]
◮ Complexity management comes from dimensionality reduction
◮ Use full sensing matrix on sparse regression codes
◮ Decode inner code with low-complexity AMP
◮ Decode outer code with tree decoding
A. Fengler, P. Jung, and G. Caire. SPARCs and AMP for Unsourced Random Access. ISIT 2019

13. Approximate Message Passing
Governing equations: the AMP algorithm iterates through
$$z^{(t)} = y - A D\eta_t\big(r^{(t)}\big) + \frac{z^{(t-1)}}{n} \operatorname{div} D\eta_t\big(r^{(t)}\big),$$
where the last term is the Onsager correction, and
$$r^{(t+1)} = A^{\mathsf{T}} z^{(t)} + D\eta_t\big(r^{(t)}\big),$$
where $\eta_t$ is the denoiser, with initial conditions $z^{(0)} = 0$ and $\eta_0(r^{(0)}) = 0$.
◮ Application falls within the AMP framework for non-separable denoising functions
Tasks:
◮ Define denoiser
◮ Derive correction term
R. Berthier, A. Montanari, and P.-M. Nguyen. State Evolution for Approximate Message Passing with Non-Separable Functions. arXiv 2017
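To show the shape of these iterations in code, here is a compact AMP loop. Since the talk's denoiser is only introduced on the next slide, this sketch substitutes a classical separable soft-threshold denoiser, whose divergence is simply the count of surviving coordinates; it is a structural illustration, not the talk's algorithm:

```python
import numpy as np

def soft_threshold(x, theta):
    """Separable stand-in denoiser; the talk instead uses the PME denoiser."""
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def amp(y, A, theta=0.1, n_iter=20):
    """Generic AMP: z = y - A eta(r) + (z/n) div eta(r), with r = A^T z + eta."""
    n, N = A.shape
    x = np.zeros(N)    # current estimate eta_t(r^(t)), starts at 0
    z = y.copy()       # residual; equals y while the estimate is 0
    for _ in range(n_iter):
        r = A.T @ z + x                   # effective observation
        x_new = soft_threshold(r, theta)
        # Onsager correction: the divergence of soft thresholding equals the
        # number of coordinates above the threshold
        div = np.count_nonzero(x_new)
        z = y - A @ x_new + (z / n) * div
        x = x_new
    return x
```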

14. Marginal Posterior Mean Estimate (PME)
Proposed denoiser (Fengler, Jung, and Caire):
◮ $r^{(t)} \xrightarrow{d} Ds + \tau_t \zeta$ as $n \to \infty$
◮ State estimate based on Gaussian model:
$$\hat{s}^{\mathrm{OR}}(q, r, \tau) = \mathbb{E}\left[ s \,\middle|\, \sqrt{P_\ell}\, s + \tau\zeta = r \right] = \frac{q \exp\left( -\frac{(r - \sqrt{P_\ell})^2}{2\tau^2} \right)}{(1-q) \exp\left( -\frac{r^2}{2\tau^2} \right) + q \exp\left( -\frac{(r - \sqrt{P_\ell})^2}{2\tau^2} \right)}$$
with fixed prior $q = 1 - (1 - 1/m)^K$
◮ $\eta_t(r^{(t)}) = \hat{s}^{\mathrm{OR}}(q, r^{(t)}, \tau_t)$ is the aggregate of PME values
◮ $\tau_t$ is obtained from state evolution, or $\tau_t^2 \approx \|z^{(t)}\|^2 / n$
Performance is quite good!
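A numerically stable way to evaluate this PME is through the log-odds rather than the ratio of Gaussian densities; the sketch below is algebraically equivalent to the formula above (d stands for the amplitude $\sqrt{P_\ell}$, and the function name is ours):

```python
import numpy as np

def pme(q, r, d, tau):
    """Posterior mean estimate E[s | d*s + tau*zeta = r] for s in {0, 1}.

    q   : prior probability Pr(s = 1)
    r   : effective observation(s), scalar or array
    d   : signal amplitude sqrt(P_l)
    tau : effective noise standard deviation (from state evolution)
    """
    # Dividing numerator and denominator of the slide's expression by the
    # numerator turns the PME into a logistic function of the log-odds
    log_odds = np.log(q / (1.0 - q)) + (2.0 * d * r - d ** 2) / (2.0 * tau ** 2)
    return 1.0 / (1.0 + np.exp(-log_odds))
```

With the flat prior $q = 1 - (1 - 1/m)^K$, every entry of $r^{(t)}$ is denoised by the same call pme(q, r, d, tau).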

15. Marginal PME Revisited
Enhanced CCS-AMP:
◮ Can the tree code inform decoding of the inner code (AMP denoiser)?
◮ Idea: propagate beliefs through $q$ within the existing PME framework,
$$\hat{s}^{\mathrm{OR}}(q, r, \tau) = \mathbb{E}\left[ s \,\middle|\, \sqrt{P_\ell}\, s + \tau\zeta = r \right] = \frac{q \exp\left( -\frac{(r - \sqrt{P_\ell})^2}{2\tau^2} \right)}{(1-q) \exp\left( -\frac{r^2}{2\tau^2} \right) + q \exp\left( -\frac{(r - \sqrt{P_\ell})^2}{2\tau^2} \right)},$$
but leverage extrinsic information to compute $q = \Pr(s = 1)$
◮ Proposed denoiser becomes $\eta_t(r^{(t)}) = \hat{s}^{\mathrm{OR}}(q^{(t)}, r^{(t)}, \tau_t)$, applied component-wise

16. Updated CCS-AMP Equations
◮ Onsager correction from the divergence of $\eta_t(r)$:
$$\frac{z^{(t-1)}}{n} \operatorname{div} D\eta_t(r) = \frac{z^{(t-1)}}{n \tau_t^2} \left( \left\| D^2 \eta_t(r) \right\|_1 - \left\| D \eta_t(r) \right\|^2 \right)$$
◮ Robust to tree dynamics
◮ Simplified AMP equations:
$$z^{(t)} = y - A D s^{(t)} + \frac{z^{(t-1)}}{n \tau_t^2} \left( \left\| D^2 s^{(t)} \right\|_1 - \left\| D s^{(t)} \right\|^2 \right)$$
$$s^{(t+1)} = \eta_{t+1}\left( A^{\mathsf{T}} z^{(t)} + D s^{(t)} \right)$$
with $\eta_t(r^{(t)}) = \hat{s}^{\mathrm{OR}}(q^{(t)}, r^{(t)}, \tau_t)$
Tasks:
1. Devise a suitable tree code
2. Compute $q^{(t)}$ from the tree code
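Putting the simplified equations together with the pme() sketch above gives the loop below. It assumes a uniform amplitude $D = d \cdot I$, estimates $\tau_t^2$ as $\|z\|^2/n$, and keeps $q$ fixed; in the enhanced scheme $q^{(t)}$ would instead be refreshed from the tree code each iteration (next slides). This is a sketch under those assumptions, not the authors' implementation:

```python
import numpy as np

def ccs_amp(y, A, d, q, n_iter=10):
    """Minimal CCS-AMP loop with flat prior q and uniform amplitude D = d*I."""
    n = y.size
    s = np.zeros(A.shape[1])   # state estimate s^(t)
    z = y.copy()               # residual z^(t)
    for _ in range(n_iter):
        tau2 = np.dot(z, z) / n                  # tau_t^2 ~ ||z^(t)||^2 / n
        r = A.T @ z + d * s                      # effective observation
        s = pme(q, r, d, np.sqrt(tau2))          # component-wise PME denoising
        # Onsager term (||D^2 s||_1 - ||D s||^2) / (n tau^2) with D = d*I;
        # the l1 norm reduces to a plain sum because s lies in [0, 1]
        onsager = (d ** 2) * (s.sum() - np.dot(s, s)) / (n * tau2)
        z = y - d * (A @ s) + z * onsager
    return s
```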

17. Redesigning Outer Code
Properties of the original tree code:
◮ Aimed at stitching message fragments together
◮ Works on short lists of K fragments
◮ Parities allocated to control growth and complexity
Challenges to integration into AMP:
1. Must compute beliefs for all $2^v$ possible fragments
2. Must provide pertinent information to AMP
3. Should maintain ability to stitch outer code

18. Redesigning Outer Code
Solutions for integration into AMP:
◮ Parity bits are generated over an Abelian group amenable to a Hadamard transform (original) or an FFT (modified)
◮ Discrimination power proportional to the number of parities
New design strategy:
1. Information sections with parity bits interspersed in between
2. Parity over two blocks (triadic dependencies)
3. Multiplicative effect across concentrated sections

19. Redesigning Outer Code
◮ For a parity section $\ell$, the extrinsic information has a circular-convolution structure:
$$q^{(t+1)}(\ell, k) \propto \sum_{\{g_j\} :\, \sum_{j \in \mathcal{W}_\ell} g_j \equiv k} \; \prod_{j \in \mathcal{W}_\ell} s^{(t)}\!\left( j, G_{j,\ell}^{-1}(g_j) \right)$$
where $\mathcal{W}_\ell = \{ j \in [1:L] : \exists \text{ an edge between sections } j \text{ and } \ell \}$.
◮ The vector of priors can be computed at once as
$$q^{(t+1)}(\ell) \propto \mathrm{FFT}^{-1}\left( \prod_{j \in \mathcal{W}_\ell} \mathrm{FFT}\left( S^{(t)}_{j,\ell} \right) \right)$$
where $S^{(t)}_{j,\ell}$ is the stacked vector with entries $s^{(t)}(j, G_{j,\ell}^{-1}(g_j))$.
[Figure: factor graph with information sections v(1) through v(12) connected to parity sections v(13) through v(16)]
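The FFT identity can be sketched directly: multiplying the spectra of the neighbors' beliefs implements their circular convolution. The sketch below models the parity group as integers mod m and assumes the inputs have already been re-indexed per $G_{j,\ell}^{-1}$; both are our simplifications:

```python
import numpy as np

def parity_beliefs(neighbor_beliefs):
    """Belief vector for a parity section via FFT-based circular convolution.

    neighbor_beliefs : list of length-m non-negative vectors, one per section
        j in W_l, already permuted according to G_{j,l}^{-1} so that entry g
        corresponds to that section's parity contribution.
    """
    spectrum = np.ones(len(neighbor_beliefs[0]), dtype=complex)
    for s_j in neighbor_beliefs:
        spectrum *= np.fft.fft(s_j)          # product of spectra
    q = np.real(np.fft.ifft(spectrum))       # circular convolution over Z_m
    q = np.maximum(q, 0.0)                   # clip tiny negative round-off
    return q / q.sum()                       # normalize to a probability vector
```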

20. Preliminary Performance
[Figure: two panels versus the number of active users K, comparing CCS-AMP original, CCS-AMP enhanced, and Sparse IDMA: required $E_b/N_0$ (dB) and average run-time (sec)]
◮ Performance improves with enhanced CCS-AMP decoding
◮ Computational complexity is approximately maintained
◮ Reparametrization may offer additional gains in performance

21. Discussion – Unsourced Multiple Access Channel
Summary:
◮ Introduced a new framework for CCS-AMP and the unsourced MAC
◮ There are close connections between compressive sensing, graph-based codes, and UMAC
◮ Many theoretical and practical challenges/opportunities exist
Thank You!
This material is based upon work supported, in part, by NSF under Grants CCF-1619085 & CCF-1849883, and by Qualcomm Technologies, Inc., through their University Relations Program.
