 
Multiple layers and symbol-level mapping

[Figure: two binary layers U₁, U₂ mapped through a symbol mapping ϕ to the channel input X]

∙ Natural mapping: X = α(2U₁ + U₂)
∙ Gray mapping: X = α(2U₁ + (U₁ ⊕ U₂))
∙ Similar mappings ϕ exist for higher-order PAM, QPSK, QAM, PSK, MIMO, ...
  X_QPSK = √P exp(iπ(2U₁ + (U₁ ⊕ U₂))/2)
∙ Can be many-to-one (still information-lossless)
∙ Can induce a nonuniform distribution on X (Gallager)
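The two bit-to-symbol maps above can be checked numerically. A minimal Python sketch, assuming the standard 4-PAM labelings; the function names and the XOR form of the Gray map are reconstructions for illustration, not taken from the slides:

```python
# Hypothetical sketch: natural vs. Gray symbol-level mapping phi for 4-PAM,
# with two binary layers (U1, U2).

def natural_map(u1, u2, alpha=1.0):
    # Natural mapping: X = alpha * (2*U1 + U2)
    return alpha * (2 * u1 + u2)

def gray_map(u1, u2, alpha=1.0):
    # Gray mapping: adjacent amplitude levels differ in exactly one bit
    return alpha * (2 * u1 + (u1 ^ u2))

# Sort the four (U1, U2) labels by amplitude and check the Gray property.
labels = sorted(((u1, u2) for u1 in (0, 1) for u2 in (0, 1)),
                key=lambda b: gray_map(*b))
for (a1, a2), (b1, b2) in zip(labels, labels[1:]):
    assert (a1 ^ b1) + (a2 ^ b2) == 1  # neighbors differ in one bit
```

Both maps are one-to-one, so either choice preserves I(X; Y) = I(U₁, U₂; Y); they differ only in how errors distribute across the two layers.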
Horizontal superposition coding

[Figure: independent messages M₁, M₂ encoded into layer codewords U₁^n, U₂^n and combined by ϕ into X^n]

∙ Broadcast channels (Cover), fading channels (Shamai–Steiner)
∙ Successive cancellation decoding:
  – Find a unique m₁ such that (u₁^n(m₁), y^n) is jointly typical: R₁ < I(U₁; Y)
  – Find a unique m₂ such that (u₂^n(m₂), u₁^n(m₁), y^n) is jointly typical: R₂ < I(U₂; U₁, Y)
  – Combined rate: R₁ + R₂ < I(U₂; Y, U₁) + I(U₁; Y) = I(U₁, U₂; Y) = I(X; Y)
  – Regardless of ϕ or the decoding order
∙ Multi-level coding (MLC): Wachsmann–Fischer–Huber
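The chain-rule identity behind the combined rate can be verified on a toy channel. The following sketch is an assumed example (the 4-ary symmetric channel and flip probability are mine, not from the slides); it checks that the two successive-cancellation rates sum exactly to I(X; Y):

```python
import itertools
import math

# Toy numeric check: with independent layers U1, U2 and a one-to-one mapping
# X = phi(U1, U2), successive cancellation achieves
# I(U1; Y) + I(U2; U1, Y) = I(U1, U2; Y) = I(X; Y).

def mutual_information(joint):
    """I(A; B) in bits from a joint pmf given as {(a, b): prob}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Natural mapping X = 2*U1 + U2 into a noisy 4-ary channel (eps assumed)
eps = 0.1
joint = {}  # ((u1, u2), y) -> probability
for u1, u2 in itertools.product((0, 1), repeat=2):
    x = 2 * u1 + u2
    for y in range(4):
        joint[((u1, u2), y)] = 0.25 * ((1 - eps) if y == x else eps / 3)

i_x_y = mutual_information(joint)  # = I(U1, U2; Y)
i_u1_y = mutual_information(       # I(U1; Y): marginalize out U2
    {(u1, y): sum(p for (u, yy), p in joint.items() if u[0] == u1 and yy == y)
     for u1 in (0, 1) for y in range(4)})
i_u2_u1y = mutual_information(     # I(U2; U1, Y): pair U1 with Y
    {(u[1], (u[0], y)): p for (u, y), p in joint.items()})

# Chain rule: the two successive-cancellation rates sum to I(X; Y)
assert abs(i_u1_y + i_u2_u1y - i_x_y) < 1e-9
```

Since U₁ ⊥ U₂, I(U₂; U₁, Y) = I(U₂; Y | U₁), so the sum is the chain-rule expansion of I(U₁, U₂; Y), independent of ϕ and of which layer is decoded first.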
Vertical superposition coding

[Figure: a single message M encoded into both layers U₁^n, U₂^n and combined by ϕ into X^n]

∙ Single codeword of length 2n: C^2n = (C^n, C_{n+1}^{2n}), with C^n ↦ U₁^n and C_{n+1}^{2n} ↦ U₂^n
∙ Treating the other layer as noise:
  – Find a unique m such that (u₁^n(m), y^n) is jointly typical and (u₂^n(m), y^n) is jointly typical
  – Successful w.h.p. if R < I(U₁; Y) + I(U₂; Y) < I(U₁, U₂; Y) = I(X; Y)
∙ Bit-interleaved coded modulation (BICM): Caire–Taricco–Biglieri
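The rate loss of treating the other layer as noise can be seen on the same assumed toy channel (4-ary symmetric, parameters mine): I(U₁; Y) + I(U₂; Y) falls strictly short of I(X; Y) whenever the layers are dependent given Y.

```python
import itertools
import math

# Toy numeric check: decoding each layer while treating the other as noise,
# as in BICM-style vertical superposition, achieves I(U1;Y) + I(U2;Y) < I(X;Y).

def mutual_information(joint):
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

eps = 0.1  # assumed flip probability of a 4-ary symmetric channel
joint = {}
for u1, u2 in itertools.product((0, 1), repeat=2):
    x = 2 * u1 + u2              # natural mapping
    for y in range(4):
        joint[((u1, u2), y)] = 0.25 * ((1 - eps) if y == x else eps / 3)

i_x_y = mutual_information(joint)

def layer_rate(i):
    # I(U_i; Y): marginalize out the other layer
    return mutual_information(
        {(u[i], y): sum(p for (uu, yy), p in joint.items()
                        if uu[i] == u[i] and yy == y)
         for (u, y) in joint})

i_u1_y, i_u2_y = layer_rate(0), layer_rate(1)

# The gap equals I(U1; U2 | Y) > 0 here, i.e. the cost of ignoring the
# dependence between the layers at the decoder
assert i_u1_y + i_u2_y < i_x_y
```

The gap I(X; Y) − I(U₁; Y) − I(U₂; Y) = I(U₁; U₂ | Y) quantifies exactly what the single-code, per-layer decoder gives up.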
Diagonal superposition coding

[Figure: successive messages M(j−1), M(j) feed layers U₁^n, U₂^n, combined by ϕ into X^n]

∙ Think outside the block: a sequence of messages M(j), each mapped to a codeword C^2n(j)
∙ Staggered block structure: each codeword C^2n(j) is split across two consecutive blocks, one half on one layer of block j and the other half on the other layer of block j+1
∙ Sliding-window decoding: R < I(U₁; U₂, Y) + I(U₂; Y) = I(X; Y)
∙ Block Markov coding: used extensively in relay and feedback communication
∙ Sliding-window coded modulation (SWCM): Kim et al., Wang et al.
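The staggered layout can be sketched as a simple schedule. This is a hypothetical illustration (the layer assignment of the two halves is a reconstruction, not fixed by the slides):

```python
# Hypothetical sketch of the diagonal staggering: codeword C(j) for message
# M(j) is split so that one half rides layer U2 in block j and the other half
# rides layer U1 in block j+1.

def diagonal_schedule(num_messages):
    """Return {block: {layer: message index}} for the staggered layout."""
    blocks = {}
    for j in range(1, num_messages + 1):
        blocks.setdefault(j, {})["U2"] = j        # first half of C(j)
        blocks.setdefault(j + 1, {})["U1"] = j    # second half of C(j)
    return blocks

sched = diagonal_schedule(3)
# A sliding-window decoder of M(1) looks at blocks 1 and 2, of M(2) at
# blocks 2 and 3, and so on: each window covers both halves of one codeword.
assert sched[2] == {"U2": 2, "U1": 1}
```

Each interior block thus carries halves of two different messages, which is why decoding slides one block at a time instead of waiting for a whole super-block, avoiding the latency of backward decoding.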
Multiple-antenna transmission

∙ Consider the signal layers U₁ and U₂ as antenna ports: X = (U₁, U₂)
∙ Bell Laboratories Layered Space-Time (BLAST) architectures:
  – Horizontal: H-BLAST (Foschini et al.), also known as V-BLAST
  – Diagonal: D-BLAST (Foschini)
  – Vertical: single outer code (Foschini et al.), but shouldn't this be "V-BLAST"?
∙ Signal layers can be far more general than antenna ports
∙ Coded modulation can encompass MIMO transmission
Comparison

            Horizontal                  Vertical                    Diagonal
Scheme      Multi-level coding (MLC)    Bit-interleaved coded       Sliding-window coded
                                        modulation (BICM)           modulation (SWCM)
Rates       R₁ < I(U₁; Y)               R < I(U₁; Y) + I(U₂; Y)     R < I(U₁; U₂, Y) + I(U₂; Y)
            R₂ < I(U₂; U₁, Y)                                         = I(X; Y)
Drawback    Short, nonuniversal         Other layers as noise       Error prop., rate loss
BICM vs. SWCM

[Plot: symmetric rate vs. SNR (dB) for 4-PAM, 8-PAM, and 16-PAM, comparing SWCM and BICM; LTE turbo code with iterative Log-MAP decoding]
Application: Interference channels

[Figure: two transmitter–receiver pairs with desired-signal and interference links]
Optimal rate region (Bandemer–El Gamal–Kim)

[Figure: two-user interference channel X₁ → p(y₁|x₁,x₂) → Y₁ and X₂ → p(y₂|x₁,x₂) → Y₂]

∙ At receiver 1: R₁ < I(X₁; Y₁ | X₂) and R₁ + R₂ < I(X₁, X₂; Y₁), or R₁ < I(X₁; Y₁)

[Plot: the corresponding (R₁, R₂) rate region]
Low-complexity (implementable) alternatives

[Figure: interference channel with target rate pair (R₁, R₂)]

∙ Point-to-point (PP) decoding
  – Treating interference as (Gaussian) noise: R₁ < I(X₁; Y₁)
  – Successive cancellation decoding: R₂ < I(X₂; Y₁), R₁ < I(X₁; Y₁ | X₂)
∙ + rate splitting (Zhao et al., Wang et al.)
∙ Novel codes
  – Spatially coupled codes (Yedla, Nguyen, Pfister, and Narayanan)
  – Polar codes (Wang and Şaşoğlu)
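The two point-to-point decoding options can be compared on a Gaussian example. All parameters below are assumed for illustration (they are not from the slides):

```python
import math

# Gaussian toy comparison: receiver 1 observes Y1 = X1 + g*X2 + N with
# N ~ N(0, 1) and transmit powers P1, P2. Rates in bits per symbol.

def cap(snr):
    """Gaussian capacity C(snr) = 1/2 log2(1 + snr)."""
    return 0.5 * math.log2(1 + snr)

P1, P2, g2 = 10.0, 10.0, 4.0      # g2 = squared interference gain (assumed)

# Treating interference as (Gaussian) noise: R1 < I(X1; Y1)
r1_tin = cap(P1 / (1 + g2 * P2))

# Successive cancellation: first decode X2 treating X1 as noise
# (feasible when R2 < I(X2; Y1)), then decode X1 interference-free
r2_max = cap(g2 * P2 / (1 + P1))  # R2 < I(X2; Y1)
r1_sc = cap(P1)                   # R1 < I(X1; Y1 | X2)

# Cancelling a strong interferer leaves a much cleaner channel for X1
assert r1_sc > r1_tin
```

For strong interference (large g2·P2), cancellation is far better for R₁, but it constrains R₂; which option wins depends on the operating point, which is what motivates rate splitting and the sliding-window scheme on the next slide.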
Sliding-window superposition coding (Wang et al.)

[Figure: sender 1 staggers M₁(j−1), M₁(j) over layers U₁^n, U₂^n to form X₁^n; sender 2 sends M₂(j) as X₂^n; receiver 1 observes Y₁^n over p(y₁|x₁,x₂) and receiver 2 observes Y₂^n over p(y₂|x₁,x₂), each recovering the messages successively]

∙ Staggered block structure: M₁(j) is superimposed across layers U₁ and U₂ in consecutive blocks, while M₂(j) occupies block j of X₂
∙ Each receiver decodes over a sliding window of two blocks, cancelling one sender's message before decoding the other