Multi-Cast Channels with Hierarchical Flow Jonathan Ponniah San Jose State University Liang-Liang Xie University of Waterloo ISIT 2020
Outline
• Previous Work
• The One-Relay Channel
• Flow Decomposition
• Main Result
• Overview of Proof
• Conclusion
Timeline
• Capacity Theorems for the Relay Channel (Cover and El Gamal, 1979): Compress-Forward and Decode-Forward
• One-Source Multi-Relay Channel (Xie and Kumar, 2005), with Backward Decoding
• Multiple-Access Relay Channel (Sankar and Kramer, 2007)
• One-Source Multi-Relay Channel (Xie and Kumar, 2007)
• Multi-Source Multi-Relay Multi-Cast Channels (Yassaee and Aref, 2008)
• Noisy Network Coding (Kim and El Gamal, 2011)
• On Optimal Compressions (Wu and Xie, 2013)
• Short-Length Noisy Network Coding (Hou and Kramer, 2016), with Regular Decoding
• On Compress-Forward Schemes (Ponniah, 2019)
• Flow Decomposition (Ponniah and Xie, 2019)
The One-Relay Channel

Source (node 1) → Relay (node 2) → Destination (node 3)

Theorem: For some input distribution p(x1)p(x2), the following rate is achievable:

R < min{ I(X1; Y2 | X2), I(X1X2; Y3) }.
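As a numerical illustration, the two mutual-information bounds of the theorem can be evaluated in closed form for the scalar Gaussian relay channel with jointly Gaussian inputs. This is a minimal sketch, not from the talk: the SNR values `s12`, `s13`, `s23` and the grid search over the input correlation `rho` are illustrative assumptions.

```python
import math

def C(snr):
    # Gaussian capacity function: 0.5 * log2(1 + SNR).
    return 0.5 * math.log2(1 + snr)

def df_rate(s12, s13, s23, rho):
    # Decode-forward rate of the theorem for the scalar Gaussian relay
    # channel with jointly Gaussian (X1, X2) of correlation rho:
    #   I(X1; Y2 | X2) = C(s12 * (1 - rho^2))
    #   I(X1 X2; Y3)   = C(s13 + s23 + 2 * rho * sqrt(s13 * s23))
    relay_constraint = C(s12 * (1 - rho ** 2))
    dest_constraint = C(s13 + s23 + 2 * rho * math.sqrt(s13 * s23))
    return min(relay_constraint, dest_constraint)

# Optimize the correlation parameter by a simple grid search
# (example SNRs: source-relay 10, source-dest 1, relay-dest 5).
best = max(df_rate(10.0, 1.0, 5.0, r / 1000) for r in range(1000))
```

With independent inputs (rho = 0) the destination bound reduces to C(s13 + s23); correlating the inputs trades relay-decoding rate for coherent combining gain at the destination, which is why the maximization over rho matters.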
Decode-Forward

Codebook Generation: p(x1)p(x2)
Channel Usage: divided into B blocks of n channel uses
Encoding:
• In block b, the source encodes m(b) ∈ {1, ..., 2^{nR}}
• In block b, the relay encodes m(b-1) (a one-block encoding delay)
Regular Decoding:
• In block b, the relay decodes m(b)
• In block b, the destination decodes m(b-1)
Probability of Error:
• Goes to zero at the relay if R < I(X1; Y2 | X2)
• Goes to zero at the destination if R < I(X2; Y3) + I(X1; Y3 | X2) = I(X1X2; Y3)
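The block-Markov schedule above can be sketched as a simple simulation. This is a hypothetical illustration, not the talk's construction: message indices stand in for codewords, and decoding is assumed error-free, which the probability-of-error bounds guarantee for R below both constraints.

```python
def block_markov_schedule(messages):
    # Simulate B blocks of the decode-forward pipeline.
    # Returns (relay_log, dest_log): what the relay and the
    # destination decode in blocks 1, ..., B.
    relay_log, dest_log = [], []
    relay_state = 0  # m(0) = 0: dummy message known to all nodes
    for m in messages:
        # Block b: source transmits m(b); relay transmits m(b-1).
        dest_log.append(relay_state)  # destination decodes m(b-1)
        relay_log.append(m)           # relay decodes m(b)
        relay_state = m               # relay forwards m(b) next block
    return relay_log, dest_log

relay_log, dest_log = block_markov_schedule([7, 3, 9])
```

The relay's log trails the source by zero blocks and the destination's by one, so the destination recovers all B messages after one extra block; the rate loss B/(B+1) vanishes as B grows.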