
Broadcasting on Random Networks
Anuran Makur, Elchanan Mossel, and Yury Polyanskiy
EECS and Mathematics Departments, Massachusetts Institute of Technology
ISIT 2019, 10 July 2019


Motivation: Broadcasting on Trees
Intuition: In a tree T, layers grow exponentially with rate br(T) and information contracts with rate (1 − 2δ)². Whichever effect wins determines reconstruction.
If this intuition is correct, then broadcasting is impossible on finite-dimensional grids, because their layers grow only polynomially.
Can there be any graph with sub-exponentially growing layer sizes on which reconstruction is possible?
Surprise: Yes, and in fact even logarithmic growth suffices (a doubly-exponential reduction compared to trees!). But nice loops are needed to aggregate information.

Formal Model: Broadcasting on Bounded Indegree DAGs
Fix an infinite directed acyclic graph (DAG) with a single source node.
- X_{k,j} ∈ {0, 1} – node random variable at the j-th position in level k
- L_k – number of nodes at level k
- d – indegree of each node
- X_{0,0} ~ Bernoulli(1/2)
- Every edge is an independent BSC with crossover probability δ ∈ (0, 1/2).
- Nodes combine their inputs with d-ary Boolean processing functions.
This defines the joint distribution of {X_{k,j}}.
[Figure: levels 0 through k of the DAG, with L_k vertices at level k.]

Broadcasting Problem
Let X_k ≜ (X_{k,0}, ..., X_{k,L_k − 1}). Can we decode X_0 from X_k as k → ∞?
Binary Hypothesis Testing: Let X̂_ML(X_k) ∈ {0, 1} be the maximum likelihood (ML) decoder, with probability of error
P_ML^(k) ≜ P(X̂_ML(X_k) ≠ X_{0,0}) = (1/2)(1 − ‖P_{X_k | X_0 = 1} − P_{X_k | X_0 = 0}‖_TV).
By the data processing inequality, the TV distance contracts as k increases.
Broadcasting/reconstruction is possible iff
lim_{k→∞} P_ML^(k) < 1/2 ⇔ lim_{k→∞} ‖P_{X_k | X_0 = 1} − P_{X_k | X_0 = 0}‖_TV > 0.
For which δ, d, {L_k}, and Boolean processing functions is reconstruction possible?
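The hypothesis-testing identity above is easy to sanity-check numerically. Below is a minimal sketch on a hypothetical one-level instance (L_1 = 3 nodes, each observing X_0 through its own independent BSC with identity processing); the instance is an illustrative choice, not an example from the talk.

```python
from itertools import product

delta = 0.1   # illustrative crossover probability

def level_dist(x0):
    """Law of X_1 = (X_{1,0}, X_{1,1}, X_{1,2}) given X_0 = x0, when each
    node observes X_0 through its own independent BSC(delta)."""
    dist = {}
    for x in product([0, 1], repeat=3):
        p = 1.0
        for bit in x:
            p *= (1 - delta) if bit == x0 else delta
        dist[x] = p
    return dist

p0, p1 = level_dist(0), level_dist(1)

# Total variation distance between the two conditional laws.
tv = 0.5 * sum(abs(p1[x] - p0[x]) for x in p0)

# ML error under the uniform prior: integrate the smaller likelihood.
p_ml = 0.5 * sum(min(p0[x], p1[x]) for x in p0)

assert abs(p_ml - 0.5 * (1 - tv)) < 1e-12
# Here the ML decoder is a majority vote, so the error is
# 3 delta^2 (1 - delta) + delta^3.
assert abs(p_ml - (3 * delta**2 * (1 - delta) + delta**3)) < 1e-12
```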

Related Models in the Literature
- Communication Networks: A sender broadcasts a single bit through a network.
- Reliable Computation and Storage [vNe56, HW91, ES03, Ung07]: The broadcasting model is a noisy circuit that remembers a bit using perfect gates and faulty wires.
- Probabilistic Cellular Automata: Impossibility of broadcasting on the 2D regular grid parallels ergodicity of 1D probabilistic cellular automata.
- Ancestral Data Reconstruction: Reconstruction on trees ⇔ inferring a trait of an ancestor from the observed population.
- Ferromagnetic Ising Models [BRZ95, EKPS00]: Reconstruction impossible on a tree ⇔ the free boundary Gibbs state of the Ising model on the tree is extremal.

Outline
1. Introduction
2. Results on Random DAGs
   - Phase Transition for Majority Processing
   - Impossibility Results for Broadcasting
   - Phase Transition for NAND Processing
3. Deterministic Broadcasting DAGs
4. Conclusion

Random DAG Model
Fix {L_k} and d > 1. For each node X_{k,j}, randomly and independently select d parents from level k − 1 (with repetition). This defines the random DAG G.
- P_ML^(k)(G) – ML decoding probability of error for the DAG G
- σ_k ≜ (1/L_k) Σ_{j=0}^{L_k − 1} X_{k,j} – sufficient statistic of X_k for σ_0 = X_{0,0} in the absence of knowledge of G
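The random DAG model above is straightforward to simulate. The following Monte Carlo sketch uses hypothetical parameters (d = 3, δ = 0.05, constant layer size L = 64, a fixed seed) and is not code from the talk.

```python
import random

def bsc(bit, delta, rng):
    """Flip a bit with probability delta (one BSC use)."""
    return bit ^ (rng.random() < delta)

def simulate(delta, d, L, depth, x0, rng):
    """One run of broadcasting on a fresh random DAG: every node draws d
    parents uniformly with repetition from the previous level, observes them
    through independent BSCs, and applies majority (d odd here, so no ties).
    Returns sigma_k, the fraction of 1s at the final level."""
    level = [x0]                                  # level 0: the single source bit
    for _ in range(depth):
        level = [int(sum(bsc(rng.choice(level), delta, rng)
                         for _ in range(d)) > d / 2)
                 for _ in range(L)]
    return sum(level) / L

rng = random.Random(0)
runs = [simulate(0.05, 3, 64, 30, 1, rng) for _ in range(50)]
print(sum(runs) / len(runs))   # typically well above 1/2, remembering x0 = 1
```

With δ well below the d = 3 threshold of 1/6 discussed on the next slides, the empirical σ_k stays pinned near the fixed point selected by X_{0,0}.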

Random DAG with Majority Processing
Theorem (Phase Transition for d ≥ 3). Consider the random DAG model with d ≥ 3 and majority processing functions (ties broken randomly). Let
δ_maj ≜ 1/2 − 2^{d−2} / (⌈d/2⌉ (d choose ⌈d/2⌉)).
- Suppose δ ∈ (0, δ_maj). Then there exists C(δ, d) > 0 such that if L_k ≥ C(δ, d) log(k), reconstruction is possible:
  lim_{k→∞} E[P_ML^(k)(G)] ≤ lim sup_{k→∞} P(Ŝ_k ≠ X_{0,0}) < 1/2,
  where Ŝ_k ≜ 1{σ_k ≥ 1/2} is the majority decoder.
- Suppose δ ∈ (δ_maj, 1/2). Then there exists D(δ, d) > 1 such that if L_k = o(D(δ, d)^k), reconstruction is impossible:
  lim_{k→∞} P_ML^(k)(G) = 1/2, G-a.s.
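Reading the threshold as δ_maj = 1/2 − 2^{d−2} / (⌈d/2⌉ (d choose ⌈d/2⌉)), a quick sanity check in Python, consistent with the value δ_maj = 1/6 at d = 3 quoted on a later slide:

```python
from math import ceil, comb

def delta_maj(d):
    """delta_maj = 1/2 - 2^(d-2) / (ceil(d/2) * binom(d, ceil(d/2)))."""
    c = ceil(d / 2)
    return 0.5 - 2 ** (d - 2) / (c * comb(d, c))

assert abs(delta_maj(3) - 1 / 6) < 1e-12           # the d = 3 value
assert all(0 < delta_maj(d) < 0.5 for d in range(3, 30))   # valid BSC range
```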

Proof Intuition
Suppose d = 3, so δ_maj = 1/6.
- Conditioned on σ_{k−1} = σ ∈ [0, 1], level k has i.i.d. random bits
  X_{k,j} ~ majority(Bernoulli(σ ∗ δ), Bernoulli(σ ∗ δ), Bernoulli(σ ∗ δ)),
  where σ ∗ δ ≜ σ(1 − δ) + δ(1 − σ), and L_k σ_k = Σ_{j=0}^{L_k − 1} X_{k,j} ~ binomial(L_k, g_δ(σ)).
- Define the cubic polynomial
  g_δ(σ) ≜ E[σ_k | σ_{k−1} = σ] = P(X_{k,j} = 1 | σ_{k−1} = σ) = (σ ∗ δ)³ + 3(σ ∗ δ)²(1 − σ ∗ δ).
- Concentration: For large k, σ_k ≈ g_δ(σ_{k−1}) given σ_{k−1}.
- Fixed Point Analysis [figure: plots of g_δ on [0, 1] in the two cases]:
  - Case δ < δ_maj: g_δ has 3 fixed points, and σ_k "concentrates" at the fixed point near X_{0,0}.
  - Case δ > δ_maj: g_δ has 1 fixed point, and σ_k → 1/2 a.s. if L_k = ω(log(k)).
- The converse uses the key property: Lip(g_δ) ≤ 1 ⇔ g_δ has a unique fixed point.
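The fixed-point picture can be reproduced numerically. A small sketch of the deterministic dynamics σ_k = g_δ(σ_{k−1}) for d = 3, with illustrative values δ = 0.10 and δ = 0.30 on either side of δ_maj = 1/6:

```python
def g(sigma, delta):
    """d = 3 majority update: g_delta(sigma) = p^3 + 3 p^2 (1 - p),
    where p = sigma * delta = sigma(1 - delta) + delta(1 - sigma)."""
    p = sigma * (1 - delta) + delta * (1 - sigma)
    return p ** 3 + 3 * p ** 2 * (1 - p)

def iterate(sigma, delta, n=200):
    """Follow the deterministic dynamics sigma_k = g_delta(sigma_{k-1})."""
    for _ in range(n):
        sigma = g(sigma, delta)
    return sigma

# delta = 0.10 < delta_maj = 1/6: three fixed points; iterates started on
# opposite sides of 1/2 stay separated, so sigma_k remembers X_{0,0}.
assert iterate(1.0, 0.10) - iterate(0.0, 0.10) > 0.5

# delta = 0.30 > delta_maj: unique fixed point at 1/2; memory is lost.
assert abs(iterate(0.0, 0.30) - 0.5) < 1e-9
assert abs(iterate(1.0, 0.30) - 0.5) < 1e-9
```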

Random DAG with Majority Processing (recap)
Recall: with majority processing and δ_maj ≜ 1/2 − 2^{d−2} / (⌈d/2⌉ (d choose ⌈d/2⌉)), if δ ∈ (0, δ_maj) and L_k ≥ C(δ, d) log(k), then lim_{k→∞} E[P_ML^(k)(G)] < 1/2; if δ ∈ (δ_maj, 1/2) and L_k = o(D(δ, d)^k), then lim_{k→∞} P_ML^(k)(G) = 1/2, G-a.s.
Remarks:
- δ_maj = 1/6 for d = 3 appears in reliable computation [vNe56, HW91].
- δ_maj for odd d ≥ 3 is also relevant in reliable computation [ES03].
- δ_maj for d ≥ 3 is relevant in recursive reconstruction on trees [Mos98].
Questions:
- Is broadcasting possible with sub-logarithmic L_k?
- Is broadcasting possible when δ > δ_maj with other processing functions?
- What about d = 2?

Optimality of Logarithmic Layer Size Growth
Is broadcasting possible with sub-logarithmic L_k?
Proposition (Layer Size Impossibility Result). For any deterministic DAG, if
L_k ≤ log(k) / (d log(1/(2δ))),
then reconstruction is impossible for all processing functions: lim_{k→∞} P_ML^(k) = 1/2.
So no: broadcasting is impossible with sub-logarithmic L_k.

Partial Converse Results
Is broadcasting possible when δ > δ_maj with other processing functions?
Proposition (Single Vertex Reconstruction). Consider the random DAG model with d ≥ 3.
- If δ ∈ (0, δ_maj), L_k ≥ C(δ, d) log(k), and the processing functions are majority, then single vertex reconstruction is possible: lim sup_{k→∞} P(X_{k,0} ≠ X_{0,0}) < 1/2.
- If δ ∈ (δ_maj, 1/2), d is odd, lim_{k→∞} L_k = ∞, and inf_{n ≥ k} L_n = O(d^{2k}), then single vertex reconstruction is impossible for all processing functions (which may be graph dependent):
  lim_{k→∞} E[ ‖P_{X_{k,0} | G, X_{0,0} = 1} − P_{X_{k,0} | G, X_{0,0} = 0}‖_TV ] = 0.
Remark: The converse uses reliable computation results [HW91, ES03].

Partial Converse Results (continued)
Proposition (Information Percolation [ES99, PW17]). For any deterministic DAG, if
δ > 1/2 − 1/(2√d) > δ_maj and L_k = o(((1 − 2δ)² d)^{−k}),
then reconstruction is impossible for all processing functions: lim_{k→∞} P_ML^(k) = 1/2.

Random DAG with NAND Processing
What about d = 2?
Theorem (Phase Transition for d = 2). Consider the random DAG model with d = 2 and NAND processing functions. Let δ_nand ≜ (3 − √7)/4.
- Suppose δ ∈ (0, δ_nand). Then there exist C(δ) > 0 and t(δ) ∈ (0, 1) such that if L_k ≥ C(δ) log(k), reconstruction is possible:
  lim_{k→∞} E[P_ML^(k)(G)] ≤ lim sup_{k→∞} P(T̂_{2k} ≠ X_{0,0}) < 1/2,
  where T̂_k ≜ 1{σ_k ≥ t(δ)} is the thresholding decoder.
- Suppose δ ∈ (δ_nand, 1/2). Then there exist D(δ), E(δ) > 1 such that if L_k = o(D(δ)^k) and lim inf_{k→∞} L_k > E(δ), reconstruction is impossible:
  lim_{k→∞} P_ML^(k)(G) = 1/2, G-a.s.
Remark: δ_nand appears in reliable computation [EP98, Ung07].
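For intuition, here is a sketch of the analogous fixed-point dynamics for NAND. The update rule 1 − (σ ∗ δ)² is my reconstruction of the natural d = 2 NAND recursion, not taken verbatim from the slides; it exhibits the period-2 behavior that motivates decoding on even levels.

```python
from math import sqrt

delta_nand = (3 - sqrt(7)) / 4      # ~0.0886, the stated d = 2 threshold

def step(sigma, delta):
    """Heuristic NAND analogue of g_delta: a node outputs 0 iff both noisy
    parents are 1, so P(X_{k,j} = 1 | sigma_{k-1} = sigma) = 1 - p^2
    with p = sigma * delta = sigma(1 - delta) + delta(1 - sigma)."""
    p = sigma * (1 - delta) + delta * (1 - sigma)
    return 1 - p * p

def iterate(sigma, delta, n=500):   # n even: NAND dynamics have period 2
    for _ in range(n):
        sigma = step(sigma, delta)
    return sigma

# Below delta_nand, even-level iterates from sigma_0 = 0 vs 1 settle into
# separated two-cycles (X_{0,0} is remembered); above it, they merge at the
# unique fixed point.
assert abs(iterate(0.0, 0.05) - iterate(1.0, 0.05)) > 0.3   # 0.05 < delta_nand
assert abs(iterate(0.0, 0.20) - iterate(1.0, 0.20)) < 1e-6  # 0.20 > delta_nand
```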

Outline
1. Introduction
2. Results on Random DAGs
3. Deterministic Broadcasting DAGs
   - Existence of DAGs where Broadcasting is Possible
   - Construction of DAGs where Broadcasting is Possible
4. Conclusion

Existence of DAGs where Broadcasting is Possible
Probabilistic Method: Broadcasting works on the random DAG ⇒ there exists a DAG on which reconstruction is possible. For example:
Corollary (Existence of Deterministic Broadcasting DAGs). For every d ≥ 3, δ ∈ (0, δ_maj), and L_k ≥ C(δ, d) log(k), there exists a DAG with majority processing functions such that reconstruction is possible: lim_{k→∞} P_ML^(k) < 1/2.
Can we construct such DAGs for any δ ∈ (0, 1/2)?

Regular Bipartite Expander Graphs
Proposition (Existence of Expander Graphs [Pin73, SS96]). For all (large) d and all sufficiently large n, there exists a d-regular bipartite graph B_n = (U_n, V_n, E_n) with disjoint vertex sets U_n, V_n of cardinality |U_n| = |V_n| = n, edge multiset E_n, and the lossless expansion property:
∀ S ⊆ U_n, |S| = n / d^{6/5} ⇒ |Γ(S)| ≥ (1 − 2/d^{1/5}) d |S|,
where Γ(S) ≜ {v ∈ V_n : ∃ u ∈ S, (u, v) ∈ E_n} is the neighborhood of S.
Intuition: Expander graphs are sparse, but have high connectivity.
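The expansion property can be probed empirically. The sketch below checks the stated bound on one random test set in a random d-regular bipartite multigraph, built as a union of d random perfect matchings (random bipartite graphs are good expanders with high probability); n, d, and the seed are illustrative choices, not from the talk.

```python
import random

rng = random.Random(1)
n, d = 10_000, 100

# Union of d uniformly random perfect matchings: left vertex u is joined to
# perm[u] in each matching, so both sides are d-regular (as a multigraph).
matchings = [rng.sample(range(n), n) for _ in range(d)]

s_size = int(n / d ** 1.2)          # |S| = n / d^(6/5) = 39 here
S = rng.sample(range(n), s_size)
gamma = {perm[u] for perm in matchings for u in S}   # neighborhood of S

bound = (1 - 2 / d ** 0.2) * d * s_size   # (1 - 2/d^(1/5)) d |S|, about 795
assert len(gamma) >= bound
```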

Construction of DAGs where Broadcasting is Possible
Fix any δ ∈ (0, 1/2) and any sufficiently large odd d = d(δ).
Fix L_0 = 1 and L_k = N for k ∈ {1, ..., ⌊M⌋}, where N = N(δ) is sufficiently large and M = exp(N / (4 d^{12/5})); for all r ≥ 1, set L_k = 2^r N for M^{2^{r−1}} < k ≤ M^{2^r}, so that L_k = Θ(log(k)).
Construct a bounded degree deterministic "expander DAG":
- Each X_{1,j} has one edge from X_{0,0}.
- Case L_{k+1} = L_k: the edge multiset X_k → X_{k+1} is given by the expander B_{L_k}.
- Case L_{k+1} = 2 L_k: both edge multisets X_k → (X_{k+1,0}, ..., X_{k+1,L_k − 1}) and X_k → (X_{k+1,L_k}, ..., X_{k+1,L_{k+1} − 1}) are given by the expander B_{L_k}.
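The layer-size schedule above can be sketched in code; N and d below are illustrative stand-ins for the (large) constants the construction actually requires.

```python
from math import exp, log

N, d = 1000, 5
M = exp(N / (4 * d ** 2.4))         # M = exp(N / (4 d^(12/5))), about 192 here

def layer_size(k):
    """L_0 = 1; L_k = N for 1 <= k <= M; L_k = 2^r N for M^(2^(r-1)) < k <= M^(2^r)."""
    if k == 0:
        return 1
    r = 0
    while k > M ** (2 ** r):        # smallest r with k <= M^(2^r)
        r += 1
    return 2 ** r * N

# The schedule only ever keeps or doubles the layer size...
prev = layer_size(1)
for k in range(2, 200_001):
    cur = layer_size(k)
    assert cur in (prev, 2 * prev)
    prev = cur

# ...and is Theta(log k): for k > M, 2^(r-1) < log(k)/log(M) <= 2^r gives
# N log(k)/log(M) <= L_k <= 2 N log(k)/log(M).
for k in (1_000, 50_000, 200_000):
    assert N * log(k) / log(M) <= layer_size(k) <= 2 * N * log(k) / log(M)
```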

  63–70. Construction of DAGs where Broadcasting is Possible Illustration of the “expander DAG”, built up level by level: levels 1 through M all have width N, and each consecutive pair of these levels is connected by the expander B_N. At level M + 1 the width doubles to 2N via two copies of B_N; levels M + 1 through M^2 (width 2N) are then connected by the expander B_{2N}, the width doubles again after level M^2, and so on. A. Makur, E. Mossel, Y. Polyanskiy (MIT) Broadcasting on Random Networks 10 July 2019 22 / 26
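The edge rule between consecutive levels can be sketched as follows. A circulant d-regular bipartite multigraph is used here purely as a placeholder for the expander B_{L_k}; a real instantiation requires a graph with the stated expansion property:

```python
def bipartite_layer_edges(L, d):
    """d-regular bipartite multigraph on [L] x [L] as a union of d cyclic
    shifts -- a placeholder for the expander B_L in the construction."""
    return [(i, (i + s) % L) for s in range(d) for i in range(L)]

def edges_between_levels(L_k, L_k1, d):
    """Edge rule of the DAG: equal widths -> one copy of B_{L_k};
    doubled width -> two copies of B_{L_k}, the second shifted by L_k."""
    base = bipartite_layer_edges(L_k, d)
    if L_k1 == L_k:
        return base
    assert L_k1 == 2 * L_k
    return base + [(u, v + L_k) for (u, v) in base]
```

Either way, every node at level k + 1 has indegree exactly d, matching the bounded-indegree model.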

  71–77. Construction of DAGs where Broadcasting is Possible Theorem (Broadcasting in Expander DAG) For the “expander DAG” with majority processing, reconstruction is possible: lim sup_{k→∞} P(Ŝ_k ≠ X_{0,0}) < 1/2, where Ŝ_k = 𝟙{σ_k ≥ 1/2} is the majority decoder. Proof Sketch: Suppose the edges from level k to level k + 1 are given by the expander B_N. Let S_k ≜ {nodes equal to 1 at level k}. Call a node at level k + 1 “bad” if it is connected to ≥ 1 + d/4 nodes in S_k. Expansion Property: If |S_k| ≤ d^{−6/5} N, then there are ≤ 8 d^{−7/5} N “bad” nodes. Main Lemma: Given |S_k| ≤ d^{−6/5} N, we have |S_{k+1}| ≤ d^{−6/5} N with high probability, as “good” nodes have low probability of becoming 1. So if X_{0,0} = 0, then |S_k| is likely to remain small as k → ∞. A. Makur, E. Mossel, Y. Polyanskiy (MIT) Broadcasting on Random Networks 10 July 2019 23 / 26
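The dynamics in the proof sketch can be simulated with a small sketch (hypothetical parameters throughout): d random one-to-one wirings stand in for the expander B_N, each node takes the majority of its d parents, and a BSC(δ) flip is applied afterwards. Starting from X_{0,0} = 0, the fraction of ones σ_k stays well below 1/2, so the majority decoder succeeds:

```python
import random

def simulate_level(bits, d, delta, rng):
    """One broadcast step: each next-level node takes the majority of d
    parents (d random one-to-one wirings approximate the expander B_N),
    then flips independently with probability delta (BSC noise)."""
    n = len(bits)
    wirings = [rng.sample(range(n), n) for _ in range(d)]
    out = []
    for j in range(n):
        ones = sum(bits[w[j]] for w in wirings)
        b = 1 if 2 * ones > d else 0      # majority vote (d odd)
        if rng.random() < delta:
            b ^= 1                        # BSC(delta) flip
        out.append(b)
    return out

rng = random.Random(0)
n, d, delta = 200, 5, 0.05
# Level 1: noisy copies of the source bit X_00 = 0.
bits = [1 if rng.random() < delta else 0 for _ in range(n)]
for _ in range(50):
    bits = simulate_level(bits, d, delta, rng)
frac_ones = sum(bits) / n  # sigma_k; majority decoder outputs 0 iff < 1/2
```

This mirrors the Main Lemma: once the set of ones is small, majority voting keeps it small at the next level despite the noise.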

  78–79. Construction of DAGs where Broadcasting is Possible Theorem (Broadcasting in Expander DAG) For the “expander DAG” with majority processing, reconstruction is possible: lim sup_{k→∞} P(Ŝ_k ≠ X_{0,0}) < 1/2, where Ŝ_k = 𝟙{σ_k ≥ 1/2} is the majority decoder. Proposition (Computational Complexity of DAG Construction) For any δ ∈ (0, 1/2), the d-regular bipartite expander graphs for levels 0, ..., k of the “expander DAG” can be constructed in: deterministic quasi-polynomial time O(exp(Θ(log(k) log log(k)))) (Remark: enumerate all d-regular bipartite graphs and test expansion); randomized polylogarithmic time O(log(k) log log(k)) with positive success probability, which depends on δ but not k (Remark: generate uniform random d-regular bipartite graphs). A. Makur, E. Mossel, Y. Polyanskiy (MIT) Broadcasting on Random Networks 10 July 2019 23 / 26
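The randomized construction can be sketched as a union of d independent uniformly random perfect matchings (the standard permutation model for random d-regular bipartite multigraphs); the deterministic quasi-polynomial variant instead enumerates such graphs and tests the expansion property:

```python
import random
from collections import Counter

def random_d_regular_bipartite(n, d, rng):
    """d-regular bipartite multigraph on [n] x [n] as a union of d
    independent uniformly random perfect matchings (permutation model)."""
    edges = []
    for _ in range(d):
        perm = list(range(n))
        rng.shuffle(perm)
        edges += [(i, perm[i]) for i in range(n)]
    return edges

g = random_d_regular_bipartite(8, 3, random.Random(1))
left_deg = Counter(u for u, _ in g)
right_deg = Counter(v for _, v in g)
```

Each matching contributes degree one on both sides, so the result is d-regular by construction; only the expansion of the sampled graph is left to chance, which is why the success probability is positive but not 1.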
