
Multiple Source Multiple Destination Topology Inference using Network Coding - PowerPoint PPT Presentation



  1. Multiple Source Multiple Destination Topology Inference using Network Coding. Pegah Sattari, EECS, UC Irvine. Joint work with Athina Markopoulou (UCI) and Christina Fragouli (EPFL, Lausanne).

  2. Outline
  o Network Tomography
  o Goal, Main Ideas, and Contributions
  o Proposed Approach
  o Conclusion

  3. Network Tomography
  o In general:
  – Goal: obtain a detailed picture of a network from end-to-end probes.
  – Infer what? Topology, link-level parameters (loss, delay).
  o Our goal: "topology inference" with multiple sources, multiple receivers, and intermediate nodes using both network coding and multicast.

  4. Two bodies of related work
  Network Tomography:
  o Multicast trees using loss correlations
  o Unicast probes
  o Active probing; reliance on the number, order, delay variance, and loss of received probes; heuristic or statistical signal-processing approaches.
  o Most closely related: Rabbat, Coates, Nowak, "Multiple-Source Internet Tomography," IEEE JSAC 06.
  Inference with Network Coding:
  o Passive
  – Failure patterns [Ho et al., ISIT 05]
  – Topology inference [Sharma et al., ITA 07]
  – Bottleneck discovery/overlay management in p2p [Jafarisiavoshani et al., Sigcomm INM 07]
  – Subspace properties [Jafarisiavoshani et al., ITW 07]
  o Active
  – Loss tomography [Gjoka et al., Globecom 07]
  – Binary tree inference [Fragouli et al., Allerton 06]

  5. Main Idea 1: Network coding introduces topology-dependent correlation
  [Fragouli et al., 2006], [Sharma et al., 2007]
  o Network coding introduces topology-dependent correlation among the content of probe packets, which can be reverse-engineered to infer the topology.
  – Network coding can make the packets "stay together" and reveal the coding point.
  (Figure: probes x1 and x2 meet at a coding point and emerge as x1+x2.)
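The coding-point operation behind this idea can be sketched as follows; the coefficient-vector representation and the field size q = 5 are illustrative assumptions, not details from the deck:

```python
# Minimal sketch: probes are carried as coefficient vectors over a small
# field F_q (q = 5 is an arbitrary choice).
Q = 5

def coding_point(*packets):
    """Component-wise sum of the arriving coefficient vectors over F_q."""
    return tuple(sum(coeffs) % Q for coeffs in zip(*packets))

x1, x2 = (1, 0), (0, 1)  # probe vectors from the two sources

# If both probes reach the coding point, the output carries both source
# coefficients -- the packets "stayed together" and reveal the point.
print(coding_point(x1, x2))  # (1, 1)
# If only x1 arrives, the output shows the probes did not meet.
print(coding_point(x1))      # (1, 0)
```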

  6. Main Idea 2: General Graphs (DAGs)
  o An M-by-N DAG, with a given routing policy that has three properties:
  – A unique path from each source to each destination.
  – All 1-by-2 components: "inverted Y".
  – All 2-by-1 components: "Y".
  o Consistent with the routing in the Internet.
  o We work with the logical topology.
  (Figure: sources S1, S2, a branching point B, a joining point J, and receivers R1, R2; a second example graph is marked "Not a logical topology!")
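The first routing-policy property can be checked mechanically on a toy graph; the adjacency-map encoding and node names below are assumptions for illustration:

```python
# Sketch: verify the unique-path property on a small DAG given as an
# adjacency map (node -> list of downstream neighbors).
def count_paths(graph, u, v):
    """Number of distinct directed paths from u to v in a DAG."""
    if u == v:
        return 1
    return sum(count_paths(graph, w, v) for w in graph.get(u, []))

# The 1-by-2 "inverted Y" component: S branches at B toward R1 and R2.
g = {"S": ["B"], "B": ["R1", "R2"]}
print(all(count_paths(g, "S", r) == 1 for r in ["R1", "R2"]))  # True
```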

  7. Main Idea 2, Cont'd: 2-by-2 Components [Rabbat et al., 2006]
  o A traditional multiple-source, multiple-receiver tomography problem can be decomposed into multiple two-source, two-receiver subproblems.
  o There are four 2-by-2 types.
  (Figure: the four types, each with sources S1, S2, branching points B1, B2, joining points J1, J2, and receivers R1, R2. Type 1: shared (J1=J2=J, B1=B2=B); Types 2, 3, 4: non-shared.)

  8. Main Idea 2, Cont'd: Decomposition into 2-by-2s
  (Figure: a 2-by-3 topology with sources S1, S2 and receivers R1, R2, R3 is decomposed into its 2-by-2 components, one per receiver pair.)
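The decomposition amounts to enumerating every source pair with every receiver pair; a minimal sketch (node names are illustrative):

```python
# Sketch of the decomposition: each pair of sources combined with each
# pair of receivers yields one 2-by-2 subproblem.
from itertools import combinations

def two_by_two_subproblems(sources, receivers):
    return [(src_pair, rcv_pair)
            for src_pair in combinations(sources, 2)
            for rcv_pair in combinations(receivers, 2)]

# A 2-by-3 topology decomposes into three 2-by-2 subproblems.
subs = two_by_two_subproblems(["S1", "S2"], ["R1", "R2", "R3"])
print(len(subs))  # 3
```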

  9. Previous Work: 2-by-2's and Merging [Rabbat et al., 2006]
  (Figure: probes spaced by 2Δ with a random offset are sent from S1 and S2; the inferred 2-by-2 components, with their branching points B and joining points J, are merged into the full 2-by-N topology.)

  10. Weaknesses of Previous Work
  o In the 2-by-2 inference step, they can only distinguish between type 1 (shared) and types 2, 3, 4 (non-shared).
  o This results in inaccurate identification of the joining-point locations in the merging step.
  – I.e., bounds within a sequence of several consecutive logical links.

  11. Our Contributions
  o At the 2-by-2 inference step:
  – Network coding helps us distinguish among all four 2-by-2 types by looking at the content.
  o At the merging step:
  – Under the same assumption as in prior work (S1 is 1-by-N), we can localize each joining point, for each receiver, to a single logical link.
  – In addition, we can also design another merging algorithm, without such an assumption, using only the 2-by-2 information.

  12. Outline
  o Network Tomography
  o Goal, Main Ideas, and Contributions
  o Proposed Approach
  – Assumptions, Node Operations
  – Step 1: 2-by-2 Components (lossless/lossy)
  – Step 2: Merging Algorithms (two scenarios)
  – Simulation Results
  o Conclusion

  13. Assumptions
  o Delay:
  – fixed part (propagation) and random part (queuing); independent across links.
  o Packet loss:
  – both lossless and lossy cases.
  o Coarse synchronization (~5-10 ms) across nodes.
  – achievable via a handshaking scheme, e.g., NTP.
  o We design active probing schemes, i.e., the operation of sources, intermediate nodes, and receivers, which allow topology inference from the observations.
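The per-link delay assumption can be sketched as follows; the numbers and the uniform queuing distribution are illustrative assumptions (the slides only require a fixed part plus an independent random part):

```python
# Sketch of the assumed delay model: each link contributes a fixed
# propagation delay plus an independent random queuing delay.
import random

def link_delay(propagation, max_queuing, rng=random.Random(0)):
    return propagation + rng.uniform(0.0, max_queuing)

# A two-link path: (propagation, max queuing) per link, in ms.
path = [(5.0, 2.0), (3.0, 1.0)]
total = sum(link_delay(p, q) for p, q in path)
print(total >= 8.0)  # True: at least the sum of the fixed parts
```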

  14. Node Operations
  o Sources: synchronized.
  – later relaxed by a large time window W
  – in some algorithms, an artificial offset u
  – up to countMax experiments, spaced by time T.
  o Branching point: forwards the single received packet to all interested links downstream (the next hop for at least one source packet in the network code).
  o Joining point: adds and forwards packets within W (additions over Fq).
  (Figure: S1 sends x1=[1,0] and S2 sends x2=[0,1] through branching points B1, B2 and joining points J1, J2 toward R1, R2.)


  16. Node Operations
  o Sources: synchronized.
  – later relaxed by a large time window W
  – in some algorithms, an artificial offset u
  – up to countMax experiments, spaced by time T.
  o Branching point: forwards the single received packet to all interested links downstream (the next hop for at least one source packet in the network code).
  o Joining point: adds and forwards packets within W (additions over Fq).
  (Figure: S1 sends x1=[1,0] and S2 sends x2=[0,1]; R1 and R2 observe c11·x1+c12·x2 and c21·x1+c22·x2.)
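The two intermediate-node operations can be sketched directly; the coefficient-vector representation and the field size q = 7 are assumptions for illustration:

```python
# Sketch of the node operations: a probe is a coefficient vector over F_q.
Q = 7

def branching_point(packet, n_downstream):
    """Forward the single received packet on every interested downstream link."""
    return [packet] * n_downstream

def joining_point(packets_in_window):
    """Add (over F_q) all packets that arrived within the window W, then forward."""
    return tuple(sum(coeffs) % Q for coeffs in zip(*packets_in_window))

x1, x2 = (1, 0), (0, 1)
print(branching_point(x1, 2))   # [(1, 0), (1, 0)]
print(joining_point([x1, x2]))  # (1, 1), i.e. x1 + x2
```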

  17. Outline
  o Network Tomography
  o Goal, Main Ideas, and Contributions
  o Proposed Approach
  – Assumptions, Node Operations
  – Step 1: 2-by-2 Components (lossless/lossy)
  – Step 2: Merging Algorithms (two scenarios)
  – Simulation Results
  o Conclusion

  18. Inferring 2-by-2's, No Loss: Distinguishing among {1,4}, 2, or 3
  o One probe distinguishes among Types {1,4}, 2, or 3.
  (Figure: the contents observed at R1 and R2 under each type, e.g., x1+x2 at both receivers versus x1+2x2 at one of them.)
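A decision rule consistent with this slide can be sketched as follows; which asymmetric observation maps to type 2 versus type 3 is an assumption here, not a detail given in the deck:

```python
# Sketch: classify a 2-by-2 component from one lossless experiment.
# Observations are the coefficient vectors (c_for_x1, c_for_x2) seen at
# R1 and R2; the type-2 vs type-3 orientation below is assumed.
def classify_lossless(obs_r1, obs_r2):
    if obs_r1 == obs_r2:
        return {1, 4}  # identical contents: shared joining point or symmetric case
    # Asymmetric contents: one receiver saw a larger x2-coefficient.
    return {2} if obs_r1[1] < obs_r2[1] else {3}

print(classify_lossless((1, 1), (1, 1)))  # {1, 4}
print(classify_lossless((1, 1), (1, 2)))  # {2}
```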

  19. Inferring 2-by-2's, No Loss: Distinguishing between 1 and 4
  o Type 1: J1=J2=J.
  o Type 4: J1, J2 different.
  o Can be achieved by appropriately selecting u.

  20. Inferring 2-by-2's, No Loss: Selecting the appropriate offset
  o For 2-by-2's: u ∈ [W-D1, W-D2].
  o More generally: u ∈ [0, W].
  (Figure: timelines over the window [0, W] at J1 and J2 of a type-4 topology with delays D1, D2. For D1 > D2 the offset is chosen from [W-D1, W-D2]; for D1 < D2, from [W-D2, W-D1]; R1 and R2 then observe different contents, e.g., x1 versus x1+x2.)
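The offset selection above can be sketched in one line; the concrete delay values are illustrative, and the two cases on the slide ([W-D1, W-D2] for D1 > D2, [W-D2, W-D1] for D1 < D2) collapse into a single min/max expression:

```python
# Sketch: the interval of valid artificial offsets u that separates
# type 1 from type 4, given window W and delays D1, D2 to the two
# joining points.
def offset_interval(W, D1, D2):
    return (W - max(D1, D2), W - min(D1, D2))

print(offset_interval(10.0, 7.0, 4.0))  # (3.0, 6.0)
```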

  21. Inferring 2-by-2's, Lossy Case
  o Meetings are no longer guaranteed; observations are no longer predictable!
  o There are common observations across all 4 types.
  o Each experiment might result in different outcomes.
  (Figure: example observations at R1 and R2 under loss, e.g., x1, x1+x2, or no packet (-).)

  22. Inferring 2-by-2's, Lossy Case: All possible observations
  o There are three groups of observations: (i) at least one receiver does not receive any packet (-), (ii) R1 = R2, (iii) R1 ≠ R2.
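Sorting one experiment's outcome into these three groups is mechanical; a minimal sketch, using None for "no packet received" (an encoding assumption):

```python
# Sketch: assign a lossy experiment's observations to group (i), (ii),
# or (iii). Observations are coefficient vectors, or None for "-".
def observation_group(obs_r1, obs_r2):
    if obs_r1 is None or obs_r2 is None:
        return "i"   # at least one receiver got nothing
    return "ii" if obs_r1 == obs_r2 else "iii"

print(observation_group(None, (1, 1)))    # i
print(observation_group((1, 1), (1, 1)))  # ii
print(observation_group((1, 0), (1, 1)))  # iii
```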

  23. Inferring 2-by-2's, Lossy Case: Some observations of group (iii) help!
  o E.g., c12-c22 < 0 can only occur for type 2 or 4!
  o c12-c22 > 0 can only occur for type 3 or 4, ...
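The sign rule on this slide can be sketched directly; treating equal coefficients as uninformative on their own is an assumption, and in practice the outcomes of repeated experiments would be combined:

```python
# Sketch of the sign rule for group (iii): c12 and c22 are the observed
# coefficients of x2 at R1 and R2. Each observation only narrows the
# candidate set of 2-by-2 types.
def candidates_group_iii(c12, c22):
    if c12 - c22 < 0:
        return {2, 4}
    if c12 - c22 > 0:
        return {3, 4}
    return {1, 2, 3, 4}  # equal coefficients: uninformative alone (assumed)

print(candidates_group_iii(1, 2))  # {2, 4}
print(candidates_group_iii(2, 1))  # {3, 4}
```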
