∆elta: Differential Energy-Efficiency, Latency, and Timing Analysis for Real-Time Networks (RTN 2018)


  1. ∆elta: Differential Energy-Efficiency, Latency, and Timing Analysis for Real-Time Networks (RTN 2018)
    Stefan Reif, Timo Hönig, Wolfgang Schröder-Preikschat
    Department of Computer Science 4 (Distributed Systems and Operating Systems), Friedrich-Alexander-Universität Erlangen-Nürnberg
    Andreas Schmidt, Thorsten Herfet
    Telecommunications Lab, Saarland Informatics Campus, Saarbrücken

  2.–3. Motivation
    ▶ Modern CPN/IoT systems are complex
      ▶ Application → Image processing, ...
      ▶ Hardware → Multi-core, caches, ...
      ▶ Network → Internet, WiFi, ...
    ▶ Energy supply is limited
    ▶ Static timing and energy analysis techniques reach their limits
      ▶ Some aspects are unknown at compile time
      ▶ Random environmental influences
    → Need for on-line analysis

  4. The Predictably Reliable Real-Time Transport Protocol (PRRT)
    PRRT:
    ▶ Soft real-time network protocol
    ▶ Latency awareness
    ▶ Congestion control
    ▶ Adaptive hybrid error control
    ▶ Bindings for C, Python, and others
    ▶ Source code is available → http://prrt.larn.systems

  5.–6. Background: X-Lap
    Time measurement:
    1. Insert timestamping points into the real-time network system (application, transport protocol, operating system, channel)
    2. Execute experiment
    3. Collect and combine traces
    4. Visualize traces
    Timing information:
    ▶ Per-packet
    ▶ Cross-layer
    ▶ Multi-node
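
A minimal sketch of this workflow in Python, assuming a simple (packet_id, event, timestamp) trace format. This is not X-Lap's actual interface, which the slides do not show; the stamp() and send_packet() helpers and the CSV layout are hypothetical, and perf_counter_ns() stands in for whatever clock source the real tool uses:

    # Hypothetical per-packet, cross-layer timestamping and trace combination.
    import csv
    import time
    from collections import defaultdict

    TRACE = []  # (packet_id, event, timestamp) tuples collected on one node

    def stamp(packet_id, event):
        """Record a timestamping point for one packet."""
        TRACE.append((packet_id, event, time.perf_counter_ns()))

    def send_packet(packet_id, payload):
        # Example instrumentation of a (hypothetical) send path.
        stamp(packet_id, "PrrtSendStart")
        ...  # encode and submit the packet to the link layer
        stamp(packet_id, "PrrtSendEnd")

    def combine_traces(csv_files):
        """Merge per-node traces into one per-packet, cross-layer timeline."""
        timeline = defaultdict(dict)  # packet_id -> {event: timestamp}
        for path in csv_files:
            with open(path, newline="") as f:
                for pid, event, ts in csv.reader(f):
                    timeline[int(pid)][event] = int(ts)
        return timeline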

  7.–9. Approach
    Approach:
    ▶ Measurement-based protocol evaluation
    ▶ Derive meaningful information
    ▶ Support protocol design
    (Cycle: Implement → Evaluate → Analyze)
    Challenges:
    □ Identify critical code parts → Latency, Jitter, Energy
    □ Analyze code modifications
    □ Automate feedback

  10.–11. Outline
    Introduction
    Timing Analysis and Automation
    Energy Efficiency Analysis
    Conclusion

  12.–14. Control-Flow Reconstruction
    Goal:
    ▶ Reconstruct protocol information from timestamps
    Control-flow reconstruction:
    1. Compute happens-before relation of events
    2. Compute happens-directly-before relation of events
    Edges represent:
    ▶ Code segments ⟨E_i, E_j⟩
    ▶ Thread communication
    ▶ False positives
    (Figure: example event graph with events E_1 to E_5)
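
A minimal sketch of the two relations, assuming one {event: timestamp} dictionary per packet (e.g. as produced by a trace-combination step); the function names are illustrative, not part of the authors' tooling:

    def happens_before(events):
        """All ordered event pairs (a, b) with timestamp(a) < timestamp(b)."""
        return {(a, b)
                for a, ta in events.items()
                for b, tb in events.items()
                if ta < tb}

    def happens_directly_before(events):
        """Transitive reduction of happens-before: keep (a, b) only if no event
        lies strictly between them. The resulting edges are candidate code
        segments <E_i, E_j>; edges caused by thread communication or false
        positives are handled in the later analysis steps."""
        hb = happens_before(events)
        return {(a, b) for (a, b) in hb
                if not any((a, c) in hb and (c, b) in hb for c in events)}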

  15.–16. Latency Criticality Analysis
    Goals:
    ▶ Filter out false-positive segments
    ▶ Identify code segments worth optimizing
    Detect relevant code segments:
    1. Compute the correlation between the segment latency and the E2E latency
    2. Ignore segments with a negative correlation
    (Figure: latency criticality, 0.0 to 0.6, per code segment, e.g. ⟨PrrtSendStart, PrrtSubmitPackage⟩, ⟨LinkTransmitStart, LinkTransmitEnd⟩, ⟨LinkReceive, DecodeStart⟩, ⟨DecodeStart, DecodeEnd⟩, ⟨CopyOutputEnd, PrrtDeliver⟩, ...)
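
A sketch of these two steps, assuming per-packet segment latencies and end-to-end latencies are available as arrays; the slides only say "correlation", so the use of Pearson's r here is an assumption:

    from scipy.stats import pearsonr

    def latency_criticality(segment_latencies, e2e_latencies):
        """Correlate each segment's latency with the end-to-end latency and
        drop segments with a negative correlation (not worth optimizing)."""
        criticality = {}
        for segment, latencies in segment_latencies.items():
            r, _ = pearsonr(latencies, e2e_latencies)
            if r >= 0.0:  # step 2: ignore negatively correlated segments
                criticality[segment] = r
        # highest criticality first, as in the bar chart on the slide
        return dict(sorted(criticality.items(), key=lambda kv: kv[1], reverse=True))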

  17.–19. Timing Reproducibility Analysis
    Goals:
    ▶ Identify code segments with unpredictable timing
    ▶ “What happens if ...” analysis
    Timing reproducibility analysis:
    1. Evaluate the code twice
    2. Apply the k-sample Anderson-Darling test on the traces
    3. For each code segment, the test decides: “similar” or “different”
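
A sketch of the per-segment decision, using SciPy's k-sample Anderson-Darling test (scipy.stats.anderson_ksamp); the 0.05 significance threshold is an assumption, the slides do not name one:

    from scipy.stats import anderson_ksamp

    def compare_runs(segments_run1, segments_run2, alpha=0.05):
        """Decide 'similar' or 'different' per code segment between two runs,
        given a {segment: array of latencies} mapping for each run."""
        verdict = {}
        for segment in segments_run1.keys() & segments_run2.keys():
            result = anderson_ksamp([segments_run1[segment], segments_run2[segment]])
            # SciPy caps the returned significance level to the range [0.001, 0.25]
            verdict[segment] = "similar" if result.significance_level > alpha else "different"
        return verdict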

  20.–22. Code Modification Analysis
    Goals:
    ▶ Evaluate the influence of code changes
    ▶ Interferences are complex
    Modification impact analysis:
    1. Evaluate the original and the modified versions of the protocol
    2. Apply the k-sample Anderson-Darling test on the traces
    3. For each code segment, the test decides: “similar” or “different”
    (Figure: original code calling foo(0); vs. modified code calling foo(1);)

  23. Code Modification Analysis
    <PrrtSendPacketStart,PrrtSendPacketEnd> has not changed
    <CopyOutputStart,CopyOutputEnd> has not changed
    <SendFeedbackStart,SendFeedbackEnd> has not changed
    ...
    <PrrtSubmitPackage,PrrtSendEnd> has changed: 2.85±0.91 µs → 4.94±0.58 µs
    <LinkReceive,DecodeStart> has changed: 2.36±3.80 µs → 3.56±4.88 µs
    ...
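
A sketch of how a report like the one above could be produced, reusing the compare_runs() helper from the reproducibility sketch; the mean±std formatting follows the slide, and the assumption is that latencies are already given in microseconds:

    import numpy as np

    def modification_report(segments_original, segments_modified):
        """Print a per-segment verdict, with mean±std latencies for changed segments."""
        for segment, verdict in compare_runs(segments_original, segments_modified).items():
            if verdict == "similar":
                print(f"{segment} has not changed")
            else:
                a = np.asarray(segments_original[segment])
                b = np.asarray(segments_modified[segment])
                print(f"{segment} has changed: "
                      f"{a.mean():.2f}±{a.std():.2f} µs → {b.mean():.2f}±{b.std():.2f} µs")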

  24. Outline
    Introduction
    Timing Analysis and Automation
    Energy Efficiency Analysis
    Conclusion

  25.–26. Energy Efficiency Analysis
    Goals:
    ▶ Identify energy-efficient hardware configurations
    Energy efficiency evaluation:
    1. Evaluate slow and fast configurations → Measure execution times and energy demand
    2. Compute ET² metric per packet
    (Figure: per-package ET² (nJ·s²), 0.00 to 0.30, vs. frequency 1 to 3 GHz)
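
A sketch of the per-packet ET² computation (energy-delay-squared product, ET² = E · T², lower is better); the assumption is that energy is measured in nJ and execution time in seconds, which yields the nJ·s² unit shown in the plot:

    import numpy as np

    def et2_per_packet(energy_nj, exec_time_s):
        """Per-packet ET² = E * T² for one hardware configuration (e.g. one CPU frequency)."""
        return np.asarray(energy_nj) * np.asarray(exec_time_s) ** 2

    # Configurations (e.g. 1, 2, and 3 GHz) can then be compared by their
    # per-packet ET² distributions, as in the plot on the slide.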
