
CloudDet: Interactive Visual Analysis of Anomalous Performances in Cloud Computing Systems - PowerPoint PPT Presentation



Slide 1: Title
CloudDet: Interactive Visual Analysis of Anomalous Performances in Cloud Computing Systems
Ke Xu, Yun Wang, Leni Yang, Yifang Wang, Bo Qiao, Si Qin, Yong Xu, Haidong Zhang, Huamin Qu
IEEE Transactions on Visualization and Computer Graphics, 2019
Presented by Amirhossein Abbasi, Nov 2019

Slide 2: Motivation
• Monitoring nodes instead of monitoring applications
• Too many false positives
• The boundary between normal and abnormal behavior is hard to define

Slide 3: Visualization Challenges
• Scale: trade-off between system scalability and level of detail (LoD)
• Multi-dimensionality: temporal patterns, relations between metrics
• Visualization of anomalies: intuitiveness, interaction

Slide 4: Research Contributions
• Detection system, visualization, and evaluation

Slide 5: System Overview

Slide 6: Algorithm Flow

Slide 7: Mathematics!

Slide 8: What is abnormal and what is not? How to detect?

Slide 9: Algorithm Flow
• Only utilizes the most recent data
• Anomalies have patterns
(A minimal illustrative detector follows after this slide group.)

Slide 10: Design Tasks
• T1: Overview of anomalies for data query
• T2: Ranking suspicious nodes dynamically
• T3: Browse data flexibly
• T4: Facilitate anomaly detection
• T5: Similarities of nodes

Slides 11-12: Encoding Protocol
• Global categorical colors: performance metrics (CPU frequency, memory usage, ...)
• Linear color scheme: anomaly score
• Diverging color scheme: difference of performance metrics to the average
(See the color-mapping sketch below.)

Slide 13: Spatial and Temporal Views

Slide 14: Horizon Chart
(See the horizon-chart sketch below.)

Slide 15: Spatial and Temporal Views
• Interactions: brushing, collapsing, stretching

Slide 16: Rank and Performance View
• Supports T1 (overview), T2 (ranking), T3 (browse data), T4 (facilitate detection), T5 (comparison)
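The slides only outline the detection algorithm (slide 9: score against the most recent data; anomalies deviate from the learned pattern). As a rough, hypothetical illustration of those two ideas only, here is a minimal rolling z-score detector in Python. This is not CloudDet's actual algorithm, and the window size is arbitrary:

```python
import numpy as np

def anomaly_scores(series: np.ndarray, window: int = 60) -> np.ndarray:
    """Score each point by its deviation from the recent past.

    Illustrative only: CloudDet's detector is more sophisticated;
    this rolling z-score just captures the "recent data" idea.
    """
    scores = np.zeros_like(series, dtype=float)
    for t in range(window, len(series)):
        recent = series[t - window:t]           # only the most recent data
        mu, sigma = recent.mean(), recent.std()
        scores[t] = abs(series[t] - mu) / (sigma + 1e-9)
    return scores

# Example: a synthetic CPU-usage trace with an injected burst
rng = np.random.default_rng(42)
cpu = rng.normal(0.4, 0.05, 1000)
cpu[700:710] += 0.5                             # anomalous interval
print(anomaly_scores(cpu).argmax())             # peaks around index 700
```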
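For the encoding protocol (slides 11-12), a minimal sketch of the two quantitative schemes, assuming matplotlib. The slides name only the scheme types, so the specific palettes ("Reds", "RdBu_r") are my assumptions, not the paper's choices:

```python
import numpy as np
import matplotlib.pyplot as plt

score_cmap = plt.get_cmap("Reds")    # linear: 0 = normal, 1 = most anomalous
diff_cmap = plt.get_cmap("RdBu_r")   # diverging: below vs. above the average

scores = np.array([0.1, 0.5, 0.9])
diffs = np.array([-0.8, 0.0, 0.8])   # metric minus fleet average, scaled to [-1, 1]

print([score_cmap(s) for s in scores])           # light to dark red
print([diff_cmap((d + 1) / 2) for d in diffs])   # blue, near-white, red
```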
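A horizon chart (slide 14) folds a tall time series into a few stacked, increasingly saturated bands, so many node rows fit on screen at once. A minimal matplotlib sketch of the banding idea, not the paper's implementation; it plots magnitudes only and sidesteps the diverging-color question raised later in the critique:

```python
import numpy as np
import matplotlib.pyplot as plt

def horizon_chart(ax, t, y, n_bands=3, cmap=None):
    """Fold |y| into n_bands overlaid layers of equal height."""
    cmap = cmap or plt.get_cmap("Blues")
    band = np.abs(y).max() / n_bands
    for i in range(n_bands):
        # slice of |y| that falls inside band i, clipped to [0, band]
        layer = np.clip(np.abs(y) - i * band, 0, band)
        ax.fill_between(t, 0, layer, color=cmap(0.4 + 0.2 * i), linewidth=0)
    ax.set_ylim(0, band)
    ax.set_yticks([])

t = np.linspace(0, 10, 500)
fig, ax = plt.subplots(figsize=(8, 1))
horizon_chart(ax, t, np.sin(t) * np.exp(t / 10))
plt.show()
```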

Slide 17: Line Mode vs. PCA Mode
• Line mode: each line shows one metric; more conventional; shows per-metric trends; suffers from visual clutter and consumes space
• PCA mode: projects the multivariate data to a one-dimensional time series; shows the major trend; normalizes data to [-1, 1]; suited to analyzing overall changes
(A sketch of the PCA-mode projection follows after this slide group.)

Slide 18: Alternative Designs

Slide 19: Rank and Performance View
• Shows the measurement unit of each metric
• Supports T1 (overview), T2 (ranking), T3 (browse data), T4 (facilitate detection), T5 (comparison)

Slide 20: Scaling in the Time Domain
• Non-linear time scaling (see the time-warping sketch below)

Slide 21: Official Video (1:05 min)

Slide 22: The Cluster View
• Dimensionality reduction
• White contour: most probably anomalous; gray contour: normal
(See the cluster-view sketch below.)

Slide 23: The Cluster View
• Another perspective for anomaly diagnosis

Slides 24-26: What-Why-How Summary
• What: multivariate time-series quantitative performance data from compute nodes
• Why: anomaly ranking, anomaly inspection, anomaly clustering
• How: colors and brightness; horizon and line charts; special glyphs; spatial positions of nodes and charts; interactivity: scrolling, brushing, querying, and setting parameters
(Slides 25 and 26 repeat this summary.)

Slide 27: Scalability
• Very scalable: scales linearly with the size of the time-series input data

Slide 28: Evaluation

Slide 29: Quantitative Evaluation

Slide 30: Case Studies
• Case Study 1: Bitbrains datacenter traces (500 VMs, one month)
• Case Study 2: live cloud system data (1,000,000 nodes over two weeks: 100 data centers with 20 data clusters of 500 nodes each)

Slides 31-32: User Feedback
• Automated anomaly detection: trust in the algorithm, with some critique
• System: useful, user-friendly, and consistent, but too comprehensive and overwhelming; a tutorial is needed
• Visualization and interaction: helpful; a new perspective on the overall trend; clear comparison; participants confessed they had used chaotic line charts before
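Slide 17's PCA mode projects the multivariate metrics onto a single "major trend" series normalized to [-1, 1]. A minimal NumPy sketch of that projection, assuming the first principal component is used; the paper may differ in detail:

```python
import numpy as np

def pca_mode(metrics: np.ndarray) -> np.ndarray:
    """Project a (time x metric) matrix onto its first principal
    component and rescale to [-1, 1], per the slide's description."""
    X = metrics - metrics.mean(axis=0)          # center each metric
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    major = X @ vt[0]                           # 1-D "major trend" series
    return major / np.abs(major).max()          # normalize to [-1, 1]

# 200 time steps x 4 metrics (e.g., CPU, memory, disk, network), synthetic
rng = np.random.default_rng(0)
data = rng.normal(size=(200, 4)).cumsum(axis=0)
trend = pca_mode(data)
print(trend.min(), trend.max())                 # both within [-1, 1]
```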
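Slide 20's non-linear time scaling gives anomalous intervals more horizontal space. The slide does not give the scaling function, so this sketch simply makes each time step's display width proportional to a base width plus its anomaly score; purely illustrative:

```python
import numpy as np

def warp_time(scores: np.ndarray, base: float = 1.0) -> np.ndarray:
    """Return normalized x-positions in [0, 1] where high-score
    steps occupy proportionally more width (assumed scaling rule)."""
    width = base + scores                # per-step display width
    x = np.concatenate([[0.0], np.cumsum(width)])
    return x / x[-1]

scores = np.zeros(100)
scores[40:50] = 5.0                      # anomalous interval
x = warp_time(scores)
print(x[50] - x[40], x[90] - x[80])      # the anomaly gets more width
```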
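For the cluster view (slides 22-23), a hypothetical sketch of the general recipe: reduce per-node feature vectors to 2-D, then draw density contours so that sparse outlying regions read as probable anomalies. The feature construction, the choice of PCA, and the contour styling are all assumptions; the slides specify only "dimensionality reduction" with white contours for likely anomalies and gray for normal:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import gaussian_kde
from sklearn.decomposition import PCA

# Hypothetical per-node feature vectors (e.g., metric summary stats)
rng = np.random.default_rng(1)
normal = rng.normal(0, 1, size=(300, 8))
outliers = rng.normal(4, 1, size=(20, 8))
features = np.vstack([normal, outliers])

xy = PCA(n_components=2).fit_transform(features)   # 2-D layout of nodes

# Density contours: dense regions read as normal, sparse ones as anomalous
kde = gaussian_kde(xy.T)
gx, gy = np.mgrid[xy[:, 0].min():xy[:, 0].max():100j,
                  xy[:, 1].min():xy[:, 1].max():100j]
density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

plt.contour(gx, gy, density, levels=6, colors="gray")
plt.scatter(xy[:, 0], xy[:, 1], s=8)
plt.show()
```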

Slide 33: Positive
• Alternative designs
• Super-scalable
• Thorough evaluation
• Very accurate
• Special glyphs

Slide 34: Negative
• Better to use non-diverging colors for the horizon charts
• Minor occlusion in the clustering
• Could make better use of the global categorical colors in the horizon chart
• Weak way of assigning ranks to performance
• Empty clusters in the spatial overview
• Limitation: only recent data and a single metric are considered
• Limitation: no discussion of why those particular performance metrics were chosen for anomaly detection

Slide 35: Questions?
