Visualization of Performance Anomalies with Kieker
Bachelor’s Thesis
Sören Henning
September 8, 2016
Outline
1. Introduction
2. Foundations
3. Approach
4. Evaluation
5. Conclusion and Future Work
Motivation
◮ Performance is a significant quality characteristic
  ◮ e.g., Amazon: 100 ms delay → 1% decrease in sales (Huang 2011)
  ◮ e.g., Google: 500 ms delay → 20% drop in traffic (Huang 2011)
◮ Detect performance issues in order to react to them
  ◮ As soon as possible
  ◮ Use monitoring (e.g., measure execution times)
  ◮ Investigate these measurements for anomalies
◮ Visualization can support the interpretation of anomalies
ΘPAD’s Anomaly Detection Approach: ΘPAD and ΘPADx
◮ Provide anomaly detection
◮ Part of Kieker
◮ Algorithms available only via R
◮ Problematic anomaly score
◮ No visualization
◮ More on this later
Thesis’ Goal
Development of an approach to detect performance anomalies with Kieker and visualize them
Goals
◮ G1: Migrate ΘPAD to a TeeTime configuration
◮ G2: Implement multiple forecast algorithms
◮ G3: Provide a visualization of measured time series and detected anomalies
◮ G4: Evaluate the implementation
  ◮ G4.1: Feasibility evaluation
  ◮ G4.2: Scalability evaluation
Foundations
◮ Performance Metrics (Koziolek 2008)
  ◮ Time behavior and resource efficiency
◮ Time Series (Mitsa 2010)
  ◮ Sequence of measurements at regular temporal intervals
◮ Anomaly Detection (Chandola, Banerjee, and Kumar 2009)
  ◮ Anomaly: abnormal data patterns
  ◮ Detection: compare measured values with a reference model
ΘPAD’s Anomaly Detection Approach: Time Series Normalization

[Figure: raw measurements are aggregated per equidistant time slot (here by the arithmetic mean) into a normalized time series. Based on Bielefeld (2012) and Frotscher (2013).]
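To make the aggregation step concrete, the following minimal Java sketch (an illustration, not the thesis’ actual implementation) groups raw measurements into equidistant time slots and aggregates each slot by its arithmetic mean, as in the figure:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

/** Minimal sketch of ΘPAD-style time series normalization. */
public class TimeSeriesNormalizer {

    /** A raw measurement: a value observed at some point in time. */
    record Measurement(long timestampMs, double value) {}

    /**
     * Groups raw measurements into equidistant slots of the given length
     * and aggregates each slot by its arithmetic mean, yielding a
     * normalized time series (one value per slot). Note: slots without
     * any measurement are simply absent in this sketch; a full
     * implementation must handle such gaps.
     */
    static List<Double> normalize(List<Measurement> raw, long slotLengthMs) {
        Map<Long, List<Double>> slots = new TreeMap<>(); // slot index -> values
        for (Measurement m : raw) {
            slots.computeIfAbsent(m.timestampMs() / slotLengthMs,
                                  slot -> new ArrayList<>()).add(m.value());
        }
        List<Double> series = new ArrayList<>();
        for (List<Double> values : slots.values()) {
            series.add(values.stream()
                             .mapToDouble(Double::doubleValue)
                             .average().orElse(0.0));
        }
        return series;
    }
}
```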
ΘPAD’s Anomaly Detection Approach: Time Series Forecasting

[Figure: a sliding window over the normalized time series is used to forecast the next value, here by the window mean. Based on Bielefeld (2012) and Frotscher (2013).]
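A mean-based forecast over a sliding window, as depicted here and used later by the MeanForecaster in the evaluation, could look roughly like this (a sketch, not the thesis’ code):

```java
import java.util.List;

/** Minimal sketch of a sliding-window mean forecast. */
public class MeanForecaster {

    /**
     * Predicts the next value of the normalized time series as the
     * arithmetic mean of the last windowSize values.
     */
    static double forecast(List<Double> series, int windowSize) {
        int from = Math.max(0, series.size() - windowSize);
        return series.subList(from, series.size()).stream()
                .mapToDouble(Double::doubleValue)
                .average()
                .orElseThrow(() -> new IllegalArgumentException("empty series"));
    }
}
```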
ΘPAD’s Anomaly Detection Approach: Anomaly Score Calculation

[Figure: the measured value (6) is compared with the forecast value (3); the resulting anomaly score A(a, p) = 0.33 is checked against the threshold t = 0.2. Since 0.33 ≥ 0.2, the measurement is flagged as an anomaly.]
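The slide’s numbers are consistent with a normalized difference between measurement a and forecast p, A(a, p) = |a − p| / (|a| + |p|), which for a = 6 and p = 3 gives 1/3 ≈ 0.33. Assuming this distance measure (following Bielefeld 2012), a minimal Java sketch of the score calculation and threshold decision:

```java
/** Minimal sketch of the anomaly score calculation and threshold decision. */
public class AnomalyScore {

    /**
     * Normalized difference between measurement a and forecast p,
     * in [0, 1] for non-negative inputs (0 = perfect forecast).
     */
    static double score(double a, double p) {
        if (a == 0 && p == 0) {
            return 0.0;
        }
        return Math.abs(a - p) / (Math.abs(a) + Math.abs(p));
    }

    public static void main(String[] args) {
        double a = 6.0, p = 3.0, threshold = 0.2; // values from the slide
        double s = score(a, p);                   // |6 - 3| / (6 + 3) = 0.33
        System.out.printf("score=%.2f anomaly=%b%n", s, s >= threshold);
    }
}
```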
Technologies
◮ Microservices architectural pattern (Wolff 2015)
◮ Kieker monitoring framework (van Hoorn et al. 2009)
◮ TeeTime pipe-and-filter framework (Wulf, Ehmke, and Hasselbring 2014)
◮ Further technologies (see the following slides)
Graphical Overview of our Approach

[Figure: the monitored application’s measurements flow from Monitoring to Anomaly Detection, which passes time series on to Anomaly Visualization.]
Architecture of our Approach

[Figure: server-side components (Analysis, R-based Forecast, Database, Visualization Provider) and a Visualization Client.]
Architecture of our Implementation

[Figure: the concrete client and server components of the implementation; the components are containerized.]
Performance Anomaly Detection TeeTime Configuration

[Figure: a pipeline of TCP Reader → Record Filter → Flow Record Reconstructor → Record Distributor, which fans out to one Record Converter and Anomaly Detector Filter per analysis branch.]

◮ One analysis branch per Anomaly Detector Filter
◮ The Record Distributor distributes records by:
  ◮ Operation signature
  ◮ Class signature
  ◮ Host name
  ◮ ...

A minimal TeeTime sketch of this kind of wiring follows below.
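To make G1’s “TeeTime configuration” concrete, here is a minimal, self-contained sketch of how stages are wired in TeeTime (API as in TeeTime 3.x; the stage names and logic are simplified stand-ins for the slide’s pipeline, not the thesis’ actual classes):

```java
import teetime.framework.AbstractConsumerStage;
import teetime.framework.Configuration;
import teetime.framework.Execution;
import teetime.framework.OutputPort;
import teetime.stage.InitialElementProducer;

/** Pass-through stage standing in for the slide's "Record Filter". */
class RecordFilter extends AbstractConsumerStage<Long> {
    private final OutputPort<Long> outputPort = this.createOutputPort();

    @Override
    protected void execute(final Long responseTimeNs) {
        this.outputPort.send(responseTimeNs); // forwards everything in this sketch
    }

    public OutputPort<Long> getOutputPort() {
        return this.outputPort;
    }
}

/** Terminal stage standing in for one "Anomaly Detector" branch. */
class AnomalyDetector extends AbstractConsumerStage<Long> {
    @Override
    protected void execute(final Long responseTimeNs) {
        System.out.println("analyzing " + responseTimeNs);
    }
}

/** Minimal TeeTime configuration wiring producer -> filter -> detector. */
class AnomalyDetectionConfig extends Configuration {
    AnomalyDetectionConfig() {
        final InitialElementProducer<Long> reader =
            new InitialElementProducer<>(120L, 130L, 900L); // stands in for the TCP Reader
        final RecordFilter filter = new RecordFilter();
        final AnomalyDetector detector = new AnomalyDetector();
        connectPorts(reader.getOutputPort(), filter.getInputPort());
        connectPorts(filter.getOutputPort(), detector.getInputPort());
    }
}

public class Main {
    public static void main(String[] args) {
        new Execution<>(new AnomalyDetectionConfig()).executeBlocking();
    }
}
```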
Time Series Analysis and Anomaly Detection TeeTime Configuration

[Figure: the composite Anomaly Detection Stage chains, among others, a Time Series Normalizer, an in-memory Sliding Window storage, a Forecaster, an Anomaly Score Calculator, and Threshold Filters; a Database Adapter Interface loads previously stored data on startup.]
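The in-memory sliding window named in the figure could be as simple as a bounded deque that evicts the oldest value when full; a sketch under that assumption (not the thesis’ actual “Sliding Window In Memory Storager”):

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Minimal sketch of a bounded in-memory sliding window over a time series. */
public class SlidingWindow {
    private final Deque<Double> window = new ArrayDeque<>();
    private final int capacity;

    SlidingWindow(int capacity) {
        this.capacity = capacity;
    }

    /** Appends a value; the oldest value is evicted once the window is full. */
    void append(double value) {
        if (window.size() == capacity) {
            window.removeFirst();
        }
        window.addLast(value);
    }

    /** Snapshot of the current window contents, oldest first. */
    double[] values() {
        return window.stream().mapToDouble(Double::doubleValue).toArray();
    }
}
```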
Demo of Visualization
Architecture of Visualization

[Figure: the client (web browser) requests new data from the server every x seconds; on the client, Anomaliz.js covers data updating, value management, and UI handling, while CanvasPlot provides chart plotting and controlling, interactive zooming, and automatic scrolling. The server reads updates from the database and sends new data to the client.]

◮ Usage of Arne Johanson’s CanvasPlot (Johanson 2016)

A sketch of the polling pattern follows below.
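The actual data updating runs as JavaScript in the browser; since the code examples in this deck’s context are Java, here is the same “request every x sec” pattern sketched with a ScheduledExecutorService (the endpoint URL and interval are hypothetical):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

/** Sketch of the periodic "request every x sec" data updating pattern. */
public class DataUpdater {
    public static void main(String[] args) {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/timeseries")) // hypothetical endpoint
                .build();
        ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(() ->
            client.sendAsync(request, HttpResponse.BodyHandlers.ofString())
                  .thenAccept(response -> {
                      // In the real client, new data points would be handed
                      // to the chart here (CanvasPlot update).
                      System.out.println("received " + response.body().length() + " bytes");
                  }),
            0, 5, TimeUnit.SECONDS); // poll every 5 seconds (the "x" from the slide)
    }
}
```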
Feasibility Evaluation Scenarios

[Figure: four generated workload scenarios, each plotted as response time in ms over 40,000 ms.]
Feasibility Evaluation Scenario: Seasonal with Anomaly

[Figure: top, the measured response time in ms over 40,000 ms; bottom, the resulting anomaly scores over time for the six implemented forecasters: ARIMAForecaster, LinearWeightedForecaster, MeanForecaster, ExponentialWeightedForecaster, LogarithmicWeightedForecaster, and RegressionForecaster.]
Scalability Evaluation Configuration
◮ Measure the time for record processing in the analysis
◮ Evaluate: Is the execution time per record at most the interval between incoming measurements, i.e., can the analysis keep up?

A sketch of this criterion follows below.
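The criterion can be illustrated with a few lines of Java (the interval value is hypothetical; the thesis’ actual measurement setup is not shown here):

```java
/**
 * Sketch of the scalability criterion: per-record processing must not
 * take longer than the interval at which new measurements arrive.
 */
public class ScalabilityCheck {
    public static void main(String[] args) {
        long measurementIntervalNs = 1_000_000; // hypothetical: one record per ms

        long start = System.nanoTime();
        processRecord();                        // stand-in for the analysis work
        long executionTimeNs = System.nanoTime() - start;

        boolean keepsUp = executionTimeNs <= measurementIntervalNs;
        System.out.printf("execution=%dns keepsUp=%b%n", executionTimeNs, keepsUp);
    }

    private static void processRecord() {
        // placeholder for record conversion, forecasting, scoring, ...
    }
}
```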