Motivations: SLA Violations
On a simple (easy-to-detect) Key Performance Indicator: Downtime

  Cloud Vendor      Availability (%)  Downtime (H)  Avg. Downtime (H)  Cost ($/H)  Downtime Cost ($)
  YouTube           99.999            0.17          0.024              200 k       34 k
  Cisco             99.97             5.33          0.761              200 k       1066 k
  Facebook          99.951            8.5           1.214              200 k       1700 k
  VMware            99.943            10            1.429              336 k       3360 k
  Dropbox           99.903            17            2.429              200 k       3400 k
  Twitter           99.871            22.68         3.24               200 k       4536 k
  Netflix           99.863            24            3.429              200 k       4800 k
  Google            99.661            59.31         8.473              300 k       17739 k
  Apple             99.583            73.05         10.436             200 k       14610 k
  Yahoo             99.475            92            13.143             200 k       18400 k
  SalesForce        99.32             119.08        17.012             200 k       23816 k
  OVH               98.963            181.63        25.947             336 k       61027 k
  IBM               98.727            223           31.857             336 k       74928 k
  Amazon            98.382            292.893       41.841             336 k       98411 k
  Microsoft Azure   97.811            383.54        54.791             336 k       128869 k

(A short sketch of the downtime-cost arithmetic follows the table.)

[IWGCCR13] C. Cerin et al. Downtime statistics of current cloud solutions. IWGCCR, 2013.
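The last column is simple arithmetic: downtime cost equals downtime hours times the per-hour cost. A minimal sketch reproducing it, with figures taken from the table (expect occasional ±1 k$ rounding differences against the table):

```python
# Downtime cost ($) = Downtime (H) x Cost ($/H); figures from [IWGCCR13].
vendors = [
    # (name, downtime_hours, cost_per_hour_usd)
    ("YouTube", 0.17, 200_000),
    ("Amazon", 292.893, 336_000),
    ("Microsoft Azure", 383.54, 336_000),
]

for name, downtime_h, cost_per_h in vendors:
    # Prints: YouTube: 34 k$; Amazon: 98412 k$ (table rounds to 98411);
    #         Microsoft Azure: 128869 k$
    print(f"{name}: {downtime_h * cost_per_h / 1_000:.0f} k$")
```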
SLA Violations: SaaS
(Screenshot: service disruptions and outages reported on the G Suite Status Dashboard.)

[Google19] Google. G Suite Status Dashboard. 2019.
Research Motivations
State-of-the-Art: SaaS Performance Evaluation
Relatively little research work done:
  → mostly focused on the quality of software services [SP18]
  → quality models: rough sets, MCDA, prediction and fuzzy logic, Mean Opinion Score
  → covering different attributes of software quality:
      - functionality, reliability, usability, efficiency, maintainability, and portability [ISO01]
  → OR reporting easy-to-detect KPIs (e.g., downtime)
⇒ no actual automated way to check QoS

[SP18] D. Jagli et al. A quality model for evaluating SaaS on the cloud computing environment. Springer, 2018.
[ISO01] ISO/IEC 9126-1:2001. Software engineering - Product quality - Part 1: Quality model.
Research Motivations
State-of-the-Art: SLA Assurance
There are few works on SLA metrics monitoring/measurement:
  → based on black-box metrics evaluation [CloudCom17]:
      - CloudHarmony, Monitis, CloudWatch, CloudStatus, ...
      - Test-as-a-Service (TaaS) on the cloud
      - other frameworks, e.g., CLOUDQUAL [TI14]
⇒ no automated and standard way of measuring SLA compliance

[CloudCom17] S. Wagle et al. Service performance pattern analysis and prediction of available providers. 2017.
[TI14] X. Zheng et al. CloudQual: A quality model for cloud services. IEEE Trans. on Industrial Informatics, 2014.
Research Motivations
Ph.D. Objectives
Propose a systematic & optimized framework for evaluating:
  → the QoS and SLA compliance of the cloud SaaS services offered
  → across several CSPs (allowing a pertinent ranking between them)

The framework should assess SaaS services via:
  → pertinent benchmarking/monitoring involving multiple metrics, using distributed agents
  → an automatic and stealth (i.e., obfuscated) approach
      - prevent CSPs from improving their results (by adapting the allocated resources) upon detecting the evaluation
  → defeating benchmarking detection
      - hidden as "normal" client behaviour
⇒ the PRESEnCE framework
Summary of Ph.D. Contributions
(Overview diagram: from the analysis of Cloud Computing, SaaS Web Services, SLAs / SLOs / KPIs, and CSPs, through metrics monitoring and QoS analysis, to the PRESEnCE contributions:)
  → Metrics Modeling: prediction models for metrics
  → Stealth Testing: sensitivity analysis
  → Service-Levels-based Ranking: MCDA-based ranking
  → Assurance & Verification: probability-based model for detecting breaches
Summary
  1 Context & Motivations
  2 PRESEnCE: PeRformance Evaluation of SErvices on the Cloud
      - Modeling Module
      - Stealth Module
      - SLA Checker Module
  3 Conclusion and Perspectives
PRESEnCE Framework
Objective
  Evaluate the QoS and SLA compliance of the Web Services offered,
  → across several Cloud Service Providers (CSPs).
Methodology
  Quantify the SaaS WS performance in a fair & stealth way,
  → including the scalability of the delivered Web Services.
  Assess the claimed SLA and the corresponding QoS,
  → using a set of relevant performance metrics (e.g., response time).
  Provide a multi-objective analysis of the gathered performance metrics,
  → to be able to classify cloud brokers.
PRESEnCE: Framework
On-demand evaluation (SLA/QoS validator) of SaaS Web Services across multi-cloud providers, based on: workload/SLA analysis, predictive analytics, performance evaluation, and ranking.
(Architecture diagram: a distributed PRESEnCE auditor client c' runs the three modules: Modeling (WS performance predictive monitoring), Stealth (dynamic load adaptation), and SLA Checker (virtual QoS aggregator). The modules drive evaluation agents, one per metric, against Web Services (e.g., Redis, Memcached, MongoDB, PostgreSQL) deployed at Cloud Providers 1..n and concurrently used by regular clients c_A1..c_An, c_B1..c_Bm.)
PRESEnCE: Framework
PRESEnCE is built from three modules:
  → Modeling Module (monitoring/modeling): monitoring & modeling the cloud services' performance metrics
  → Stealth Module (dynamic load adaptation): providing obfuscated and optimized benchmarking scenarios
  → SLA Checker Module (virtual QoS aggregator): assessing & assuring SLA metrics
A minimal structural sketch of this decomposition follows.
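The sketch below shows how the three modules could compose into a single auditor pipeline; all class, method, and field names are illustrative assumptions, not taken from the actual PRESEnCE implementation:

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    metric: str       # e.g. "throughput", "read_latency"
    value: float
    timestamp: float

class ModelingModule:
    """Monitors SaaS performance metrics and fits distribution models."""
    def collect(self, service_url: str) -> list[Measurement]: ...
    def fit_models(self, samples: list[Measurement]) -> dict: ...

class StealthModule:
    """Builds a benchmarking scenario indistinguishable from normal load."""
    def build_scenario(self, normal_trace, models: dict) -> list: ...

class SLACheckerModule:
    """Aggregates measured QoS and checks it against the claimed SLA."""
    def check(self, samples: list[Measurement], sla: dict) -> bool: ...

# Auditor pipeline: monitor -> model -> stealth-test -> check SLA compliance.
def audit(service_url: str, normal_trace, sla: dict) -> bool:
    modeling, stealth, checker = ModelingModule(), StealthModule(), SLACheckerModule()
    models = modeling.fit_models(modeling.collect(service_url))
    stealth.build_scenario(normal_trace, models)   # run obfuscated benchmarks
    return checker.check(modeling.collect(service_url), sla)
```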
PRESEnCE: Modeling Module
Objectives:
  → analysis of the SaaS metrics
  → evaluating & monitoring the SaaS Web Services
  → collecting data for the metrics
  → modeling the performance metrics
Cloud Critical KPIs

  KPI                       Metrics
  Availability              Response Time, Up Time, Down Time
  Scalability               Avg. Assigned Resources, Avg. Number of Users, Capacity
  Reliability               Accuracy of Service, Fault Tolerance, Maturity
  Efficiency                Utilization of Resources, Ratio of Waiting Time
  Reusability               Readability, Publicity, Coverage of Variability
  Composability             Service Modularity, Service Interoperability
  Adaptability              Completeness of Variant Set, Coverage of Variability
  Usability                 Operability, Attractiveness, Learnability
  Elasticity                Suspend Time, Delete Time, Provision Time
  Network & Communication   Packet Loss Frequency, Connection Error Rate, Throughput, Latency
  Security                  Security Standards, Data Integrity, Sensitivity, Confidentiality
  Cost                      Total Cost, FLOP Cost (cent/FLOP, GFLOP)

[CC12] S. Sinung et al. Performance measurement of cloud computing services. 2012.
[TR10] Guiding Metrics. The cloud service industry's 10 most critical metrics. 2019.
PRESEnCE: Agents
One evaluation agent per metric, each wrapping a benchmark tool (a sketch of such an agent follows the table):

  Benchmark Tool     Version     Targeted SaaS Web Services
  YCSB               0.12.0      Redis, MongoDB, Memcached, DynamoDB, etc.
  Memtier-Bench      1.2.8       Redis, Memcached
  Redis-Bench        2.4.2       Redis
  Twitter RPC-Perf   2.0.3-pre   Redis, Memcached, Apache
  PgBench            9.4.12      PostgreSQL, MySQL, SQLServer, Oracle DB
  Apache AB          2.3         Apache, Nginx, Jexus
  HTTP Load          1           Apache, Nginx, Jexus
  Iperf              v1, v3      Iperf server

[MIS18] A. Ibrahim et al. Monitoring and modelling the performance metrics of mobile cloud SaaS web services. Mobile Information Systems, 2018.
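As an example of what such an agent amounts to, a minimal sketch that shells out to the stock redis-benchmark CLI (its -h/-n/-c/-P/--csv flags are the real ones; the surrounding agent structure is an illustrative assumption):

```python
import subprocess

def run_redis_bench(host: str, requests: int, clients: int, pipeline: int = 1) -> str:
    """Illustrative agent: drive redis-benchmark and capture its CSV output."""
    cmd = [
        "redis-benchmark",
        "-h", host,           # target Redis server
        "-n", str(requests),  # total number of requests
        "-c", str(clients),   # number of parallel clients
        "-P", str(pipeline),  # pipelined requests per connection
        "--csv",              # machine-readable output
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return out.stdout  # lines like '"SET","85470.09"' (test name, requests/sec)
```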
Identified Performance Metrics
For each benchmark Bi, the coverage matrix from [MIS18] records which input parameters it exposes and which output metrics it measures:
  → Inputs: #Parallel Clients, #Transactions, Workload Size, #Operations, #Requests, #Threads, #Records, #Fetches, #Pipes
  → Outputs (measured metrics): CleanUp Latency, Update Latency, Response Time, Read Latency, Transfer Rate, Throughput, Latency, Miss, Hits

Coverage per benchmark (counts recovered from the matrix; the cell-level mapping is given in [MIS18]):

  Benchmark Bi      Inputs covered   Output metrics covered
  YCSB              4                6
  Memtier-Bench     4                5
  Redis-Bench       4                1
  Twitter RPCPerf   2                4
  PgBench           3                2
  Apache AB         2                3
  HTTP Load         2                3
  Iperf             1                2

[MIS18] A. Ibrahim et al. Monitoring and modelling the performance metrics of mobile cloud SaaS web services. Mobile Information Systems, 2018.
Deployed Web Services

  SaaS Service   Type             Version     Used by
  Redis          NoSQL database   2.8.17      GitHub, Twitter, Pinterest
  MongoDB        NoSQL database   3.4         Google, Facebook, Cisco, eBay, Uber
  Memcached      NoSQL database   1.5.0       Amazon, Netflix, Instagram, Slack, Dropbox
  PostgreSQL     SQL database     9.4         Nokia, BMW, Netflix, Skype, Apple
  Apache         HTTP server      2.2.22.13   LinkedIn, Slack, Accenture
  Iperf server   Network          v1, v3      -

[MIS18] A. Ibrahim et al. Monitoring and modelling the performance metrics of mobile cloud SaaS web services. Mobile Information Systems, 2018.
PRESEnCE Monitoring
(Monitoring overview: cloud service customers CSC 1..n generate the normal trace (workload) against services deployed at CSPs 1..n; the PRESEnCE auditor runs benchmarking scenarios through its agents, monitors the deployed cloud services, and collects the metric evaluations.)
PRESEnCE WS Monitoring Results
(Figure: server throughput (ops/sec) vs. number of operations/records, for Redis, MongoDB, and Memcached.)
(Figure: update latency and read latency (us) vs. number of operations/records, for Redis, MongoDB, and Memcached.)
(Figure: HTTP Load average latency and normalized throughput vs. number of fetches and number of parallel clients.)
(Figure: PgBench normalized TPS and response time vs. number of transactions per client, for 20/50/80 parallel clients.)

[MIS18] A. Ibrahim et al. Monitoring and modelling the performance metrics of mobile cloud SaaS web services. Mobile Information Systems, 2018.
PRESEnCE WS Performance Modeling
(Workflow: the normal workload trace and the agents' benchmarking scenarios are monitored against the deployed cloud services; the collected data is fed to the Arena Input Analyser, which selects the appropriate probability distribution for each metric, yielding the performance models.)
PRESEnCE Models Validation
Necessity to validate the PRESEnCE models, comparing:
  → the monitoring data from the PRESEnCE agents
  → generated data from the obtained models

Test selection (a SciPy sketch of this selection follows):
  1. Normality test (Kolmogorov-Smirnov test) on the data sets.
  2. Normal variables (mean comparisons, parametric tests):
       - 2 data sets: Student t-test
       - > 2 data sets: ANOVA (Analysis of Variance)
  3. Non-normal variables (median comparisons, non-parametric tests):
       - 2 data sets: Wilcoxon test
       - > 2 data sets: Friedman test

Statistically significant: confidence level > 95% (p-value < 0.05)

[ITOR13] E. Alba et al. Parallel meta-heuristics: recent advances and new trends. ITOR, 2013.
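This decision tree maps directly onto SciPy; a minimal sketch, assuming the data sets to compare are the monitored samples and the model-generated samples:

```python
import numpy as np
from scipy import stats

def compare(datasets: list[np.ndarray], alpha: float = 0.05) -> float:
    """Pick the test from the validation decision tree; return its p-value."""
    # 1. Normality test: Kolmogorov-Smirnov against a fitted normal.
    normal = all(
        stats.kstest(d, "norm", args=(d.mean(), d.std())).pvalue > alpha
        for d in datasets
    )
    if normal:  # parametric tests, mean comparisons
        test = stats.ttest_ind if len(datasets) == 2 else stats.f_oneway
    else:       # non-parametric tests, median comparisons
        test = stats.wilcoxon if len(datasets) == 2 else stats.friedmanchisquare
    # Interpret the p-value against alpha per the 95% confidence criterion.
    return test(*datasets).pvalue
```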
PRESEnCE WS Modeling Results
Ex: Redis SaaS Web service

  Metric           Distribution   Model (Arena notation)
  Throughput       Beta           $-0.001 + 1 \cdot \mathrm{BETA}(3.63, 3.09)$   ($\beta = 3.63$, $\alpha = 3.09$, offset $= -0.001$)
  Latency Read     Gamma          $-0.001 + \mathrm{GAMM}(0.0846, 2.39)$         ($\beta = 0.0846$, $\alpha = 2.39$, offset $= -0.001$)
  Latency Update   Erlang         $-0.001 + \mathrm{ERLA}(0.0733, 3)$            ($\beta = 0.0733$, $k = 3$, offset $= -0.001$)

Ex: MongoDB SaaS Web service

  Metric           Distribution   Model (Arena notation)
  Throughput       Beta           $-0.001 + 1 \cdot \mathrm{BETA}(3.65, 2.11)$   ($\beta = 3.65$, $\alpha = 2.11$, offset $= -0.001$)
  Latency Read     Beta           $-0.001 + 1 \cdot \mathrm{BETA}(1.6, 2.48)$    ($\beta = 1.6$, $\alpha = 2.48$, offset $= -0.001$)
  Latency Update   Erlang         $-0.001 + \mathrm{ERLA}(0.0902, 2)$            ($\beta = 0.0902$, $k = 2$, offset $= -0.001$)

Density expressions:
  Beta:   $f(x) = \frac{x^{\beta-1}(1-x)^{\alpha-1}}{B(\beta,\alpha)}$ for $0 < x < 1$, $0$ otherwise, where $B(\beta,\alpha) = \int_0^1 t^{\beta-1}(1-t)^{\alpha-1}\,dt$ is the complete beta function
  Gamma:  $f(x) = \frac{\beta^{-\alpha} x^{\alpha-1} e^{-x/\beta}}{\Gamma(\alpha)}$ for $x > 0$, $0$ otherwise, where $\Gamma(\alpha) = \int_0^\infty t^{\alpha-1} e^{-t}\,dt$ is the complete gamma function
  Erlang: $f(x) = \frac{\beta^{-k} x^{k-1} e^{-x/\beta}}{(k-1)!}$ for $x > 0$, $0$ otherwise

(A SciPy sketch of these models follows the tables.)

[CLOUD18] A. Ibrahim et al. Performance metrics models for cloud SaaS web services. IEEE CLOUD, 2018.
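These Arena expressions translate into scipy.stats distributions, under the assumption that Arena's BETA(β, α) / GAMM(β, α) / ERLA(β, k) parameterization maps to SciPy as shown (β is the scale, and the Erlang is a Gamma with integer shape); a sketch for the Redis models:

```python
from scipy import stats

OFFSET = -0.001

# Redis models from the table above (Arena notation: scale beta listed first).
throughput     = stats.beta(a=3.63, b=3.09, loc=OFFSET, scale=1.0)  # BETA(3.63, 3.09)
read_latency   = stats.gamma(a=2.39, loc=OFFSET, scale=0.0846)      # GAMM(0.0846, 2.39)
update_latency = stats.erlang(a=3, loc=OFFSET, scale=0.0733)        # ERLA(0.0733, 3)

# Synthetic samples from a model, e.g. to validate it against monitored data.
samples = throughput.rvs(size=10_000)
```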
SaaS WS Performance Models Summary
  → 19 models were generated, representing the performance metrics of the SaaS Web Services
    (over the measured outputs: CleanUp/Update/Read Latency, Response Time, Transfer Rate, Throughput, Latency, Miss, Hits)
  → 15 out of 19 models proved accurate,
    i.e., 78.9% of the analyzed models have a confidence level > 95%

[CLOUD18] A. Ibrahim et al. Performance metrics models for cloud SaaS web services. IEEE CLOUD, 2018.
PRESEnCE Modeling Module Summary
Pipeline (one agent per metric):
  Metrics Analysis → Evaluate & Monitoring → Collecting Data for the Metrics → Generate Performance Metrics Distribution Models
PRESEnCE: Stealth Module
Objectives: provide benchmark scenarios which ensure:
  → accurate and stealth (i.e., obfuscated) testing
      - the CSP should not adapt the allocated resources, e.g., to improve its evaluation results
  → defeating potential benchmarking detection
      - hidden as "normal" client behaviour
  → exploiting the PRESEnCE models previously generated
Stealth Module Overview
(Workflow diagram: candidate testing models 1..n generate a testing-scenario trace; an ORACLE compares it against the expected normal trace. If the traces are distinguishable: "not stealth, you cannot test". Otherwise: "it's stealth, you can test", and the scenario is run against the CSPs under evaluation.)
The Stealth Problem
Given an [estimated] aggregated SaaS customer behaviour:
  → find the best benchmarking scenario matching this behaviour
      - a time sequence of carefully selected benchmarks
      - adaptation/optimisation of their input parameters

Ex: a possible solution (benchmarking scenario) based on benchmark models B̂i, for the time period [T0 = 5, Tend = 340] (see the data-structure sketch below):

  Time start (t_start)   Time end (t_end)   Benchmark (Bi or B̂i)   Input parameters
  5                      120                Bench 1                 X1
  45                     220                Bench 2                 X2
  130                    280                Bench 3                 X3
  190                    340                Bench 4                 X4
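A benchmarking scenario is thus a timetable of parameterized benchmark runs. A minimal sketch of the data structure and of composing the resulting testing trace; the parameter values and the load_of predictor are illustrative assumptions:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class BenchRun:
    t_start: int   # start time of the benchmark run
    t_end: int     # end time (runs may overlap in time)
    bench: str     # which benchmark model B_hat_i to use
    params: dict   # its input parameters X_i

# The example scenario from the table above, over [T0 = 5, Tend = 340].
scenario = [
    BenchRun(5, 120, "Bench1", {"clients": 10}),    # X1 (illustrative values)
    BenchRun(45, 220, "Bench2", {"clients": 25}),   # X2
    BenchRun(130, 280, "Bench3", {"clients": 40}),  # X3
    BenchRun(190, 340, "Bench4", {"clients": 15}),  # X4
]

def testing_trace(scenario: list[BenchRun], horizon: int, load_of) -> np.ndarray:
    """Sum each run's predicted load (from its model) over the time axis."""
    trace = np.zeros(horizon)
    for run in scenario:
        trace[run.t_start:run.t_end] += load_of(run.bench, run.params)
    return trace
```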
PRESEnCE Stealth Problem Illustration
(Figure: throughput over the time axis; the testing trace assembled from Bench 1..4 with initial parameters {X1..X4} deviates visibly from the normal trace.)
For a given estimated benchmark B̂i:
  → find optimized input parameters X*_i minimizing the RSS distance
      - objective: defeat the ORACLE detection scheme

  Time start (t_start)   Time end (t_end)   Benchmark (Bi or B̂i)   Input parameters
  5                      120                Bench 1                 X1 → X*1
  45                     220                Bench 2                 X2 → X*2
  130                    280                Bench 3                 X3 → X*3
  190                    340                Bench 4                 X4 → X*4
(Figures: before optimization, the RSS distance between the testing trace and the normal trace is large; after optimizing the input parameters to {X*1..X*4}, the optimized testing trace closely follows the normal trace and the distance is minimized.)
PRESEnCE Stealth Problem Summary
Multi-layer optimisation, with an Oracle emulating the CSP's view:
  → for each Δt: find the best estimated benchmark B̂i
      - finding appropriate parameters for each benchmark B̂i
      - minimizing the RSS distance (the underlying detection heuristic of the Oracle)
  → for the global time period: derive an optimized benchmarking scenario
      - an optimized sequence of benchmarks, incl. input parameters, start & end times
Loop: calculate the RSS distance between the normal trace and the PRESEnCE benchmark trace; if RSS < threshold, the scenario is non-distinguishable (stealth); otherwise, optimize the distance and try again. A sketch of this Oracle check follows.
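The Oracle's detection heuristic reduces to a residual-sum-of-squares comparison against a threshold; a minimal sketch (the threshold value is illustrative):

```python
import numpy as np

def rss(normal: np.ndarray, testing: np.ndarray) -> float:
    """Residual sum of squares between the normal and testing traces."""
    return float(np.sum((normal - testing) ** 2))

def oracle_accepts(normal: np.ndarray, testing: np.ndarray,
                   threshold: float = 1e4) -> bool:
    """Emulated CSP view: the scenario is stealth iff RSS < threshold."""
    return rss(normal, testing) < threshold
```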
PRESEnCE Stealth Problem Resolution
Solving approaches [Wiley09]:
  → exact methods: best fit, but scalability issues
  → metaheuristics: approximate fit, but scalable
Proposed approach:
  → Genetic Algorithm (GA)
  → Hybrid Algorithm (GA + ML)

[Wiley09] E.-G. Talbi. Metaheuristics: from design to implementation. Wiley, 2009.
Optimisation Model

Stealth problem objective:
  → for each time period and each estimated benchmark B̂i: optimize its set of inputs X_i
      - objective: defeat the Oracle detection
  → optimize the benchmark set over time {B̂i}_t
      - note: benchmarks may overlap; yet, without loss of generality, no overlap between benchmarks is assumed

Formulation:

  $\min_{\theta \in \Theta} \max_{i} \Delta(Y, \hat{Y})$, where $\Delta(Y, \hat{Y}) = \left| Y - \hat{Y} \right|_{i,k}$   (1)

Epigraph reformulation:

  $\min_{\theta \in \Theta} z$
  s.t. $z \geq \left| y_i - \hat{y}_i(\Theta) \right|, \quad i \in \{1, 2, 3, \ldots, n\}$   (2)-(3)

where $\hat{y}_i(\Theta)$ is given by the prediction model. A sketch of this objective as a GA fitness function follows.
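A minimal sketch of this min-max objective as the fitness a GA would minimize, assuming the prediction models ŷ_i(θ) are available as callables (all names illustrative):

```python
def fitness(theta, y, prediction_models):
    """Epigraph form of the min-max objective: the worst per-metric residual.

    theta: candidate benchmark input parameters (one GA individual)
    y: expected normal-trace values y_1..y_n
    prediction_models: callables giving y_hat_i(theta) for each metric i
    """
    residuals = [abs(y_i - model(theta))
                 for y_i, model in zip(y, prediction_models)]
    return max(residuals)  # the GA minimizes z = max_i |y_i - y_hat_i(theta)|
```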
Application: FIFA Web Services
Deployed during one of the most popular worldwide events:
  → squad and venue information, live matches, etc.
(Figures: workload characterization of the 1998 World Cup web site traces; a sketch of the trace preprocessing follows.)

[NET] A. Martin et al. Workload Characterization of the 1998 World Cup Web Site. IEEE Network.
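The stealth module consumes such traces as a requests-per-interval time series. A minimal sketch, assuming the World Cup access logs have already been converted to one UNIX timestamp per request line (the original traces ship in a custom binary format with their own extraction tools; the file name below is hypothetical):

```python
import numpy as np

def rate_trace(timestamps: np.ndarray, bin_seconds: int = 60) -> np.ndarray:
    """Turn per-request timestamps into a requests-per-bin workload trace."""
    t0 = timestamps.min()
    bins = ((timestamps - t0) // bin_seconds).astype(int)
    return np.bincount(bins)  # index = time bin, value = #requests in that bin

# Example: normal_trace = rate_trace(np.loadtxt("wc98_timestamps.txt"))
```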
Experimental Setup
Application of the PRESEnCE stealth module against the FIFA WS traces:
  → comparison of the two proposed approaches (GA and Hybrid); a configuration sketch follows the table

  Parameter               Configurations [1, 2, 3]   Configurations [4, 5, 6]
  Expected normal trace   FIFA                       FIFA
  Number of generations   1000                       10000
  Population size         [20, 50, 100]              [20, 50, 100]
  Number of evaluations   [50, 20, 10]               [500, 200, 100]
  Selection process       Bi-tournament              Bi-tournament
  Crossover operator      2-point crossover          2-point crossover
  Crossover rate          0.8                        0.8
  Mutation operator       Uniform                    Uniform
  Mutation rate           0.01                       0.01
  Number of executions    30                         30

Performance indicator for the PRESEnCE stealth module ⇒ convergence
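Configuration 1 maps naturally onto a GA toolkit such as DEAP; the sketch below is an assumption about how it could be wired up (the thesis does not state its GA implementation), reusing the fitness function sketched earlier and encoding θ as bounded integers:

```python
import random
from deap import base, creator, tools

creator.create("FitnessMin", base.Fitness, weights=(-1.0,))  # minimize z
creator.create("Individual", list, fitness=creator.FitnessMin)

N_PARAMS, LOW, UP = 8, 1, 1000   # illustrative integer encoding of theta

toolbox = base.Toolbox()
toolbox.register("gene", random.randint, LOW, UP)
toolbox.register("individual", tools.initRepeat, creator.Individual,
                 toolbox.gene, n=N_PARAMS)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)

# Operators matching the table: 2-point crossover (applied with rate 0.8 in
# the evolution loop), uniform mutation (rate 0.01), bi-tournament selection.
toolbox.register("mate", tools.cxTwoPoint)
toolbox.register("mutate", tools.mutUniformInt, low=LOW, up=UP, indpb=0.01)
toolbox.register("select", tools.selTournament, tournsize=2)

pop = toolbox.population(n=20)   # Configuration 1: population size 20
```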
Results: Convergence
(Figure: convergence (residual), with standard deviation, standard error, and 95% confidence intervals, for GA vs. Hybrid over Configurations 1-6 (10 to 500 evaluations, 30 executions each).)