User-behavior analytics for video streaming QoE assessment
Ricky K. P. Mok, The Hong Kong Polytechnic University, AIMS2016


  1. User-behavior analytics for video streaming QoE assessment
     Ricky K. P. Mok, The Hong Kong Polytechnic University, AIMS2016

  2. Measuring QoE is hard!

  3. A simple QoE model (see the sketch below)
     • Quality of Experience: playback smoothness, picture quality, expectation, past experiences, usage habit, …
     • Application layer metrics: start-up delay, rebuffering events, quality level switches, …
     • Network path metrics: RTT, packet loss rate, throughput, …
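A minimal sketch of how this layered model could be turned into a score, assuming a made-up linear penalty form; the metric names come from the slide, but the weights and the formula are illustrative placeholders, not the author's model:

```python
# Minimal sketch of the layered QoE model from the slide.
# The metric names come from the slide; the linear form and the weights
# below are hypothetical placeholders, not the author's model.

def qoe_score(app_metrics: dict) -> float:
    """Map application layer metrics to a MOS-like score on a 1-5 scale."""
    weights = {
        "startup_delay_s": 0.10,      # seconds of start-up delay
        "rebuffer_events": 0.40,      # number of rebuffering events
        "rebuffer_duration_s": 0.05,  # total seconds spent rebuffering
        "quality_switches": 0.15,     # number of quality level switches
    }
    score = 5.0 - sum(w * app_metrics.get(k, 0.0) for k, w in weights.items())
    return max(1.0, min(5.0, score))  # clamp to the MOS scale

# Example session: network-path problems surface as application layer impairments.
session = {"startup_delay_s": 3.2, "rebuffer_events": 2,
           "rebuffer_duration_s": 6.0, "quality_switches": 4}
print(qoe_score(session))  # ~2.98
```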

  4. QoE Crowdtesting
     • Lab-based subjective assessment: the experimenter recruits subjects, who watch streams from a video streaming server over an emulated network environment.
     • Crowdsourcing-based subjective assessment: the experimenter publishes tasks over the Internet; workers stream video from the server and return MOS ratings together with network/application performance measurements.

  5. User behavior analytics
     • User behavior reflects cognitive processes and is generated by the users themselves
     • User-viewing activities: improve QoE inference
     • Worker behavior: detect low-quality workers in QoE crowdtesting

  6. QoE inference
     • User-viewing activities can be triggered as a reaction to impairment events.

  7. QoE inference
     • User-viewing activities (see the sketch below):
       • Pause/Resume
       • Change of player size
       • Reload
       • …
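One way to read "triggered as a reaction to impairment events" in code is to check whether a viewing activity occurs within a short window after an impairment such as rebuffering. A sketch under that assumption; the log format and the 5-second window are illustrative, not values from the talk:

```python
# Sketch: flag user-viewing activities that follow an impairment event
# within a short reaction window. The log format and window length are
# illustrative assumptions, not values from the talk.

REACTION_WINDOW_S = 5.0                      # hypothetical "few seconds"
ACTIVITIES = {"pause", "resume", "resize", "reload"}
IMPAIRMENTS = {"rebuffering", "quality_drop"}

def reactive_activities(events):
    """events: list of (timestamp_seconds, event_name), sorted by time."""
    reactive = []
    last_impairment_t = None
    for t, name in events:
        if name in IMPAIRMENTS:
            last_impairment_t = t
        elif name in ACTIVITIES and last_impairment_t is not None \
                and t - last_impairment_t <= REACTION_WINDOW_S:
            reactive.append((t, name))
    return reactive

log = [(10.0, "rebuffering"), (12.5, "pause"), (40.0, "resize"),
       (60.0, "rebuffering"), (63.0, "reload")]
print(reactive_activities(log))  # [(12.5, 'pause'), (63.0, 'reload')]
```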

  8. QoE inference
     • Correlate the occurrence of the activities with perceivable impairments
     • Quantify the activities into metrics
     • Model the QoE using application layer metrics and the user-viewing activities (see the sketch below)
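The steps above can be sketched as: count each activity per session, place the counts next to the application layer metrics, and fit a model against the subjective ratings. A hypothetical feature-assembly step (field names are illustrative; the deck does not define a schema):

```python
from collections import Counter

# Sketch: quantify user-viewing activities into per-session metrics and
# combine them with application layer metrics into one feature vector.
# Field names are illustrative assumptions.

def session_features(app_metrics: dict, activity_events: list) -> dict:
    counts = Counter(name for _, name in activity_events)
    return {
        # Application layer metrics
        "startup_delay_s": app_metrics["startup_delay_s"],
        "rebuffer_freq": app_metrics["rebuffer_freq"],
        "rebuffer_duration_s": app_metrics["rebuffer_duration_s"],
        # User-viewing activity metrics
        "num_pause": counts.get("pause", 0),
        "num_resize_down": counts.get("resize_down", 0),
        "num_reload": counts.get("reload", 0),
    }

features = session_features(
    {"startup_delay_s": 2.1, "rebuffer_freq": 3, "rebuffer_duration_s": 7.5},
    [(12.5, "pause"), (63.0, "reload"), (90.0, "resize_down")],
)
print(features)
```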

  9. Findings
     • An activity can be triggered within a few seconds after some application events:
       • Pause
       • Reduce the screen size
     • Compare two models (see the sketch below):
       1. Start-up delay, rebuffering frequency, and rebuffering duration
       2. Model 1 + number of pauses and screen-size reductions
     • The explanatory power is significantly increased, by 8%.
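A sketch of the two-model comparison described above: fit Model 1 (application layer metrics only) and Model 2 (Model 1 plus the activity counts) and compare their explanatory power via R². Only the two feature sets come from the slide; the synthetic data and the scikit-learn usage are illustrative:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Sketch of the two-model comparison. The data is randomly generated so the
# example runs end to end; only the feature sets come from the slide.
rng = np.random.default_rng(0)
n = 200
startup = rng.uniform(0, 5, n)          # start-up delay (s)
rebuf_freq = rng.poisson(2, n)          # rebuffering frequency
rebuf_dur = rng.uniform(0, 15, n)       # rebuffering duration (s)
pauses = rng.poisson(1, n)              # number of pauses
resize_down = rng.poisson(1, n)         # number of screen-size reductions

# Synthetic MOS ratings, purely for illustration.
mos = (5 - 0.2 * startup - 0.3 * rebuf_freq - 0.05 * rebuf_dur
         - 0.3 * pauses - 0.2 * resize_down + rng.normal(0, 0.3, n))

X1 = np.column_stack([startup, rebuf_freq, rebuf_dur])   # Model 1
X2 = np.column_stack([X1, pauses, resize_down])          # Model 2

r2_1 = LinearRegression().fit(X1, mos).score(X1, mos)
r2_2 = LinearRegression().fit(X2, mos).score(X2, mos)
print(f"Model 1 R^2 = {r2_1:.3f}, Model 2 R^2 = {r2_2:.3f}")
```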

  10. Detecting low-quality workers
      • Worker behavior on the question page:
        • Clicks
        • Time delay between two questions
        • Mouse cursor movement: trajectory, speed/acceleration
      • Low-quality workers behave differently from normal workers
      • A model can be trained to filter out workers who cheat the system (see the sketch below)
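A sketch of the detection idea: derive features from cursor trajectories (speed, acceleration), clicks, and inter-question delays, then train a classifier on labelled workers. The feature choices and the classifier below are illustrative assumptions, not the ten metrics designed in the talk:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def cursor_features(trajectory):
    """trajectory: sequence of (t, x, y) cursor samples for one worker."""
    t, x, y = np.asarray(trajectory, dtype=float).T
    dt = np.diff(t)
    speed = np.hypot(np.diff(x), np.diff(y)) / dt
    accel = np.diff(speed) / dt[1:]
    return [speed.mean(), speed.std(), np.abs(accel).mean()]

def worker_features(trajectory, num_clicks, inter_question_delays):
    return cursor_features(trajectory) + [num_clicks,
                                          float(np.mean(inter_question_delays))]

# Tiny made-up training set: 1 = low-quality worker, 0 = normal worker.
traj_rushed = [(0.0, 0, 0), (0.1, 300, 10), (0.2, 600, 20), (0.3, 900, 30)]
traj_normal = [(0.0, 0, 0), (0.5, 40, 30), (1.2, 90, 80), (2.0, 120, 150)]
X = [worker_features(traj_rushed, 2, [0.5, 0.4]),   # races through questions
     worker_features(traj_normal, 6, [6.0, 8.0])]   # deliberate interaction
y = [1, 0]

clf = LogisticRegression().fit(X, y)
print(clf.predict([worker_features(traj_normal, 5, [5.0, 7.0])]))  # expect [0]
```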

  11. An example

  12. Findings
      • Ten worker behavior metrics are designed
      • 80% of low-quality workers can be detected
      • Compared to CrowdMOS, our method has lower false positives and false negatives
      • Our method is independent of the ratings, making it more suitable for measuring QoE

  13. A new model
      • Subjective factors shape the Quality of Experience
      • User behavior analytics (mouse cursor movement, clicks, Pause/Resume, …) sit alongside application layer metrics and network path metrics

  14. Challenges
      • How can user behavior be incorporated into measurement infrastructures?
      • User behavior can be application-specific and platform-specific (desktop vs. mobile)
      • Collaboration from either service providers or users is required to collect the user behavior
      • Privacy issues?

  15. Thanks
      oneprobe.org
      cs.rickymok@connect.polyu.hk
