The MOVIS Project: Performance Monitoring System for Video Streaming Networks
Wolfgang Leister, April 2006

MOVIS – Participating Institutions
Research project supported by the Research Council of Norway
► TV2 Interaktiv, Bergen
► Norsk Regnesentral, Oslo
► IRT, München
► Nextgentel, Sandsli
► Lyse Tele, Stavanger
► Never.no, Oslo
► Secrets to Sports, Asker
► Nettfokus, Lundesæter
► Nimsoft, Oslo
The Challenge
► Content providers distribute content to
  ▪ end-users as multimedia streams
  ▪ ISPs for redistribution
► The ISPs stream to end-users from their own servers
► In case of QoS degradation in delivery:
  ▪ the end-user gets reduced visual quality
  ▪ communicate the problem before the user reacts
  ▪ help content providers / ISPs to identify problems
  ▪ keep logs for quality assurance
Streaming Content Distribution
[Diagram: content flows from source cameras via video encoders, databases, video file servers and live distribution servers at the content providers, through an AV matrix and a default server, across the Internet to the ISP network and on to several providers (Provider 1, 2, 3, ...).]
Infrastructure at ISP
[Diagram: content providers and a default server connect over the Internet through routers/switches and a DSLAM to the ISP's streaming server and file server.]

End User Equipment Scenarios
[Diagram: digital TV, a single PC, and a home network with access point, router and WLAN; viewing characteristics differ between scenarios.]
Considerations for MOVIS
Online measurement of end-user QoS for streamed content
► Emphasis on end-to-end performance
► Measure / collect perceived and calculated QoS
► Report QoS values back to content providers and ISPs
► Metrics, protocols, architectural issues
► Measurements in all parts of the delivery chain
► End users are in the mass market
► End-user interaction
► End-user equipment configuration is beyond the control of the providers
► Internal structures of ISPs must not be revealed

How to solve the problem?
► Research
  ▪ Metrics for measuring video quality
  ▪ Subjective / objective assessment of video quality
  ▪ Relations between technical and perceived QoS
► Development
  ▪ Agent-based system to report perceived video quality to ISP and content provider
  ▪ Interfacing with systems in use
► Evaluation
  ▪ Field trial, user studies
Test Methods
[Diagrams: three measurement set-ups, each showing original content passing through encoder, streaming server, network and decoder to produce the processed content.]
► Picture comparison: a quality rating system compares the processed pictures directly with the original content (full-reference measurement).
► Feature comparison: features are extracted from the original and the processed content and compared, yielding impairment parameters (reduced-reference measurement).
► Single-ended: a monitoring system derives impairment parameters from the processed content alone, with the help of a model (no-reference measurement).

Characteristics of Test Methods
► Subjective (user) vs. objective (technical)
► Direct (picture) vs. indirect (signal)
► In-service vs. out-of-service
► Real-time vs. deferred time
► Continuous vs. samples
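As a minimal illustration of the full-reference (picture comparison) approach, the sketch below computes PSNR between an original and a processed frame. PSNR is not the metric adopted in MOVIS; it is simply the most basic objective full-reference measure, and the frame data here is synthetic.

```python
import numpy as np

def psnr(original: np.ndarray, processed: np.ndarray, max_value: float = 255.0) -> float:
    """Peak signal-to-noise ratio (dB) between two frames of equal shape."""
    mse = np.mean((original.astype(np.float64) - processed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(max_value ** 2 / mse)

# Example: compare a synthetic CIF-sized luma plane with a noisy version of itself.
frame = np.random.randint(0, 256, (288, 352), dtype=np.uint8)
noisy = np.clip(frame + np.random.normal(0, 5, frame.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(frame, noisy):.2f} dB")
```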
QoS Measurements / Metrics
► Collecting QoS observations
  ▪ Intrusive (controlled injection of content)
  ▪ Non-intrusive (passive observation)
► QoS metric: a carefully specified quantity, related to the performance and reliability of the service, whose value we would like to know
► Networking QoS metrics
► Perceived QoS
  ▪ Laboratory
  ▪ Real-time

QoS Measurements / Metrics
► Perceived QoS metrics
  ▪ Impact of networking characteristics
  ▪ Impact of codec characteristics
► Assessment
  ▪ Subjective assessment (end-user evaluations)
  ▪ Objective assessment (technically deduced)
► Methods
  ▪ DVB: ETR 290, TR 101 290 (measurement guidelines for DVB systems), TR 101 291
  ▪ BT.500
  ▪ SAMVIQ – BT.700
QoS Measurements / Metrics
► Networking QoS metrics
  ▪ Connectivity, one-way delay, two-way delay, throughput, loss, jitter, ...
  ▪ Initiatives:
    ◦ IETF IPPM (IP Performance Metrics): RFC 2330, RFC 2680, RFC 2681, OWAMP, ...
    ◦ ITU-T G.107: rating of transmission quality end-to-end
► Nettfokus – MobileSLM
  ▪ http://www.knowyourSLA.com/
► NimBUS
  ▪ http://www.nimsoft.com/

NIMSOFT
► Architecture built around a message bus (NimBUS)
► All types of traffic / networks
► API libraries available
► Agent-based
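The slides list these metrics without saying how they are obtained. As a rough sketch, assuming per-packet send and receive timestamps are available from a measurement probe, packet loss and interarrival jitter (the smoothed estimator from RTP, RFC 3550) could be computed as below; the Packet record and the trace values are illustrative only.

```python
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class Packet:
    seq: int                    # sender sequence number
    sent: float                 # sender timestamp (s)
    received: Optional[float]   # receiver timestamp (s), None if lost

def loss_ratio(packets: Iterable[Packet]) -> float:
    """Fraction of packets that never arrived."""
    pkts = list(packets)
    lost = sum(1 for p in pkts if p.received is None)
    return lost / len(pkts) if pkts else 0.0

def interarrival_jitter(packets: Iterable[Packet]) -> float:
    """Smoothed interarrival jitter, J += (|D| - J) / 16, as in RFC 3550."""
    jitter, prev = 0.0, None
    for p in packets:
        if p.received is None:
            continue
        if prev is not None:
            d = (p.received - prev.received) - (p.sent - prev.sent)
            jitter += (abs(d) - jitter) / 16.0
        prev = p
    return jitter

# Illustrative trace: one of four packets is lost.
trace = [Packet(1, 0.00, 0.050), Packet(2, 0.02, 0.072),
         Packet(3, 0.04, None),  Packet(4, 0.06, 0.115)]
print(f"loss: {loss_ratio(trace):.0%}, jitter: {interarrival_jitter(trace) * 1000:.2f} ms")
```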
Nettfokus / MobileSLM
Management solution for SLM, www.knowyoursla.com
► Designed for multimedia traffic
► Dedicated machines inject traffic
► Agent-based
[Diagram: a master at the enterprise site communicates across the Internet, through firewalls, with slaves placed in the monitored networks.]

APDEX
► Numerical measure of end-user satisfaction
► Ratings range from unacceptable to excellent
► Used for response times
► Three end-user categories:
  ▪ #Satisfied
  ▪ #Tolerating
  ▪ #Frustrated
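The slide names the three Apdex categories without the formula; the standard Apdex index combines them as (satisfied + tolerating/2) / total. A short sketch with invented counts:

```python
def apdex(satisfied: int, tolerating: int, frustrated: int) -> float:
    """Standard Apdex index: satisfied count plus half the tolerating count, over all samples."""
    total = satisfied + tolerating + frustrated
    if total == 0:
        return 0.0
    return (satisfied + tolerating / 2.0) / total

# Example: samples classified against a target response time T
# (satisfied <= T, tolerating <= 4T, frustrated otherwise).
print(f"Apdex = {apdex(satisfied=60, tolerating=30, frustrated=10):.2f}")  # 0.75
```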
G.107 – E-Model – VQM
► Models are needed to establish the relation between objective and subjective quality
► G.107 / E-model is an example for audio
► VQM for video
  ▪ Document: NTIA Report 02-392, "Video Quality Measurement Techniques" by Pinson and Wolf, http://www.its.bldrdoc.gov/n3/video/documents.htm
  ▪ Developed by ITS / NTIA of the U.S. Dept. of Commerce

G.107 / E-Model
► Model for audio / IP telephony
► Uses impairment factors
► Related standards:
  ▪ G.108
  ▪ G.113
  ▪ G.175
  ▪ G.562
  ▪ ...
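As background on how the E-model combines impairment factors, the sketch below computes the rating R = Ro - Is - Id - Ie + A and maps it to a mean opinion score with the standard conversion. The default Ro of about 93.2 and the example impairment values are assumptions for illustration; this is a sketch, not the full G.107 computation.

```python
def e_model_rating(Is: float = 0.0, Id: float = 0.0, Ie_eff: float = 0.0,
                   A: float = 0.0, Ro: float = 93.2) -> float:
    """Simplified E-model: basic signal-to-noise term minus simultaneous, delay and
    equipment impairments, plus the advantage factor (default Ro roughly 93.2)."""
    return Ro - Is - Id - Ie_eff + A

def rating_to_mos(R: float) -> float:
    """Standard mapping from the E-model rating R to a mean opinion score."""
    if R <= 0:
        return 1.0
    if R >= 100:
        return 4.5
    return 1.0 + 0.035 * R + R * (R - 60) * (100 - R) * 7e-6

R = e_model_rating(Id=10.0, Ie_eff=15.0)  # illustrative impairment values
print(f"R = {R:.1f}, MOS = {rating_to_mos(R):.2f}")
```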
VQM – Video Quality Metric
► Objective measurement method
► Defines how and what to measure
► Formulas to calculate the VQM value
► Relation VQM ↔ subjective quality

The MOVIS-factor M
[Diagram: the delivery chain (original content, encoder, encoded content, streaming server, content provider network, ISP network, user network, user equipment, viewing conditions, end user) with a MOVIS ancillary channel alongside; the end user itself is not part of MOVIS.]
M = M_O + M_A(...) - M_S(Codec, Cont) - M_E(Codec) - M_N(Codec) - M_U(Codec) - M_V(Cont)
► M_O: original content
► M_A(...): advantage factor
► M_S(Codec, Cont): encoder (codec, parameter settings, encoder type, ...)
► M_E(Codec): streaming server (server type, parameter settings, ...)
► M_N(Codec): network – content provider network and ISP network (networking parameters, topology, type, ...)
► M_U(Codec): user network and user equipment (type, parameters, ...)
► M_V(Cont): viewing conditions
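A minimal sketch of how the additive MOVIS-factor could be evaluated once the individual terms are known; the numeric values below are placeholders, not calibrated MOVIS data.

```python
def movis_factor(M_O: float, M_A: float, M_S: float, M_E: float,
                 M_N: float, M_U: float, M_V: float) -> float:
    """MOVIS-factor: base quality plus advantage factor, minus the impairments
    introduced by encoder, streaming server, networks, user equipment and
    viewing conditions (additive structure analogous to the G.107 E-model)."""
    return M_O + M_A - M_S - M_E - M_N - M_U - M_V

# Illustrative placeholder values only.
M = movis_factor(M_O=100.0, M_A=0.0, M_S=12.0, M_E=2.0, M_N=8.0, M_U=3.0, M_V=5.0)
print(f"MOVIS-factor M = {M:.1f}")
```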
SAMVIQ – ITU-R BT.700
Standard for video quality assessment in multimedia
► SAMVIQ submitted as a draft standard to ITU-R SG6 WP 6Q
► Builds on experiences from MUSHRA (audio quality measurement)
► Uses a hidden reference and a low anchor; the user knows the total scale and may change a vote; scale from 0 to 100
► High reliability, comparability with other test labs, absolute results
► Use scenarios:
  ▪ Measure impact of different bit rates (MOVIS WP2)
  ▪ Comparison of codecs (MOVIS WP2)
  ▪ Minimum rate for a specified quality (MOVIS WP2)
  ▪ Impact of network errors (MOVIS WP3)
SAMVIQ: GUI used for sessions
[Screenshot of the assessment GUI.]

SAMVIQ: Structure of test sessions
[Diagram: for each of the k sequences the viewer is shown the explicit reference (ref) plus items A–D, which contain the hidden reference and Algo. 1–3 in an order randomized per sequence.]
► Example: Algo. 1: WM9, CIF, 168 kbps; Algo. 2: WM9, CIF, 1032 kbps
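A small sketch of how such a session plan could be generated: per sequence, the hidden reference and the coded versions are shuffled and relabelled A, B, C, ..., while the explicit reference stays available. The function name, sequence names and codec labels are illustrative, not part of the SAMVIQ specification.

```python
import random

def samviq_session(sequences: list[str], algorithms: list[str], seed: int = 0) -> dict:
    """Build a SAMVIQ-style presentation plan: the explicit reference is always
    shown as 'ref', while the hidden reference and the coded versions are
    shuffled and relabelled A, B, C, ... independently for each sequence."""
    rng = random.Random(seed)
    plan = {}
    for seq in sequences:
        items = ["Hidden Reference"] + list(algorithms)
        rng.shuffle(items)
        plan[seq] = {"ref": "Reference",
                     **{chr(ord("A") + i): item for i, item in enumerate(items)}}
    return plan

# Hypothetical configuration mirroring the example on the slide.
plan = samviq_session(["Skiing", "Rugby"],
                      ["WM9 CIF 168 kbps", "WM9 CIF 1032 kbps", "Algo. 3"])
for seq, items in plan.items():
    print(seq, items)
```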
Sequences used in assessment
Scenes:
► Skiing – bright
► Rugby – less detail
► Rainman – high frequencies
► Barcelona – colourful

Results: influence of content on quality
[Plot: mean score with confidence intervals vs. total bit rate (0–1200 kbps) for Windows Media 9 in CIF format, shown globally and per sequence (Skiing, Rugby, Rainman, Barcelona); the score axis runs from 0 (Bad) through Poor, Fair and Good to 100 (Excellent).]
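Curves like these are typically obtained by averaging the SAMVIQ scores per sequence and bit rate and attaching a confidence interval; a minimal sketch under that assumption, with invented ratings:

```python
import math
from collections import defaultdict

def mean_and_ci(scores: list[float], z: float = 1.96) -> tuple[float, float]:
    """Mean score and half-width of an approximate 95% confidence interval."""
    n = len(scores)
    mean = sum(scores) / n
    var = sum((s - mean) ** 2 for s in scores) / (n - 1) if n > 1 else 0.0
    return mean, z * math.sqrt(var / n)

# Hypothetical SAMVIQ ratings: (sequence, bit rate in kbps, score 0-100).
ratings = [("Skiing", 400, 55), ("Skiing", 400, 61), ("Skiing", 400, 58),
           ("Rugby", 400, 70), ("Rugby", 400, 66), ("Rugby", 400, 74)]

groups = defaultdict(list)
for seq, rate, score in ratings:
    groups[(seq, rate)].append(score)

for (seq, rate), scores in groups.items():
    mean, ci = mean_and_ci(scores)
    print(f"{seq} @ {rate} kbps: {mean:.1f} ± {ci:.1f}")
```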