CAdViSE or how to find the Sweet Spots of ABR Systems

1. CAdViSE or how to find the Sweet Spots of ABR Systems
   November 15th, 2021, Babak Taraghi, DDRC 2021
2. Agenda
   ● Introduction & Background
   ● What is CAdViSE?
     ○ Components and Architecture
     ○ CAdViSE in Action
     ○ Further In-depth Studies
   ● Summary
   ● Questions and Answers
3. Introduction & Background I
   ● HTTP Adaptive Streaming (HAS) is a technique for delivering media files from an origin server to the client that adapts the properties of the delivered media to the current network link conditions.
   ● Media Players and their ABR Algorithms: the ABR algorithm is the key function that decides which bitrate segments to download, based on the current state of the network (a minimal sketch of such a decision follows below).
   ● Significant network link attributes:
     ○ Corrupted Packets
     ○ Available Bandwidth
     ○ Delay
     ○ Packet Loss or Duplicates
   HTTP/2-Based Methods to Improve the Live Experience of Adaptive Streaming - Scientific Figure on ResearchGate. Available from: https://www.researchgate.net/figure/The-concept-of-HTTP-Adaptive-Streaming-HAS-was-introduced-As-shown-in-Figure-1-video_fig1_283073448
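To make the ABR decision concrete, here is a minimal, hypothetical rate-based heuristic: pick the highest bitrate that fits within a safety margin of the measured throughput. The ladder values, the safety factor, and the names `BITRATE_LADDER_KBPS` and `select_bitrate` are illustrative assumptions, not the logic of any player evaluated in this deck.

```python
# Minimal sketch of a throughput-based ABR decision (illustrative only;
# not the algorithm of any specific player discussed in this deck).

# Hypothetical bitrate ladder of the available representations, in kbps.
BITRATE_LADDER_KBPS = [300, 750, 1500, 3000, 6000]

def select_bitrate(measured_throughput_kbps: float,
                   safety_factor: float = 0.8) -> int:
    """Pick the highest bitrate that fits within a margin of the measured
    throughput; fall back to the lowest rung when nothing fits."""
    budget = measured_throughput_kbps * safety_factor
    candidates = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return max(candidates) if candidates else BITRATE_LADDER_KBPS[0]

# Example: ~2.2 Mbps measured with an 80% safety margin selects 1500 kbps.
print(select_bitrate(2200))  # -> 1500
```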
4. Introduction & Background II
   ● Quality of Experience (QoE)
     ○ A measure of the delight or annoyance of a customer's experience with a service. HAS QoE metrics:
       ■ Start-up delay
       ■ Delivered media quality
       ■ Stall events (rebuffering)
     ○ The Mean Opinion Score (MOS) can be measured both objectively and subjectively: predicted MOS and perceived MOS.
   (Image: mentalmind/Shutterstock.com)
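Purely for illustration, the sketch below shows how these metrics could feed a single predicted-MOS score: a toy linear penalty model whose weights are invented assumptions, not any standardized model. Real QoE models such as ITU-T P.1203, which appears later in this deck, are considerably more elaborate.

```python
# Toy predicted-MOS sketch: a linear penalty model with invented weights.
# Real QoE models (e.g., ITU-T P.1203) are far more sophisticated; this
# only illustrates how HAS metrics could feed a single 1..5 score.

def toy_predicted_mos(startup_delay_s: float,
                      avg_quality_mos: float,
                      num_stalls: int,
                      total_stall_s: float) -> float:
    score = avg_quality_mos              # base: delivered media quality (1..5)
    score -= 0.05 * startup_delay_s      # assumed start-up delay penalty
    score -= 0.15 * num_stalls           # assumed per-stall penalty
    score -= 0.10 * total_stall_s        # assumed stall-duration penalty
    return max(1.0, min(5.0, score))     # clamp to the MOS scale

# Example: good quality (4.2) with 1 s start-up delay and one 2 s stall.
print(round(toy_predicted_mos(1.0, 4.2, 1, 2.0), 2))  # -> 3.8
```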
5. What is CAdViSE?*
   ● Cloud-based Adaptive Video Streaming Evaluation framework for the automated testing of media players
     ○ A test environment (testbed) that can be instantiated in a cloud infrastructure, examines multiple media players under different network attributes, and concludes the evaluation with visualized statistics and insights into the results.
   ● Cloud deployment on Amazon Web Services (AWS)
   ● Dockerized environment
   ● Pluggable media players and ABR algorithms
   ● Integrable with modern CI/CD pipelines
   ● Shapes the network with real-life network traces (network profiles); a sketch of this technique follows below
   * Taraghi, B., Zabrovskiy, A., Timmerer, C., & Hellwagner, H. (2020, May). CAdViSE: Cloud-based adaptive video streaming evaluation framework for the automated testing of media players. In Proceedings of the 11th ACM Multimedia Systems Conference (pp. 349-352).
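The deck does not show the emulator itself; the sketch below illustrates the general technique of replaying a bandwidth/delay/loss trace with Linux `tc` (netem for delay and loss, tbf for rate limiting). The interface name `eth0`, the trace format, and the values are assumptions, not CAdViSE's actual implementation.

```python
# Sketch of replaying a network trace with Linux tc (netem + tbf).
# Illustrates trace-driven network shaping in general, not CAdViSE's
# actual emulator. Interface name and trace format are assumptions.
# Requires root privileges.
import subprocess
import time

IFACE = "eth0"  # assumed network interface

# Hypothetical trace: (duration_s, bandwidth_kbit, delay_ms, loss_pct)
TRACE = [
    (10, 6000, 20, 0.0),   # stable, high bandwidth
    (10, 1000, 80, 1.0),   # degraded link
    (10, 3000, 40, 0.5),   # partial recovery
]

def tc(*args: str) -> None:
    subprocess.run(["tc", *args], check=True)

def replay_trace() -> None:
    # Root qdisc: netem for delay/loss, with a tbf child for bandwidth.
    tc("qdisc", "add", "dev", IFACE, "root", "handle", "1:",
       "netem", "delay", "20ms")
    tc("qdisc", "add", "dev", IFACE, "parent", "1:", "handle", "2:",
       "tbf", "rate", "6000kbit", "burst", "32kbit", "latency", "400ms")
    try:
        for duration, bw, delay, loss in TRACE:
            tc("qdisc", "change", "dev", IFACE, "root", "handle", "1:",
               "netem", "delay", f"{delay}ms", "loss", f"{loss}%")
            tc("qdisc", "change", "dev", IFACE, "parent", "1:", "handle", "2:",
               "tbf", "rate", f"{bw}kbit", "burst", "32kbit", "latency", "400ms")
            time.sleep(duration)
    finally:
        tc("qdisc", "del", "dev", IFACE, "root")  # restore the interface

if __name__ == "__main__":
    replay_trace()
```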
6. CAdViSE Components and Architecture*
   ● Application Layer
     ○ Runner, Initializer and Starter scripts
     ○ Written in Bash, Python and JavaScript
   ● Cloud Components
     ○ Player container (VNC and Selenium; see the sketch below)
     ○ Network emulator
     ○ EC2 instances, SSM execution, DynamoDB, S3 and CloudWatch
   ● Logs and Analytics
     ○ Bitmovin Analytics player plugin
     ○ Comprehensive logs
   * Taraghi et al. (2020), MMSys '20.
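The player container drives a real browser through Selenium; the sketch below shows the general shape of such automation with headless Chrome. The test-page URL, the wait duration, and the `window.playerLogs` convention are assumptions for illustration, not CAdViSE's actual starter scripts.

```python
# Sketch of Selenium-driven media player automation in headless Chrome.
# URL, wait time, and log collection are assumed for illustration; this
# is the general shape of the technique, not CAdViSE's actual script.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
import time

options = Options()
options.add_argument("--headless")
options.add_argument("--autoplay-policy=no-user-gesture-required")

driver = webdriver.Chrome(options=options)
try:
    # Hypothetical test page embedding the media player under test.
    driver.get("http://localhost:8080/player.html")
    time.sleep(120)  # let the player stream for the test duration
    # Collect logs the player pushed into the page (assumed convention).
    logs = driver.execute_script("return window.playerLogs || [];")
    print(f"collected {len(logs)} log entries")
finally:
    driver.quit()
```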
7. CAdViSE In Action: Implementation and Application
   ● Understanding Quality of Experience of Heuristic-based HTTP Adaptive Bitrate Algorithms*
     ○ Seven well-known ABR algorithms and media players
     ○ Four challenging network profiles
   ● Using the CAdViSE streaming session logs:
     ○ Create JSON files as feed for the QoE model
     ○ Stitch the HAS multimedia segments (audiovisual files) back together and generate a single MP4 (a sketch of this step follows below)
   ● Participants from Amazon Mechanical Turk (MTurk)
     ○ 835 participants in our subjective evaluations
     ○ 5723 votes in total, of which 4704 proved to be reliable
   [Figure: the four network profiles (Ramp Up, Ramp Down, Stable, Fluctuation), plotted as bandwidth in kbps over time in seconds]
   * Babak Taraghi, Abdelhak Bentaleb, Christian Timmerer, Roger Zimmermann, and Hermann Hellwagner. 2021. Understanding quality of experience of heuristic-based HTTP adaptive bitrate algorithms. In Proceedings of the 31st ACM Workshop on Network and Operating Systems Support for Digital Audio and Video (NOSSDAV '21). ACM, New York, NY, USA, 82-89. https://doi.org/10.1145/3458306.3458875
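Stitching fragmented MP4 (fMP4) segments back into a single file is commonly done by byte-wise concatenation of the initialization segment plus the media segments, followed by a lossless remux. The sketch below shows that common technique; the directory layout and file names are assumptions, not the paper's exact pipeline.

```python
# Sketch of stitching downloaded HAS (fMP4) segments into a single MP4.
# Shows a common technique (binary concatenation of init + media
# segments, then an ffmpeg remux); paths and names are assumptions.
import subprocess
from pathlib import Path

segments = sorted(Path("session_download").glob("segment_*.m4s"))

# fMP4 media segments can be concatenated byte-wise after the init segment.
with open("stitched.mp4", "wb") as out:
    out.write(Path("session_download/init.mp4").read_bytes())
    for seg in segments:
        out.write(seg.read_bytes())

# Remux to clean up timestamps and indexes without re-encoding.
subprocess.run(["ffmpeg", "-i", "stitched.mp4", "-c", "copy",
                "session_final.mp4"], check=True)
```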
8. Results and Findings I
   Measurement of the significant metrics for each media player's or ABR algorithm's performance under the different network profiles.
   * Taraghi et al. (2021), NOSSDAV '21.
9. Results and Findings II
   Comparison of predicted MOS against perceived MOS for each media player or ABR algorithm with the Ramp Up network profile (Pearson's correlation coefficient: 0.94):

                    BBA0  BOLA  dash.js  Elastic  FastMPC  Quetra  Shaka
   Objective MOS    2.56  2.67  2.63     2.26     2.84     2.26    2.79
   Subjective MOS   3.62  3.73  3.65     3.45     3.68     3.41    3.73

   * Taraghi et al. (2021), NOSSDAV '21.
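The slide's correlation coefficient can be reproduced directly from the two MOS series above; the sketch below does so with scipy, which is my choice of tool, not necessarily the authors'.

```python
# Reproducing the slide's Pearson correlation between objective
# (predicted) and subjective (perceived) MOS for the Ramp Up profile.
from scipy.stats import pearsonr

# Per-player MOS values: BBA0, BOLA, dash.js, Elastic, FastMPC, Quetra, Shaka
objective  = [2.56, 2.67, 2.63, 2.26, 2.84, 2.26, 2.79]
subjective = [3.62, 3.73, 3.65, 3.45, 3.68, 3.41, 3.73]

r, p_value = pearsonr(objective, subjective)
print(f"PCC = {r:.2f}")  # -> 0.94, matching the slide
```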
10. Results and Findings III
   Comparison of predicted MOS against perceived MOS for each media player or ABR algorithm with the Fluctuation network profile (Pearson's correlation coefficient: 0.52):

                    BBA0  BOLA  dash.js  Elastic  FastMPC  Quetra  Shaka
   Objective MOS    2.22  1.86  1.99     2.07     1.91     1.98    1.98
   Subjective MOS   3.39  3.21  3.29     3.12     3.10     3.08    3.30

   * Taraghi et al. (2021), NOSSDAV '21.
11. In-Depth Studies (INTENSE)*
   ● Minimum Noticeable Stall event Duration (MNSD) evaluation: the minimum stall event duration that end-users can notice.
   ● Stall event vs. Quality level switch (SvQ) evaluation: we assessed end-user preference between these two scenarios.
   ● Short stall events vs. a Longer stall event (SvL) evaluation: we studied the impact of multiple short stall events, in contrast with a single longer stall event, on the QoE from both the predicted and the perceived MOS perspectives.
   ● Relation of Stall event impact on the QoE with Video Quality level (RSVQ) evaluation.
   ● Objective QoE models comparison.
   * Taraghi, B., Nguyen, M., Amirpour, H., & Timmerer, C. (2021). INTENSE: In-depth studies on stall events and quality switches and their impact on the Quality of Experience in HTTP adaptive streaming. IEEE Access, 9, 118087-118098.
12. Stall Events’ Patterns
   [Figure: stall event patterns]
   * Taraghi et al. (2021), IEEE Access 9.
13. MNSD Evaluation I
   The decrease in noticed stall events starts with stall events shorter than 0.301 seconds. More than 45% of the subjects could not notice stall events shorter than 0.051 seconds.
   * Taraghi et al. (2021), IEEE Access 9.
14. MNSD Evaluation II
   We determined that any stall event shorter than 0.004 seconds was not noticeable to the participants in the MNSD evaluation.
   * Taraghi et al. (2021), IEEE Access 9.
15. SvQ & RSVQ Evaluations
   Subjects tend to watch a higher-quality version even if it is obtained by adding a stall event with a duration of six seconds. Stall events incur only a minor penalty on the QoE when the video quality is low.
   * Taraghi et al. (2021), IEEE Access 9.
16. SvL Evaluation
   The analysis results demonstrate a preference for a single longer stall event over frequent short stall events with the same total duration.
   * Taraghi et al. (2021), IEEE Access 9.
17. QoE Models Comparison
   The ITU-T P.1203 model shows the best performance across all evaluations, with the highest PCC and SRCC (both above 0.8) and the smallest RMSE (0.326). A sketch of how these three figures of merit are computed follows below.
   * Taraghi et al. (2021), IEEE Access 9.
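PCC, SRCC, and RMSE are standard figures of merit for comparing a model's predicted MOS against subjective MOS. The sketch below computes all three; the two arrays are placeholders, not the study's data.

```python
# Computing the three figures of merit used to compare QoE models:
# Pearson (PCC), Spearman (SRCC), and root-mean-square error (RMSE).
# The arrays below are placeholders, not the study's actual data.
import numpy as np
from scipy.stats import pearsonr, spearmanr

predicted_mos  = np.array([3.1, 2.4, 4.0, 3.6, 1.9])  # a model's output
subjective_mos = np.array([3.3, 2.2, 4.2, 3.4, 2.1])  # ground truth

pcc, _ = pearsonr(predicted_mos, subjective_mos)
srcc, _ = spearmanr(predicted_mos, subjective_mos)
rmse = np.sqrt(np.mean((predicted_mos - subjective_mos) ** 2))

print(f"PCC={pcc:.3f}  SRCC={srcc:.3f}  RMSE={rmse:.3f}")
```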
18. Summary
   ● Overview of HTTP Adaptive Streaming
   ● Measurement of Quality of Experience
   ● Introduced CAdViSE: a cloud-based adaptive video streaming evaluation framework for the automated testing of media players
   ● Showcased "Understanding Quality of Experience of Heuristic-based HTTP Adaptive Bitrate Algorithms"
   ● Another use case for CAdViSE: "INTENSE: In-Depth Studies on Stall Events and Quality Switches and Their Impact on the Quality of Experience in HTTP Adaptive Streaming"
19. Thank you