
Video Quality Assessment and Comparative Evaluation of Peer-to-Peer Video Streaming Systems



  1. Video Quality Assessment and Comparative Evaluation of Peer-to-Peer Video Streaming Systems • Sachin Agarwal, Jatinder Pal Singh (Deutsche Telekom A.G. Laboratories, Berlin, Germany) • Aditya Mavlankar, Pierpaolo Baccichet, Bernd Girod (Stanford University, Stanford, CA, USA)

  2. Outline • Introduction to P2P live video streaming • Prior work on system performance assessment • Test-bed setup • Performance of tested systems

  3. P2P Live Video Streaming • Extension of P2P file-sharing • Low-cost and scalable delivery mechanism • Several deployed commercial implementations today • Increasing content / channels available

  4. Related Work on Performance Assessment • Networking related metrics, e.g. bandwidth usage, packet loss, continuity index, etc. • CoolStreaming [Zhang et al., 2005]: PlanetLab • PPLive [Hei et al., 2006]: packet sniffing and crawling • SopCast [Sentinelli et al., 2007]: “watching”, PlanetLab • . . . • No video PSNR results • No repeatable test conditions • Network conditions • Encoded video characteristics • Peer behavior • No fair head-to-head comparison of different systems
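The continuity index cited among the networking metrics above is commonly defined as the fraction of frames (or chunks) that arrive before their playback deadline. A minimal sketch of that definition (a hypothetical helper, not code from the cited studies):

```python
def continuity_index(arrival_times, deadlines):
    """Fraction of frames that arrive at or before their playback deadline.

    arrival_times[i] and deadlines[i] are in seconds for frame i.
    """
    on_time = sum(1 for a, d in zip(arrival_times, deadlines) if a <= d)
    return on_time / len(deadlines)
```

A continuity index of 1.0 means every frame was available in time for display.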

  5. Test-Bed Setup • [Slide shows a test-bed map; recoverable details:] • Test center and Servers 1, 2 in Berlin, Germany; 52 Mbps link to the Internet via an ISP datacenter in Erfurt, Germany • Peer groups (bandwidth in kbit/s X number of peers): Berlin, Germany (15) [emulated HS broadband]: 576 X 9, 1024 X 5, 2048 X 1 • Stanford, CA (22) [emulated HS broadband]: 576 X 16, 1024 X 5, 2048 X 1 • Berlin, Germany (8) [real HS broadband]: 128 X 2, 192 X 2, 576 X 2, 1024 X 2 • TU Munich, Germany (3) [emulated HS broadband]: 3072 X 3 • PLR, delay, jitter and bandwidth measured for representative real connections and emulated using the NISTNet traffic shaper

  6. Encoded Video Stream • La Dolce Vita (Fellini, 1960) • 24 fps, 352x240 pixels • H.264/AVC video codec, 400 kbit/sec CBR bitstream, 42 dB PSNR • I B B P B B P B B P . . . (I frame every second) • H.264 bitstream wrapped in Microsoft ASF container, if required by tested system • Last frame error concealment
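PSNR, the quality metric reported throughout these results (42 dB for the undisturbed encoded stream), is derived from the mean squared error between original and decoded pixel values. A minimal sketch assuming 8-bit samples flattened into sequences:

```python
import math

def psnr(original, decoded, max_val=255):
    """Peak signal-to-noise ratio in dB between two equal-length
    sequences of 8-bit pixel values."""
    mse = sum((a - b) ** 2 for a, b in zip(original, decoded)) / len(original)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * math.log10(max_val ** 2 / mse)
```

In the test-bed, frames that a peer fails to decode are concealed by repeating the last decoded frame, and PSNR is then computed against the original.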

  7. Peer Churn Model • 30-minute simulation run • During each 6-minute time-slot • Peer on with probability 0.9 • Peer off with probability 0.1 • Peer can switch off for the rest of the run with probability 0.05 • During last 5 minutes, peer off with probability 0.5
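One realization of the slide's on/off model can be drawn per peer and then replayed for every run. The sketch below makes one assumption not stated on the slide: the "off for the last 5 minutes with probability 0.5" rule is applied to the final 6-minute slot.

```python
import random

def churn_schedule(num_slots=5, p_off=0.1, p_perm_off=0.05,
                   p_off_last=0.5, rng=None):
    """One peer's on/off states over the 6-minute slots of a 30-minute run.

    Per slot: the peer may leave permanently (p_perm_off), otherwise it is
    off with p_off (p_off_last in the final slot) and on otherwise.
    """
    rng = rng or random.Random(0)  # fixed seed: same realization every run
    schedule, alive = [], True
    for slot in range(num_slots):
        if alive and rng.random() < p_perm_off:
            alive = False  # switched off for the rest of the run
        if not alive:
            schedule.append(False)
            continue
        p = p_off_last if slot == num_slots - 1 else p_off
        schedule.append(rng.random() >= p)
    return schedule
```

Fixing the seed matches the paper's requirement that the same realization of the on/off model is used for all runs, so systems see identical churn.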

  8. Representative Results • Tested systems • System A: Tree-based, push approach • System B: Mesh-based, data-driven or pull approach • Emulation runs • Run 1: with traffic shaping (using NISTNet) • Run 2: without traffic shaping • Same realization of peer On-Off model for all runs

  9. Pre-Roll Delay • About 30 sec enough for System A (tree-based) • About 60 sec enough for System B (mesh-based)

  10. PSNR Drop (w/ traffic shaping) • [Plots of PSNR drop for System A (tree-based) and System B (mesh-based)]

  11. PSNR Drop (w/o traffic shaping) • [Plots of PSNR drop for System A (tree-based) and System B (mesh-based)]

  12. Statistics of Frame Freezes • Frames frozen (as percentage of total frames to be displayed) • Average no. of distinct frame-freeze events per client in 30 min.
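Both freeze statistics above can be computed from a per-frame display log. A minimal sketch (hypothetical log format: one boolean per frame, False meaning the previous frame stayed on screen):

```python
def freeze_stats(displayed):
    """Return (percentage of frozen frames, number of distinct
    freeze events) for one client's display log."""
    frozen = sum(1 for d in displayed if not d)
    # a freeze event starts where a frozen frame follows a displayed one
    events = sum(1 for i, d in enumerate(displayed)
                 if not d and (i == 0 or displayed[i - 1]))
    return 100.0 * frozen / len(displayed), events
```

Counting events separately from total frozen frames distinguishes many short freezes from a few long ones, which is exactly the contrast drawn on the next slide.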

  13. Statistics of Frame Freezes (cont.) • System A (tree-based) employs content-aware prioritization • Long frame freezes more likely with System B (mesh-based)

  14. No. of Peers Failing to Decode a Frame • [Plots for System A (tree-based), Run 1, and System B (mesh-based), Run 1]

  15. Redundancy, Server Load and Parent-Peer Analysis • Redundancy (bytes received in excess of the required video-stream bytes) • System A (tree-based): 6% in both runs • System B (mesh-based): 35% and 20% in Run 1 (w/ traffic shaping) and Run 2 (w/o traffic shaping), respectively • For both systems, a peer receives on average less than 10% of its data directly from the server; slightly more for Run 2 of System B • System A (tree-based): sustained downloads from fewer parent peers
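The redundancy figures above follow directly from the slide's definition: bytes received in excess of the required video-stream bytes, expressed relative to those required bytes. A minimal sketch of that computation:

```python
def redundancy(bytes_received, stream_bytes):
    """Fraction of received bytes that exceed the required
    video-stream bytes (e.g. 0.06 means 6% redundancy)."""
    return (bytes_received - stream_bytes) / stream_bytes
```

For example, a mesh-based system that pulls overlapping chunks from several parents receives duplicate data, inflating this ratio relative to a push-based tree.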

  16. Summary • Proposed methodology measures video PSNR, buffering time, frame-freeze statistics, number of peers failing to decode a frame, etc., in addition to network usage, packet loss, etc. • Test conditions chosen by analyzing real-world conditions; experiments are repeatable • Tested three commercial-grade P2P video streaming systems • Room for improvement in current systems: • Long buffering time (10s of seconds) • Display freezes for more than 100 frames • Tested tree-based system outperforms mesh-based system in: • Redundancy • Buffering time

  17. Thank you! http://www.stanford.edu/~maditya/publication.html • Related: [Agarwal et al., TRIDENTCOM 2008]
