
Measuring Congestion Responsiveness of Windows Streaming Media


Presentation Transcript


  1. Measuring Congestion Responsiveness of Windows Streaming Media James Nichols Advisors: Prof. Mark Claypool Prof. Bob Kinicki Reader: Prof. David Finkel Thesis Presentation PEDS - 12/8/03

  2. Network Impact of Streaming Media • Unlike file transfer or Web browsing, Streaming Media has specific bitrate and timing requirements. • Typically, UDP is the default network transport protocol for delivering Streaming Media. • UDP does not have any end-to-end congestion control mechanisms.

  3. The Dangers of Unresponsiveness • Flows in the network which are unresponsive to congestion can cause several undesirable situations: • Unfairness when competing with responsive flows for limited resources • Unresponsive flows can contribute to congestion collapse • Some Streaming Media applications use UDP, but rely on the application layer to provide adaptability to available capacity • Performance of these application layer mechanisms is unknown

  4. Intelligent Streaming • Application layer mechanism of Windows Streaming Media (WSM) to adapt to network conditions • Can “thin” streams by sending fewer frames • If the content producer has encoded multiple bitrates into the stream, IS can choose an appropriate one • Chung et al. suggest that technologies like IS may provide responsiveness to congestion, even TCP-friendliness • Performance of Intelligent Streaming is unknown
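
What this could look like, purely as an illustration of the behavior described on the slide (this is not WSM's actual Intelligent Streaming algorithm, which is proprietary and is precisely what this thesis measures):

    # Illustrative sketch only: pick the largest encoded bitrate that the
    # estimated capacity allows, otherwise fall back to the smallest rate
    # and "thin" the stream. Not WSM's actual Intelligent Streaming logic.
    def select_bitrate(encoded_kbps, estimated_capacity_kbps):
        fitting = [b for b in sorted(encoded_kbps) if b <= estimated_capacity_kbps]
        if fitting:
            return fitting[-1], False          # chosen rate, no thinning needed
        return min(encoded_kbps), True         # nothing fits: smallest rate, thin frames

    rate, thin = select_bitrate([28, 43, 58, 109, 148, 282, 340, 548, 764, 1128], 300)
    print(rate, thin)                          # -> 282 False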

  5. Research Goals • No measurement studies have been completed where researchers had total control over: • The streaming server • Content-encoding parameters • Network conditions at or close to the server • We seek to characterize the bitrate response function of Windows Streaming Media under network congestion • We want to precisely quantify the relationship between content encoding rate and performance

  6. Outline • Introduction • Related Work ← • Methodology • Results & Analysis • Conclusions • Future Work

  7. Related Work • Some research has been done in the general area of Streaming Media: • Traffic characterization studies [VAM+02, dMSK02] performed through log analysis • Empirical studies using custom tools [CCZ03, WCZ01, LCK02] • Characterization of streaming content available on the Web [MediaCrawler] • None had control of the server, client, and network conditions

  8. Need control over the server • Not having the server limits the possible data set of content to study • For example, [LCK02] measured IP packet fragmentation when streaming WSM clips, but packet size can be tuned server-side • Other research [CCZ03] had to stream over the public Internet while measuring network performance

  9. Outline • Introduction • Related Work • Methodology ← • Results & Analysis • Conclusions • Future Work

  10. Methodology • Construct testbed • Create/adapt tools • Encode content • Systematic control • Examine SBR clip • Range of SBR clips • MBR clips • Vary loss and latency
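
The "systematic control" and "vary loss and latency" steps could be scripted today roughly as below. This is only a sketch using Linux tc/netem on a router in the path; it is not the emulation setup actually used in the thesis, and the interface name and parameter values are placeholders:

    # Sketch only: impose latency/loss with netem and a capacity limit with tbf.
    # Interface and values are placeholders, not the thesis testbed's settings.
    import subprocess

    def configure_link(iface="eth1", rate_kbps=725, delay_ms=45, loss_pct=0.0):
        subprocess.run(["tc", "qdisc", "del", "dev", iface, "root"], check=False)
        subprocess.run(["tc", "qdisc", "add", "dev", iface, "root", "handle", "1:",
                        "netem", "delay", f"{delay_ms}ms", "loss", f"{loss_pct}%"],
                       check=True)
        subprocess.run(["tc", "qdisc", "add", "dev", iface, "parent", "1:1", "handle", "10:",
                        "tbf", "rate", f"{rate_kbps}kbit", "burst", "32kbit",
                        "latency", "400ms"], check=True)

    configure_link()    # 725 Kbps bottleneck, 45 ms delay, 0% induced loss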

  11. Results and Analysis: Single Bitrate Clip

  12. Experiments • Single bitrate (SBR) clip in detail ← • Range of SBR clips • Multiple bitrate (MBR) clips • Additional experiments performed but not discussed here

  13. Single Bitrate Clip Experiment • Hypothesis: SBR clips are unresponsive to congestion • Latency: 45 ms • Induced loss: 0% • Bottleneck capacity: 725 Kbps • Start a TCP flow through the link • 10 seconds later, stream a WSM clip • Measure achieved bitrates and loss rates for each flow
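
Scripted, the run sequence on this slide looks roughly like the sketch below; iperf, tcpdump, the host addresses, and the start_wsm_client.sh helper are all placeholders, not the tools actually used in the thesis:

    # Sketch of the per-run timeline: capture packets, start the TCP flow,
    # wait 10 s, then stream the clip. All commands and paths are placeholders.
    import subprocess, time

    capture = subprocess.Popen(["tcpdump", "-i", "eth0", "-w", "run.pcap"])
    tcp_flow = subprocess.Popen(["iperf", "-c", "10.0.0.2", "-t", "180"])
    time.sleep(10)                              # TCP flow gets a 10 s head start
    subprocess.run(["./start_wsm_client.sh", "mms://10.0.0.2/clip_340kbps.wmv"])
    tcp_flow.terminate()
    capture.terminate()
    # Achieved bitrate and loss rate per flow are computed offline from run.pcap.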

  14. 340 Kbps Clip - Bottleneck Capacity 725 Kbps. TCP-friendly? Packet loss < 0.001 after 15 seconds.
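
For context (not stated on the slide), "TCP-friendly" is usually judged against the throughput a conformant TCP connection would achieve at the same loss rate and round-trip time; a common rule-of-thumb bound is:

    % Simple TCP-friendly throughput bound (Mathis-style approximation):
    % s = packet size, RTT = round-trip time, p = loss event rate.
    T \le \frac{1.22\, s}{\mathrm{RTT}\,\sqrt{p}}

A flow whose sending rate stays at or below T for the measured p and RTT is considered TCP-friendly.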

  15. 548 Kbps Clip - Bottleneck Capacity 725 Kbps. Not TCP-friendly! Packet loss ~0.003 for WSM, ~0.006 for TCP after 15 seconds.

  16. 1128 Kbps Clip - Bottleneck Capacity 725 Kbps. Responsive!

  17. Network Topology

  18. Measuring Buffering Performance • Parse the packet capture for the RTSP PLAY message • Examine MediaTracker output to measure how long it took from the start of streaming until the buffer is reported full • PLAY + interval = buffering period
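
A sketch of how this extraction could be automated; dpkt for the packet capture and the buffer-fill interval value are assumptions here, not the thesis's actual tooling:

    # Sketch: locate the RTSP PLAY request in a capture, then add the
    # buffer-fill interval reported by the client-side logger.
    # dpkt usage and the interval value are assumptions.
    import dpkt

    def rtsp_play_time(pcap_path):
        """Timestamp of the first packet whose TCP payload starts with 'PLAY '."""
        with open(pcap_path, "rb") as f:
            for ts, buf in dpkt.pcap.Reader(f):
                ip = dpkt.ethernet.Ethernet(buf).data
                if isinstance(ip, dpkt.ip.IP) and isinstance(ip.data, dpkt.tcp.TCP) \
                        and bytes(ip.data.data).startswith(b"PLAY "):
                    return ts
        return None

    play_ts = rtsp_play_time("run.pcap")
    fill_interval = 4.2                              # seconds until "buffer full" (placeholder value)
    buffering_period_end = play_ts + fill_interval   # PLAY + interval = buffering period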

  19. Experiments • SBR clip in detail • Range of SBR clips ← • MBR clips

  20. Comparison of Single Bitrate Clips • Want to precisely quantify the relationship between content encoding rate and performance • Repeat the previous experiment, but for a range of single bitrate clips: • 28, 43, 58, 109, 148, 282, 340, 548, 764, 1128 Kbps • Vary network capacity: 250, 725, 1500 Kbps • Measure performance during and after buffering
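
The comparison is a sweep over clip encodings and bottleneck capacities; a runnable skeleton is below, where the per-run body is only a stub standing in for the procedure on the earlier slide:

    # Sketch of the experiment sweep over clip encodings and capacities.
    SBR_CLIPS_KBPS = [28, 43, 58, 109, 148, 282, 340, 548, 764, 1128]
    CAPACITIES_KBPS = [250, 725, 1500]

    def run_single_experiment(clip_kbps, capacity_kbps):
        # Placeholder: set the bottleneck, start the TCP flow, stream the clip,
        # and record bitrates/loss during buffering and playout.
        print(f"clip={clip_kbps} Kbps, capacity={capacity_kbps} Kbps")

    for capacity in CAPACITIES_KBPS:
        for clip in SBR_CLIPS_KBPS:
            run_single_experiment(clip, capacity)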

  21. SBR Clips - Bottleneck Capacity 725 Kbps - Buffering Period

  22. SBR Clips - Bottleneck Capacity 725 Kbps - Playout Period

  23. Results and Analysis: Multiple Bitrate Clips

  24. Multiple Bitrate Clips • Hypothesis: Multiple Bitrates make WSM more responsive to congestion • Same experiment as before, but with different encoded content • Vary network capacity: 250, 725, 1500 Kbps • Created two sets of 10 multiple bitrate clips • Experiments with lots of other MBR clips

  25. Multiple Bitrate Content • First set of clips (adding lower): • 1128 Kbps • 1128-764 Kbps • 1128-764-548 Kbps • … • 1128-764-548-340-282-148-109-58-43-28 Kbps • Second set of clips (adding higher): • 28 Kbps • 28-43 Kbps • 28-43-58 Kbps • … • 28-43-58-109-148-282-340-548-764-1128 Kbps

  26. Adding lower bitrates to clip - 250 Kbps Bottleneck Capacity - Buffering Period

  27. Adding lower bitrates to clip - 250 Kbps Bottleneck Capacity - Playout Period

  28. Adding lower bitrates to clip - 725 Kbps Bottleneck Capacity - Buffering and Playout

  29. Adding higher bitrates to clip - 725 Kbps Bottleneck Capacity - Buffering and Playout

  30. Additional experiments • Not enough time to discuss all the results • Different bottleneck capacities • Carefully choose 2 or 3 bitrates to include in MBR clips • Vary loss rate • Vary latencies • Also look at other network level metrics: interarrival times, burst lengths, and IP fragmentation
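
Two of those network-level metrics are straightforward to compute from per-flow packet traces; a small sketch follows (the loss-burst definition here is an assumption about what was measured):

    # Sketch: packet interarrival times from timestamps, and loss burst lengths
    # from a per-packet received/lost sequence. Definitions are illustrative.
    def interarrival_times(timestamps):
        return [t2 - t1 for t1, t2 in zip(timestamps, timestamps[1:])]

    def loss_burst_lengths(received_flags):
        """received_flags: sequence of booleans, False = packet lost."""
        bursts, run = [], 0
        for ok in received_flags:
            if not ok:
                run += 1
            elif run:
                bursts.append(run); run = 0
        if run:
            bursts.append(run)
        return bursts

    print(interarrival_times([0.00, 0.02, 0.05, 0.09]))           # ≈ [0.02, 0.03, 0.04]
    print(loss_burst_lengths([True, False, False, True, False]))  # [2, 1]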

  31. Conclusions • Prominent buffering period means WSM cannot be modeled as a simple CBR flow • WSM single bitrate clips: • During buffering, WSM responds to capacity only when the encoding rate is less than capacity • Otherwise, high loss rates are induced • During playout, WSM responds to available capacity • Thins if necessary • If rate is less than capacity, will still be responsive to high loss rates (5%)

  32. Conclusions • WSM multiple bitrate clips: • During buffering, WSM responds to capacity only when the content contains a suitable bitrate to choose • Chosen bitrate is the largest that capacity allows • Otherwise, it falls back to the smallest available bitrate, again resulting in high loss • During playout, WSM is responsive to available capacity • Either because it chose the proper rate, or because it thins if the proper rate isn’t encoded in the clip • However, the chosen bitrate probably isn’t fair to TCP

  33. Future Work • Run the same experiments with other streaming technologies: RealVideo and QuickTime • Examine the effects of different content types • Build an NS simulation model of streaming media for use in future research

  34. Questions
