Proxy Caching Mechanism for Multimedia Playback Streams in the Internet R. Rejaie, M. Handley, H. Yu, D. Estrin USC/ISI http://netweb.usc.edu/reza/ WCW’99 April 1, 1999
Motivation • Rapid growth in deployment of real-time streams (audio/video) over the Internet • Goals • Maximize the quality of the delivered stream • Minimize startup latency • Low-latency VCR-functionality • Minimize the load on the server & the network
Outline • An End-to-end Architecture • Multimedia Proxy Caching • Conclusion • Future Directions
Streaming Applications in Best-effort Networks (The Internet) • End-to-end congestion control is crucial for stability, fairness & high utilization • Results in variable transmission rate • Streaming applications require constant average consumption rate • Streaming applications should be quality adaptive
Quality Adaptation (QA) • Buffering only absorbs short-term variations • A long-lived session could result in buffer overflow or underflow • QA is complementary to buffering • Adjust the quality (rate) with long-term variations • Layered framework [Figure: transmission bandwidth BW(t) over time]
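As a concrete illustration of the idea above, here is a minimal sketch of a layered quality-adaptation decision: buffering absorbs short-term bandwidth variation, and layers are added or dropped only in response to long-term changes. The layer rate, buffer thresholds, and function name are illustrative assumptions, not the algorithm from the paper.

```python
# Hypothetical layered quality-adaptation step (a sketch, not the paper's algorithm).
LAYER_RATE = 32_000      # bytes/sec per layer (assumed, all layers equal)
ADD_THRESHOLD = 5.0      # seconds of buffered data needed before adding a layer
DROP_THRESHOLD = 1.0     # seconds of buffering below which a layer is dropped

def adapt_quality(active_layers, avail_bw, buffered_sec):
    """Return the new number of active layers, given the long-term available
    bandwidth (bytes/sec) and the amount of buffered data (seconds)."""
    consumption = active_layers * LAYER_RATE
    if avail_bw > consumption + LAYER_RATE and buffered_sec > ADD_THRESHOLD:
        return active_layers + 1          # long-term surplus: add a layer
    if avail_bw < consumption and buffered_sec < DROP_THRESHOLD:
        return max(1, active_layers - 1)  # buffer is draining: drop a layer
    return active_layers                  # short-term variation: buffering absorbs it
```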
The End-to-end Architecture [Figure: server side (archive, quality adaptation, congestion control, error control, buffer manager, transmission buffer) and client side (acker, buffer manager, adaptation buffer, playback buffer, decoder) connected across the Internet, with separate data and control paths]
Limitation • Delivered quality is limited to the average bandwidth between the server and client • Solutions: • Mirror servers • Proxy caching [Figure: quality (layers L0 to L4) over time; clients reach the server through an ISP and the Internet]
Multimedia Proxy Caching • Assumptions • Proxy can perform: • End-to-end congestion control • Quality adaptation • Goals of proxy caching • Improve delivered quality • Low-latency VCR-functions • Natural benefits of caching [Figure: clients connect to the proxy, which reaches the server across the Internet]
Challenge • Cached streams have variable quality • Layered organization provides an opportunity for adjusting quality [Figure: quality (number of active layers, L0 to L4) over time for the played-back stream vs. the stored stream]
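One way to see why cached quality varies, and how a layered organization makes it adjustable, is to store a cached stream as per-layer sets of time-indexed segments: each playback (served at whatever bandwidth was available) leaves a different subset of layers cached over different intervals. The structure below is a hypothetical sketch, not the paper's implementation.

```python
from collections import defaultdict

class CachedStream:
    """A cached layered stream: layer id -> set of cached segment indices.
    The quality at a given playback point is how many consecutive layers
    (starting from the base layer) are cached for that segment."""

    def __init__(self, num_layers):
        self.num_layers = num_layers
        self.segments = defaultdict(set)   # layer id -> {segment index, ...}

    def store(self, layer, segment):
        self.segments[layer].add(segment)

    def quality_at(self, segment):
        q = 0
        while q < self.num_layers and segment in self.segments[q]:
            q += 1
        return q
```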
Issues • Delivery procedure • Relaying on a cache miss • Pre-fetching on a cache hit • Replacement algorithm • Determining popularity • Replacement pattern
Cache Miss Scenario • Stream is located at the original server • Playback from the server through the proxy • Proxy relays and caches the stream • No benefit in a miss scenario
Cache Hit Scenario • Playback from the proxy cache • Lower latency • May have better quality! • Available bandwidth allows: • Lower quality playback • Higher quality playback
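The miss and hit cases above can be summarized in one delivery procedure. The sketch below is an assumed structure (the cache layout and the fetch/send helpers are hypothetical), not the paper's code: on a miss the proxy relays and caches; on a hit it serves from the cache and pre-fetches missing pieces on demand.

```python
STREAM_SEGMENTS = 100   # hypothetical stream length, in segments

def handle_request(cache, stream_id, fetch_from_server, send_to_client):
    """Illustrative proxy delivery procedure.

    cache             : dict mapping stream_id -> {segment index: data}
    fetch_from_server : callable(stream_id, segment) -> data
    send_to_client    : callable(data)
    """
    cached = cache.get(stream_id)
    if cached is None:
        # Cache miss: relay the stream from the original server and cache it
        # as it flows through the proxy (no startup-latency benefit here).
        cached = cache[stream_id] = {}
        for seg in range(STREAM_SEGMENTS):
            data = fetch_from_server(stream_id, seg)
            cached[seg] = data
            send_to_client(data)
    else:
        # Cache hit: serve from the proxy for lower startup latency; pieces the
        # playback needs but the cache lacks are pre-fetched on demand, and
        # pre-fetched data is always cached.
        for seg in range(STREAM_SEGMENTS):
            data = cached.get(seg)
            if data is None:
                data = fetch_from_server(stream_id, seg)   # on-demand pre-fetch
                cached[seg] = data
            send_to_client(data)
```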
Lower quality playback • Missing pieces of the active layers are pre-fetched on demand • Required pieces are identified by QA • Results in smoothing [Figure: quality (number of active layers, L0 to L4) over time, showing the stored stream, the played-back stream, and pre-fetched data]
Higher quality playback • Pre-fetch higher layers on demand • Pre-fetched data is always cached • Must pre-fetch a missing piece before its playback time • Tradeoff [Figure: quality (number of active layers, L0 to L4) over time, showing the stored stream, the played-back stream, and pre-fetched data]
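The deadline constraint in the higher-quality case can be made explicit: a missing piece is only worth pre-fetching if it can arrive from the server before its playback time. The sketch below assumes the proxy knows (or estimates) the round-trip time and bandwidth to the server; all names and parameters are illustrative.

```python
def should_prefetch(playback_time, now, server_rtt, piece_bytes, avail_bw):
    """True if a missing piece can be fetched before it is due for playback.
    playback_time, now : seconds on the same clock
    server_rtt         : proxy-to-server round-trip time (seconds)
    piece_bytes        : size of the missing piece
    avail_bw           : proxy-to-server bandwidth (bytes/sec), assumed known
    """
    fetch_time = server_rtt + piece_bytes / avail_bw
    return fetch_time < (playback_time - now)

def schedule_prefetch(missing_pieces, now, server_rtt, avail_bw):
    """Keep only the pieces that can make their deadline, ordered by deadline.
    missing_pieces: list of (playback_time, piece_bytes, layer, segment) tuples."""
    return [p for p in sorted(missing_pieces)
            if should_prefetch(p[0], now, server_rtt, p[1], avail_bw)]
```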
Replacement Algorithm • Goal: converge the cache state to the optimal state • Average quality of a cached stream depends on • popularity • average bandwidth between the proxy and recently interested clients • Variation in quality inversely depends on • popularity
Popularity • Number of hits during an interval • User's level of interest (including VCR-functions) • Potential value of a layer for quality adaptation • Calculate the weighted hit (whit) on a per-layer basis: whit = PlaybackTime(sec) / StreamLength(sec) • Layered encoding guarantees a monotonic decrease in popularity from lower to higher layers
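A small sketch of how the weighted hit defined above could be accumulated per layer over a measurement interval; the bookkeeping (per-session, per-layer playback time) is an assumption made for illustration.

```python
from collections import defaultdict

class LayerPopularity:
    """Per-layer popularity as the sum of weighted hits over an interval, where
    whit = playback time delivered (sec) / stream length (sec). A client that
    watches half of a stream contributes 0.5, and VCR operations reduce or
    reshape the contribution. Because higher layers are never played without
    the lower ones, per-layer popularity decreases monotonically with layer."""

    def __init__(self, stream_length_sec):
        self.stream_length = stream_length_sec
        self.popularity = defaultdict(float)     # layer id -> accumulated whit

    def record_session(self, per_layer_playback_sec):
        """per_layer_playback_sec: layer id -> seconds of that layer delivered."""
        for layer, played in per_layer_playback_sec.items():
            self.popularity[layer] += played / self.stream_length

    def reset_interval(self):
        self.popularity.clear()
```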
Replacement Pattern • Multi-valued replacement decision for a multimedia object • Coarse-grain flushing • on a per-layer basis • Fine-grain flushing • on a per-segment basis [Figure: cached segments shown as quality (layer) over time; coarse-grain flushing removes whole layers, fine-grain flushing removes segments]
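A sketch of the two flushing granularities applied to a per-layer segment cache (the data layout is assumed): coarse-grain flushing drops the highest cached layer in one step, while fine-grain flushing trims segments from the tail of the highest cached layer.

```python
def coarse_grain_flush(cached_layers):
    """Drop the highest (least valuable) cached layer in its entirety.
    cached_layers: dict mapping layer id -> list of cached segment indices."""
    if cached_layers:
        del cached_layers[max(cached_layers)]

def fine_grain_flush(cached_layers, segments_to_free):
    """Trim segments from the tail of the highest cached layer, moving down
    to the next layer only when the current one is exhausted."""
    freed = 0
    while freed < segments_to_free and cached_layers:
        top = max(cached_layers)
        segs = cached_layers[top]
        while segs and freed < segments_to_free:
            segs.pop()        # flush from the end of the layer
            freed += 1
        if not segs:
            del cached_layers[top]
```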
Conclusion • End-to-end architecture for delivery of quality-adaptive multimedia streams • Congestion control & quality adaptation • Proxy caching mechanism for multimedia streams • Pre-fetching • Replacement algorithm • State of the cache converges to the optimal state
Future Directions • Extensive simulation (using VINT/ns) • e.g. access patterns, the bandwidth distribution • Exploring other replacement patterns • Chunk-based popularity function
Alternative Replacement Algorithm • Goal: to cache the popular portion of each stream • Keep track of per-chunk popularity • Identify a victim chunk • Apply the same replacement pattern within the victim chunk
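A sketch of how the chunk-based alternative might work: popularity is tracked per chunk (a fixed-length interval of the stream), the least popular cached chunk becomes the victim, and the per-layer/per-segment pattern above is then applied inside it. The chunk length and the tracking details are assumptions for illustration.

```python
from collections import defaultdict

CHUNK_SEC = 10.0   # hypothetical chunk length in seconds

class ChunkPopularity:
    def __init__(self):
        self.hits = defaultdict(float)    # chunk index -> accumulated hits

    def record_playback(self, start_sec, end_sec):
        """Credit every chunk touched by a playback interval (VCR jumps simply
        produce several shorter intervals)."""
        first = int(start_sec // CHUNK_SEC)
        last = int(end_sec // CHUNK_SEC)
        for chunk in range(first, last + 1):
            self.hits[chunk] += 1.0

    def victim_chunk(self, cached_chunks):
        """Pick the cached chunk with the lowest popularity as the victim."""
        if not cached_chunks:
            return None
        return min(cached_chunks, key=lambda c: self.hits[c])
```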