Anatomy of a Personalized Livestreaming System
Bolun Wang, Xinyi Zhang, Gang Wang*, Haitao Zheng, Ben Y. Zhao
UC Santa Barbara, *Virginia Tech
bolunwang@cs.ucsb.edu
Personalized Livestreaming
[Images: example streams — travel, Hurricane Matthew, the NFL.]
Interactivity is the Key!
Personalized Livestreaming is Hard
[Diagram: traditional livestreaming pushes video to viewers through a video CDN with low video delay; personalized livestreaming adds real-time interactions flowing back from viewers to the broadcaster.]
Is scalability going to be a problem?
Outline • Scale, growth, usage pattern • Periscope and Meerkat • Fundamental tradeoffs between scalability and interactivity • Dissection of Periscope • Recent changes of Periscope
Available Data
• Interactions within broadcasts • Comments and hearts • Viewer joins • All events have timestamps
• Livestream video • Arrival time of each video frame
• Social graph • Follower/followee relationships
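To make these record types concrete, here is a minimal sketch of how the collected events could be represented; the field names are our own illustrative assumptions, not the services' actual schema.

```python
# Illustrative record types for the collected data. Field names are
# assumptions for exposition, not Periscope's or Meerkat's actual schema.
from dataclasses import dataclass

@dataclass
class InteractionEvent:
    broadcast_id: str
    user_id: str
    kind: str         # "comment", "heart", or "join"
    timestamp: float  # all events carry timestamps
    text: str = ""    # comment body; empty for hearts and joins

@dataclass
class VideoFrame:
    broadcast_id: str
    sequence: int
    arrival_time: float  # when the frame reached the measurement client

@dataclass
class FollowEdge:
    follower_id: str
    followee_id: str
```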
Data Collection Methodology
• Broadcast metadata • Scripts fully emulating Periscope and Meerkat API calls • Monitor the public broadcast list for new broadcasts
• Interactions within broadcasts • Join new broadcasts and collect their interactions
• We interacted frequently with Periscope and Meerkat • They are aware of our research • We identified security vulnerabilities
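Below is a minimal sketch of the monitor-and-join loop described above. The base URL, endpoints, and JSON fields are hypothetical placeholders; the actual scripts emulate the services' own (undocumented) API calls.

```python
# Sketch of the crawl loop: poll the public broadcast list, join new
# broadcasts as a viewer, and record their interaction events.
# All endpoints and JSON fields below are hypothetical placeholders.
import time
import requests

API = "https://api.example-livestream.com"  # placeholder base URL

def poll_public_list(seen: set) -> list:
    """Fetch the public broadcast list; return broadcasts not seen before."""
    resp = requests.get(f"{API}/broadcasts/public", timeout=10)
    fresh = [b for b in resp.json() if b["id"] not in seen]
    seen.update(b["id"] for b in fresh)
    return fresh

def collect_interactions(broadcast_id: str) -> list:
    """Join a broadcast and pull its comment/heart/join events."""
    resp = requests.get(f"{API}/broadcasts/{broadcast_id}/events", timeout=10)
    return resp.json()

seen: set = set()
while True:
    for broadcast in poll_public_list(seen):
        events = collect_interactions(broadcast["id"])
        # ... persist metadata and timestamped events here ...
    time.sleep(60)  # re-poll the public list periodically
```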
Broadcast Interaction Dataset Size
[Timeline: Meerkat data collected May 12 – June 15, 2015; Periscope data collected May 15 – Aug. 20, 2015; both datasets publicly available.]
• Meerkat: all public broadcasts for 1+ month • 164K broadcasts from 57K broadcasters • 1.9M comments and 525K hearts
• Periscope: all public broadcasts for 3+ months • 19.6M broadcasts from 1.8M broadcasters • 549M comments and 7.98B hearts
Scale of Livestreaming Services
• Large daily broadcast volume • Periscope: 472K daily broadcasts in May 2016 • Periscope's volume tripled in 3 months
• Meerkat's volume halved in 1 month • Meerkat shut down in Sep. 2016
Broadcasts are Highly Interactive
• ~1.3M Periscope broadcasts have >100 comments and >1K hearts
[Chart: per-broadcast interaction counts, peaking around 134K comments and 1.35M hearts; hearts outnumber comments by roughly 10–100x.]
• More hearts than comments • Caused by the "100-commenter limit" • Only the first 100 viewers can comment • Fewer comments ≠ viewers not wanting to comment
• The commenter limit is constraining, especially in popular broadcasts • 57.6K broadcasts have >1K viewers • Trending broadcasts reach >700K viewers
• This limit is correlated with Periscope's CDN design
Outline • Scale, growth, usage pattern • Periscope and Meerkat • Fundamental tradeoffs between scalability and interactivity • Dissection of Periscope • Recent changes of Periscope
Anatomy of Periscope
[Diagram: the broadcaster feeds two channels. A video channel (Wowza + Fastly) carries video and comments to viewers, with long latency for some; a messaging channel (PubNub) carries hearts near real-time.]
Livestream Delivery Infrastructure
[Diagram: the broadcaster uploads video frames to Wowza over RTMP; Wowza pushes frames to some viewers over RTMP, and groups frames into chunks that Fastly serves to the remaining viewers over HLS.]
• RTMP for a smaller audience • The sender pushes each video frame to the receiver • More viewers = higher load
• HLS for a larger audience • The receiver polls the server for large video chunks • Scalable, but long delay
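The HLS side of this split follows the standard pull pattern: the client repeatedly re-fetches the playlist and downloads any chunks it has not yet seen. A minimal sketch (placeholder URL, simplified m3u8 parsing):

```python
# Sketch of the receiver-side HLS polling pattern described above.
# The playlist URL is a placeholder and playlist parsing is simplified.
import time
import requests

PLAYLIST_URL = "https://cdn.example.com/stream/playlist.m3u8"  # placeholder
BASE = PLAYLIST_URL.rsplit("/", 1)[0]

downloaded = set()
while True:
    playlist = requests.get(PLAYLIST_URL, timeout=10).text
    # HLS playlists list chunk URIs on the non-comment lines
    chunks = [l for l in playlist.splitlines() if l and not l.startswith("#")]
    for uri in chunks:
        if uri not in downloaded:
            url = uri if uri.startswith("http") else f"{BASE}/{uri}"
            requests.get(url, timeout=10)  # fetch the video chunk
            downloaded.add(uri)
    time.sleep(3)  # the polling interval: one key source of HLS delay
```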
Implications of the Infrastructure
• Only the first 100 viewers can comment. Why?
[Diagram: the first 100 viewers connect to Wowza over RTMP → low latency → able to comment; later viewers connect to Fastly over HLS → long latency → lagging comments, so commenting is disabled for them.]
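A hedged sketch of the assignment logic this implies; the structure and the constant are inferred from the measurements above, not taken from Periscope's code:

```python
# Sketch of the inferred protocol-assignment rule: the first 100 viewers
# are served over low-latency RTMP (and may comment); later arrivals are
# served over HLS (and may not). This is our reading of the measurements.
RTMP_LIMIT = 100

def assign_channel(viewer_rank: int) -> dict:
    """viewer_rank: 0-based join order within the broadcast."""
    if viewer_rank < RTMP_LIMIT:
        return {"protocol": "RTMP", "server": "wowza", "can_comment": True}
    return {"protocol": "HLS", "server": "fastly", "can_comment": False}
```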
End-to-end Delay in Lab Settings
• 10 broadcasts over stable lab Wi-Fi
• RTMP (Wowza): 1.45s • HLS (Fastly): 11.67s
• Which components of the system cause the long delay in HLS?
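One plausible way to take these measurements (an assumed setup, not necessarily the paper's exact instrumentation) is to stamp each frame with its capture time on the broadcaster and compare against the arrival clock on a viewer whose clock is synchronized:

```python
# Sketch of per-frame end-to-end delay measurement under the assumption
# that broadcaster and viewer clocks are synchronized (same lab).
import time

send_times: dict = {}
delays: dict = {}

def on_frame_sent(frame_id: int) -> None:
    send_times[frame_id] = time.time()   # broadcaster stamps capture time

def on_frame_received(frame_id: int) -> None:
    delays[frame_id] = time.time() - send_times[frame_id]

# Averaging delays over 10 broadcasts on stable lab Wi-Fi yields the
# per-protocol figures above (~1.45s for RTMP vs. ~11.67s for HLS).
```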
Delay Breakdown in Lab Settings
[Figure: stacked per-component delay along each path. RTMP (Wowza) components: upload, last mile, buffering (≈1.16s in total). HLS (Fastly) components: upload, chunking, Wowza-to-Fastly transfer, polling, last mile, buffering; measured segments of 1.17s, 6.88s, and 3.01s, with chunking, polling, and buffering dominating.]
How to Reduce HLS Delay?
• Reduce the major delay components • Chunking delay: smaller chunks • Polling delay: more frequent polling • Buffering delay: less buffering
• Use measurement to understand • How do system parameters affect the delay components? • What other effects do they have on the system?
• Simulation on broadcasts in the wild • 16K randomly selected broadcasts • Scripts emulate viewers (see the sketch below)
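A simplified sketch of that simulation idea, assuming an additive delay model in which a frame waits for its chunk to close, then for the next poll, then for the client buffer; the model and the base-delay constant are our assumptions:

```python
# Sketch of a parameter sweep over a simplified additive HLS delay model.
# base_delay stands in for upload + transfer + last-mile (an assumption).
def estimate_hls_delay(chunk_seconds: float, poll_interval: float,
                       buffer_seconds: float, base_delay: float = 1.2) -> float:
    chunking = chunk_seconds      # a frame waits for its chunk to close
    polling = poll_interval / 2   # expected wait until the client's next poll
    return base_delay + chunking + polling + buffer_seconds

# Sweep chunk size and polling interval (as the simulations do over the
# 16K sampled broadcast traces):
for chunk in (1, 2, 3):
    for poll in (1, 2, 3):
        print(chunk, poll, estimate_hls_delay(chunk, poll, buffer_seconds=6))
```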
Optimizing Client-side Buffering
• Simulation: buffering delay vs. playback smoothness
• Strategy: pre-buffer P seconds before playback
• Stalling time: duration of video freezing
• Result: save ~3s of buffering delay with low impact on smoothness
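A minimal sketch of the pre-buffering simulation: given a trace of chunk arrival times, start playback P seconds after the first arrival and count how long the player stalls waiting for late chunks. The trace format and the 3-second chunk duration are illustrative assumptions:

```python
# Sketch of the stalling-time computation for a given pre-buffer P.
# arrivals: wall-clock times at which successive chunks arrive (seconds).
def stalling_time(arrivals: list, prebuffer: float,
                  chunk_len: float = 3.0) -> float:
    playhead = arrivals[0] + prebuffer  # playback starts after pre-buffering
    stalled = 0.0
    for arrival in arrivals:
        if arrival > playhead:          # chunk arrived after it was needed
            stalled += arrival - playhead
            playhead = arrival          # player freezes until it arrives
        playhead += chunk_len           # then the chunk plays out
    return stalled

# Sweeping prebuffer over real traces shows ~3s of buffering delay can be
# saved with little added stalling.
print(stalling_time([0.0, 3.1, 6.4, 9.2], prebuffer=1.0))
```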
Recent Updates of Periscope
• Allow more viewers to comment • Reduce HLS delay so HLS viewers can comment
• Major changes in Periscope • Smaller HLS chunks: 3s → 2s • Smaller polling interval: 3s → 1s • Smaller buffer: 9s → 6s
• The CDN design and protocol design remain the same
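Plugging the old and new parameters into the simplified additive model sketched earlier gives a rough estimate of the saving; the additivity assumption is ours, and only the old/new parameter values come from the slides:

```python
# Back-of-envelope estimate of the delay saved by Periscope's parameter
# changes, under our simplified additive model (parameter-driven part only).
def hls_delay(chunk: float, poll: float, buffer: float) -> float:
    return chunk + poll / 2 + buffer  # chunking + expected polling + buffering

old = hls_delay(chunk=3, poll=3, buffer=9)    # 13.5s
new = hls_delay(chunk=2, poll=1, buffer=6)    # 8.5s
print(f"estimated saving: {old - new:.1f}s")  # ~5.0s
```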
Periscope Dataset Available
• 75 continuous days of Periscope public broadcasts • 13.9M broadcasts • 416M comments • 6.1B hearts
• In each broadcast • Metadata: title, time, broadcaster, location, etc. • Interactions: comments, hearts, joins, invites, etc.
Dataset available at http://sandlab.cs.ucsb.edu/periscope (http://bit.ly/periscope-data)
Thank you! Questions?