Analytic Evaluation of Quality of Service for On-Demand Data Delivery Hongfei Guo (guo@cs.wisc.edu) Haonan Tan (haonan@cs.wisc.edu)
Outline
• Background
• Two Multicast Protocols
• Customized MVA Analysis
• Validation
• Model Improvement (Interpolation)
• Evaluation of Different Multicast Protocols
• Conclusion & Future Work
Background
• Eager et al. derived the minimum server bandwidth requirements. But what about Quality of Service?
  – Balking probability
  – Waiting time
• Given:
  – server bandwidth
  – multicast protocol
Two Multicast Protocols
• Grace Patching
  – Shared multicast stream (current data)
  – Unicast “patch” stream (missed data)
  – Average required server bandwidth (see the formula below)
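The bandwidth formula on this slide was an image and is not recoverable. As a stand-in, the result usually quoted from Eager et al. for patching with an optimized threshold is given below, where N_i = λ_i T_i is the mean number of requests arriving during one playback of file i; treat it as a reference point rather than the slide's exact expression.

$$B_i^{\text{patch}} \;\approx\; \sqrt{2 N_i + 1} \;-\; 1, \qquad N_i = \lambda_i T_i$$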
Two Multicast Protocols (cont’d)
• Hierarchical Multicast Stream Merging
  – Each data transmission stream is multicast
  – Clients accumulate data faster than the file play rate
  – Clients are merged into larger and larger groups
  – Once merged, clients listen to the same streams
  – Average required server bandwidth (see the formula below)
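The formula image is missing here as well. The point of comparison usually cited from Eager et al. is that the minimum server bandwidth needed to serve file i with no client waiting grows only logarithmically in N_i, and hierarchical merging stays close to that bound; a commonly quoted form is shown below as a stand-in, not necessarily the slide's exact expression.

$$B_i^{\text{min}} \;=\; \ln(N_i + 1), \qquad B_i^{\text{HSM}} \;=\; \Theta(\ln N_i)$$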
CMVA Analysis
• Customer Balking Model
  – Fixed number of streams in the server
  – An arriving customer leaves if no stream is available
• Customer Waiting Model
  – Fixed number of streams in the server
  – An arriving customer waits until it is served
  – Customers with the same request coalesce in the waiting queue
Input Parameters
• C: server capacity
• λ: external customer arrival rate
• M: number of file categories
For i = 1, 2, …, M (a sketch of how these map to per-file request probabilities follows below):
• Ki: total number of distinct files in category i
• Ti: mean duration of an entire file in category i
• θi: Zipfian parameter of category i
• Pi: probability of accessing category i files
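A minimal sketch (not the authors' code) of how these inputs map to per-file request probabilities, assuming the usual Zipf form p_k ∝ k^(−θi) within each category; the category layout and the numbers in the example are hypothetical.

```python
# Illustrative only: split each category's probability mass P_i over its
# K_i files according to a Zipf(theta_i) popularity curve.

def per_file_probabilities(categories):
    """categories: list of dicts with keys K (#files), theta (Zipf), P (category prob)."""
    probs = []
    for cat in categories:
        weights = [k ** -cat["theta"] for k in range(1, cat["K"] + 1)]
        norm = sum(weights)
        probs.extend(cat["P"] * w / norm for w in weights)
    return probs

if __name__ == "__main__":
    cats = [{"K": 100, "theta": 0.8, "P": 0.7},   # hypothetical "hot" category
            {"K": 400, "theta": 0.8, "P": 0.3}]   # hypothetical "cold" category
    p = per_file_probabilities(cats)
    print(round(sum(p), 6))   # ≈ 1.0; p[i] is the fraction of requests for file i+1
```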
Output Parameters (Balking)
• S1: average service time at center 1
• R0: mean residence time at center 0
• X: system throughput
For i = 1, 2, …, #files on the server:
• pi: fraction of customer requests for file i
• Ci’: average bandwidth for file i
• S1i: mean service time of file i streams at center 1
• S0: mean service time at center 0
• Q0: mean queue length at center 0
• Xi: throughput of streams serving file i
• PB: mean balking probability of an incoming customer
Output Parameters (Waiting)
• W: mean waiting time for a (non-coalesced) request
• U: system utilization
• S: overall mean stream duration estimate
For i = 1, 2, …, #files on the server:
• pi: fraction of customer requests for file i
• Si: mean stream duration for file i
• Qi: mean number of waiting (non-coalesced) requests for file i
• Xi: mean throughput of (non-coalesced) requests for file i
• Ri: mean residence time of a (non-coalesced) request for file i
• Ci’: average number of active streams for file i
• Ri’: mean residence time adjusted for coalescing
• Wi’: mean waiting time adjusted for coalescing
(1) Customer Balking Model
[Diagram: C streams circulate between Center 0 and Center 1; X denotes the system throughput]
• Center 0 – SSFR center
  – Represents the waiting state of a stream
• Center 1 – Delay center
  – Represents the active state of a stream
CMVA Equations
[Equations not recoverable from the slide image; the surviving labels are “(Protocol result)” and “(interarrival time)”]
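The customized equations are missing, but the model structure above (one queueing center, one delay center, C circulating streams) is exactly what standard exact MVA handles. The sketch below shows only that generic MVA skeleton with hypothetical numbers, to make the iteration concrete; the customization on the slide substitutes protocol-specific service demands and then derives PB from quantities such as X, Q0 and R0.

```python
# Generic exact-MVA skeleton for a closed two-center network (a minimal
# sketch, not the customized equations from the slide).  Center 1 is a
# delay center (a stream actively transmitting, mean time S1); center 0
# is a queueing center (a stream waiting for its next request, mean
# time S0).  C streams circulate between the two centers.

def mva_two_centers(C, S0, S1):
    Q0 = 0.0                      # mean center-0 queue length with n-1 streams
    for n in range(1, C + 1):
        R0 = S0 * (1.0 + Q0)      # arrival theorem at the queueing center
        R1 = S1                   # delay center: residence time = service time
        X = n / (R0 + R1)         # throughput with n circulating streams
        Q0 = X * R0               # Little's law at center 0
    return X, Q0, R0

if __name__ == "__main__":
    # Hypothetical inputs: 100 streams, 1 s mean request interarrival time,
    # 3600 s mean stream duration.
    X, Q0, R0 = mva_two_centers(100, 1.0, 3600.0)
    print(X, Q0, R0)
```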
(2) Customer Waiting Model
[Diagram: Center 0 is a multi-channel server with C streams; X denotes the throughput]
• Center 0 – multi-channel server with C streams
• Two kinds of measurements (from two perspectives):
  – The server sees only non-coalesced customer requests
  – Customers count both coalesced and non-coalesced requests (see the sketch below)
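To make the coalescing rule concrete, here is a minimal, illustrative sketch (class and method names are hypothetical, not the authors' code): customers requesting a file that already has a pending request are attached to it, so the server only ever sees one waiting (non-coalesced) request per file.

```python
from collections import defaultdict

class CoalescingQueue:
    """Waiting queue in which requests for the same file coalesce."""

    def __init__(self):
        self.pending = defaultdict(list)    # file id -> customers waiting on it

    def arrive(self, file_id, customer):
        is_new_request = file_id not in self.pending
        self.pending[file_id].append(customer)
        return is_new_request               # True: a new (non-coalesced) request

    def start_service(self, file_id):
        # All coalesced customers for this file are served by the same stream.
        return self.pending.pop(file_id, [])
```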
CMVA Equations
• Measurements for the server
CMVA Equations (cont’d)
• Measurements for the customers
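The equation images for both measurement sets are missing. For orientation only (these are the standard identities such equations build on, not the customized forms from the slides): server utilization follows from the per-file throughputs and stream durations, and the per-file waiting quantities are tied together by Little's law.

$$U \;=\; \frac{1}{C}\sum_i X_i S_i, \qquad Q_i \;=\; X_i\,(R_i - S_i) \quad\text{(Little's law on the waiting line)}$$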
Validation (1)
Validation (2)
Validation (3)
Comparison of Patching Results: Average Stream Duration
– Big error here!
Interpolation of Stream Duration
• g(Ni) – threshold for patching
• Exact in the two extreme cases Wi → ∞ and Wi → 0
• Exact for other cases ??? (see the note below)
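The interpolation formula itself was an image and is not recoverable. One plausible shape, shown purely for illustration (the weight function α is hypothetical), blends the two exact extremes according to the estimated waiting time:

$$\hat S_i \;=\; \alpha(W_i)\, S_i^{(W_i \to 0)} \;+\; \bigl(1 - \alpha(W_i)\bigr)\, S_i^{(W_i \to \infty)}, \qquad \alpha(W_i) \in [0, 1]$$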
Evaluation of Two Protocols (1)
Evaluation of Two Protocols (2)
Evaluation of Two Protocols (3)
Evaluation of Two Protocols (4)
Conclusion
• Balking model – large relative error when utilization is low.
• Waiting model – good for HSMS, but underestimates Patching when utilization is high.
• Interpolation helps!
• C* is a good trade-off between QoS and server utilization.
• HSMS is always better than Patching.
Future Work
• Further investigate the discrepancy between model results and simulation results
• Use the models to evaluate the QoS of stream servers with multiple categories
Comparison of Patching Results (1): Coalesce Fraction