Measuring and Evaluating Large-Scale CDNs (Huang et al.): An Offensive Analysis
Daniel Burgener, John Otto
F'09 - EECS 440 Adv Net - 7 Oct 2009
Argument
• Comparing lemons and limes
• Fundamental limitations of evaluation with DNS
• Ignored important metrics
• Flawed definitions of metrics: availability, delay
• Does not provide a general solution
Comparing lemons and limes
• Akamai and Limelight differ greatly in scale
• The paper doesn't effectively evaluate the differences in their architectures and philosophies
• The paper was withdrawn after criticism from Akamai and Limelight
• The paper makes false assumptions about akamaiedge
Fundamental limitation: measuring with DNS
• Doesn't observe actual user performance
• Based on the locality assumption: the client is near its LDNS
• Doesn't capture traffic volume per DNS server
• Difficult to determine average performance (acknowledged in the last paragraph of the paper's conclusion)
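The traffic-volume point can be made concrete with a toy calculation (all resolver names and numbers below are hypothetical): if latency samples are averaged per LDNS without weighting by the number of users behind each resolver, the result can differ sharply from the average users actually experience.

```python
# Hypothetical samples: latency (ms) measured via each LDNS, plus the
# number of users behind it -- which a DNS-only study cannot observe.
samples = [
    {"ldns": "resolver-a", "latency_ms": 20, "users": 10000},
    {"ldns": "resolver-b", "latency_ms": 80, "users": 100},
]

# What a DNS-only methodology can compute: unweighted mean over resolvers.
unweighted = sum(s["latency_ms"] for s in samples) / len(samples)

# What users experience on average: a traffic-weighted mean.
total_users = sum(s["users"] for s in samples)
weighted = sum(s["latency_ms"] * s["users"] for s in samples) / total_users

print(unweighted)  # 50.0
print(weighted)    # ~20.6 -- dominated by the large resolver population
```

The gap between the two means is exactly the "average performance" ambiguity the authors concede in their conclusion.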
Ignored important metrics
• Throughput
• Load-balancing algorithms / server load
• CDN interaction with ISPs: peering arrangements
• The paper couldn't include these because it used DNS for evaluation
• However, these aspects may be just as important as latency and uptime
Defining Metrics: Availability
• Sec. 5 defines availability in terms of a particular server being online
• Instead, it should be defined from the user's perspective: does the CDN return ANY live node?
• Why this is important: we expect the load-balancing algorithm to be tightly tied to server availability
• A DNS request should return live servers, with a short DNS mapping TTL (so there should be no stale references to servers)
• Availability should be defined as: at least one of the two returned CDN nodes is available
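The two definitions can be sketched in a few lines (the up/down traces below are hypothetical): per-server availability averages each server's uptime, while the user-perspective metric asks whether at least one returned node was reachable in each time slot.

```python
# Hypothetical reachability traces for two CDN nodes over six time slots
# (True = server was reachable during that slot).
server_a = [True, True, False, False, True, True]
server_b = [False, True, True, True, False, True]

# Sec.-5-style metric: average per-server uptime.
per_server = (sum(server_a) + sum(server_b)) / (2 * len(server_a))

# User-perspective metric: fraction of slots in which at least one of the
# two returned nodes was available (assuming DNS returns only live servers).
any_live = sum(a or b for a, b in zip(server_a, server_b)) / len(server_a)

print(per_server)  # ~0.67 -- individual servers fail regularly
print(any_live)    # 1.0   -- yet the CDN never appeared down to users
```

The same failure data yields very different conclusions depending on which definition is chosen, which is why the definition matters.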
Defining Metrics: Delay
• DNS-perceived latency != user-perceived latency
• How can you evaluate delay if it's not from the user's perspective?
• Since Akamai has servers in POPs, they may be much closer to the LDNS than to the client
• Result: underestimating latency
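A toy simulation illustrates the underestimate (the coordinates and the distance-as-latency proxy are illustrative assumptions, not real measurements): the CDN maps the *resolver* to its nearest edge, so probing from the LDNS measures the short LDNS-edge path rather than the longer client-edge path.

```python
import math

def dist(p, q):
    """Euclidean distance as a crude stand-in for network latency (ms)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Hypothetical topology: two edge POPs, an LDNS co-located near one POP,
# and a client far from its resolver (violating the locality assumption).
edges = [(0, 0), (100, 0)]
ldns = (5, 0)      # resolver sits next to the first POP
client = (60, 40)  # user is nowhere near the resolver

# The CDN selects the edge nearest the resolver, not the client.
edge = min(edges, key=lambda e: dist(ldns, e))

dns_measured = dist(ldns, edge)      # what a DNS-based study observes
user_perceived = dist(client, edge)  # what the user actually experiences

print(dns_measured)    # 5.0
print(user_perceived)  # ~72.1 -- the DNS vantage badly underestimates delay
```

Whenever the client-LDNS locality assumption fails, the gap between these two numbers is invisible to a DNS-only methodology.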
Not a general solution
• Multiple stages of the analysis were special-cased for Akamai and Limelight
• It would be difficult to obtain a CNAME list for another CDN that is not as well organized as Akamai
• The authors got lucky with how easy it was to collect IPs for Akamai and Limelight
• What about CoralCDN?
Conclusion
• Comparing Akamai and Limelight isn't fair: they are too different
• The comparison might be justified if the paper had actually discussed their philosophical and architectural differences
• The evaluation should have been conducted from the users' perspective, not via DNS
• As a result, the paper couldn't include important metrics
• It used poorly defined metrics
• It doesn't provide a general solution
• The approach and methodology are flawed
• The work isn't interesting: it doesn't compare CDN approaches