Summary of Existing Cloud Benchmark Efforts and Proposed Next Steps for RG Cloud
Michael Faber michael.faber@kit.edu
Samuel Kounev kounev@kit.edu
http://descartes.ipd.kit.edu
http://research.spec.org
Agenda • Definitions and mission statement • Summary of presentations so far • Cloud application types • Existing cloud benchmarks • Challenges for cloud benchmarking • Proposed next steps
Possible topic of next meeting • Taxonomy – patterns in cloud computing
Michael Faber - Summary of Existing Cloud Benchmark Efforts and Proposed Next Steps
Cloud Computing • Essential characteristics defined by NIST • On-demand self-service • Broad network access • Resource pooling • Rapid elasticity • Measured service • Different abstraction levels • SaaS (e.g., SalesForce.com, Google Docs) • PaaS (e.g., MS Azure, Google AppEngine) • IaaS (e.g., Amazon EC2, Rackspace)
OSG Cloud and RG Cloud – Mission and Charter • Current efforts in OSG Cloud and RG Cloud • Taxonomy of the cloud space • Representative workloads • Relevant metrics • OSG Cloud • Develop ready-to-use benchmarks • Evaluation and comparison of cloud product offerings • RG Cloud • Define benchmarking scenarios at a higher abstraction level • Evaluation of early prototypes and research results • In-depth quantitative analysis • Provide a basis for building future conventional benchmarks
Benchmarks • "A benchmark is a test, or set of tests, designed to compare the performance of one computer system against the performance of others" [SPEC] • Types of benchmarks • Synthetic benchmarks • Micro-benchmarks • Program kernels • Application benchmarks
Research Benchmarks • Targeted for use in research environments • Can be used as a basis for building conventional benchmarks
Agenda • Definitions and mission statement • Summary of presentations so far • Cloud application types • Existing cloud benchmarks • Challenges for cloud benchmarking • Proposed next steps
Possible topic of next meeting • Taxonomy – patterns in cloud computing
RG Cloud Group – Last Meetings
18.05. "Live Migration Benchmark Research" (Zhejiang University)
01.06. "How A Consumer Can Measure Elasticity for Cloud Platforms" (NICTA)
15.06. "Benchmarking Cloud Services" (SAP)
29.06./27.07. "Towards a Benchmark for the Cloud" (UC Berkeley)
10.08. "CloudCmp: Comparing Public Cloud Providers" (IBM)
07.09. "Virtualization on Network Performance of EC2" (Rice University)
21.09. OSG Cloud status (AMD)
Overview of Work Areas • Overview of existing cloud benchmarking efforts • Cloud taxonomy – different patterns of cloud computing
Benchmark scenarios / Application types • Workload driver • Metrics • Controller • System Under Test (SUT)
Status legend: not yet discussed / initial discussion / detailed discussion
Cloud Benchmark Metrics • Considered metrics so far • Elasticity (provisioning interval) • Durability • Response time • Throughput • Reliability • Power • Price • Performance/Price • Degree of interest highly depends on the target group • IaaS provider (power / resource utilization, …) • End-user (response time, price, …) • …
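As an illustration of how the composite metrics listed above might be computed from raw measurements, here is a minimal Python sketch. The function names, formulas, and all numbers are hypothetical examples for discussion, not metrics the group has agreed on.

```python
# Hedged sketch: combining raw measurements into two of the candidate
# metrics above. All names and numbers are illustrative assumptions.

def performance_per_price(throughput_ops_s: float, price_per_hour: float) -> float:
    """Throughput achieved per unit of cost (ops/s per $/h)."""
    return throughput_ops_s / price_per_hour

def provisioning_interval(request_time_s: float, ready_time_s: float) -> float:
    """Elasticity proxy: seconds between requesting a resource
    and that resource becoming usable."""
    return ready_time_s - request_time_s

# Example: two hypothetical IaaS offerings with measured throughput and list price
offer_a = performance_per_price(throughput_ops_s=12000, price_per_hour=0.48)
offer_b = performance_per_price(throughput_ops_s=9000, price_per_hour=0.24)
print(offer_a < offer_b)  # B delivers more throughput per dollar here
```

Note how the "degree of interest" point above shows up directly: an end user would weight performance/price, while a provider would instead fold power or utilization measurements into a similar ratio.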
Agenda • Definitions and mission statement • Summary of presentations so far • Cloud application types • Existing cloud benchmarks • Challenges for cloud benchmarking • Proposed next steps
Possible topic of next meeting • Taxonomy – patterns in cloud computing
Cloud Application Types – Results of Literature Research • Data-Intensive / Planned Batch Jobs • Business intelligence, data warehousing, data analytics, HPC, etc. • Processing Pipelines • Search engines (e.g., crawler), etc. • Dynamic Web Sites • E-commerce, social networks, marketing sites, education / e-learning, etc. • Business Processing / OLTP / Mission-Critical Applications • CRM, ERP, enterprise portals, BPM, etc. • Latency-Sensitive • Conferencing tools, streaming, online gaming, media broadcasting, etc. • Application Extensions / Backends for Mobile Communication • Mobile interactive applications, extension of compute-intensive desktop applications, etc. • Bandwidth- and Storage-Intensive • Data storage (Dropbox, iCloud), video and photo sharing (YouTube, Flickr), etc. • Mail Applications • Others • Application platforms & development, testing, etc.
Cloud Application Types – Results of Discussions in OSG Cloud • Data Analytics 4.43 • Expert search • Clustering • Customer segmentation • Data Warehousing 4.00 • Pipelines • Iterative processing • Business OLTP 3.56 • Mail 3.43 • Memory Cloud 3.00 • Key-value pair databases • Social Networking 3.00 • Web 2.0 based application • Write/read workload • Memory cloud • Search engine
Rating scale: 1 (doesn't matter) – 7 (most important). Source: OSG Cloud
Cloud Application Types
*IBM rank 3: "Business continuity / disaster recovery"
Agenda • Definitions and mission statement • Summary of presentations so far • Cloud application types • Existing cloud benchmarks • Challenges for cloud benchmarking • Proposed next steps
Possible topic of next meeting • Taxonomy – patterns in cloud computing
Benchmarks Targeted at Cloud Computing • Yahoo YCSB • CloudBench • CloudStone • CloudCmp • Cloudsleuth • Global Provider View • Cloud Performance Analyzer
Yahoo YCSB • Focus on data storage and management
CloudBench • Transaction processing (OLTP) • Simple HTTP request/response pattern
Cloudstone • Typical Web 2.0 application in a cloud computing environment
CloudCmp • Case studies to evaluate the benchmark results (e.g., TPC-W)
Global Provider View • Metrics for IaaS and PaaS providers • Different regional views (averages from close backbones)
Cloud Performance Analyzer • Influence of content delivery networks (CDN) • One deployed target application on Amazon EC2 (east coast site)
Lessons Learned • Existing cloud benchmarks differ in all essential points • Goals of the benchmarks • Target cloud systems • Metrics, scenarios, workloads • Requirements for research benchmarks • Flexible for adapting to different cloud solutions and patterns • Covering a wide range of relevant application types • Support for different workload types (e.g., linear, exponential increase, peaks) • Flexible in customizing the workload based on the goals of the evaluation • …
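The workload-type requirement above (linear ramps, exponential increases, peaks) can be made concrete with a small Python sketch of how a workload driver might emit a target request rate per time step. The shapes and all parameters are illustrative assumptions, not taken from any of the benchmarks discussed.

```python
# Hedged sketch of the workload types mentioned above. A driver could
# follow one of these profiles when generating load; the parameter
# values are hypothetical defaults chosen for illustration only.
import math

def linear(t, base=10.0, slope=2.0):
    # Steady linear ramp from a base rate.
    return base + slope * t

def exponential(t, base=10.0, rate=0.1):
    # Exponentially increasing load, e.g. viral growth.
    return base * math.exp(rate * t)

def peak(t, base=10.0, spike=100.0, at=30, width=5):
    # Flat load with a short spike around t = `at`.
    return base + (spike if abs(t - at) <= width else 0.0)

def profile(shape, duration_s):
    """Target request rates (requests/s) for each second of the run."""
    return [shape(t) for t in range(duration_s)]

rates = profile(peak, 60)
print(max(rates), min(rates))  # 110.0 during the spike, 10.0 elsewhere
```

A benchmark that exposes the profile as a pluggable function in this way would satisfy the last requirement above: the workload can be customized to the goals of the evaluation without changing the driver itself.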
Agenda • Definitions and mission statement • Summary of presentations so far • Cloud application types • Existing cloud benchmarks • Challenges for cloud benchmarking • Proposed next steps
Possible topic of next meeting • Taxonomy – patterns in cloud computing
Identified Challenges • Comparability of offerings • Different abstraction levels (e.g., IaaS vs. PaaS) • Offerings differ highly (e.g., auto-scaling) • Reproducibility of results • Dynamic assignment of resources • Contention by other customers • SUT • SUT is varying (e.g., during the benchmark through scaling) • Carved-out portion vs. as-is portion • Metrics • How to define, measure, and quantify elasticity? • How to consider costs in the metrics? • Workloads • Cloud computing allows hosting of a variety of applications • Representative workloads for the most promising application types • Network impact • Where to deploy the workload driver? • How to distinguish between network latency and cloud performance?
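To make the elasticity question above more tangible, here is one possible way to quantify it in Python: compare demanded against allocated resources over time and penalize both under- and over-provisioning. This is a hypothetical metric sketched for discussion, not one the group has defined.

```python
# Hedged sketch of one candidate elasticity metric: average under- and
# over-provisioned capacity per time step. Illustrative only.

def elasticity_penalties(demand, allocated):
    """Return (avg under-provisioning, avg over-provisioning) per step."""
    under = sum(max(d - a, 0) for d, a in zip(demand, allocated))
    over = sum(max(a - d, 0) for d, a in zip(demand, allocated))
    n = len(demand)
    return under / n, over / n

# Demand doubles at t=2; this hypothetical platform reacts one step late
# and also releases the extra capacity one step late.
demand    = [2, 2, 4, 4, 4, 2]
allocated = [2, 2, 2, 4, 4, 4]
under, over = elasticity_penalties(demand, allocated)
print(under, over)  # both non-zero: one step of lag in each direction
```

A perfectly elastic platform would score (0, 0); the under-provisioning term captures violated demand (lost performance), while the over-provisioning term captures wasted cost, which also connects to the "costs in the metrics" question above.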
Agenda • Definitions and mission statement • Summary of presentations so far • Cloud application types • Existing cloud benchmarks • Challenges for cloud benchmarking • Proposed next steps
Possible topic of next meeting • Taxonomy – patterns in cloud computing
Proposed Next Steps
1. Taxonomy for the cloud space • Better understanding of different offerings and cloud patterns • Consistent terminology as a basis for further discussions
2. Systematic classification of different cloud benchmarks • What is needed at which layer and by whom? • How can this be measured?
3. Appropriate scenarios for research benchmarks • Promising application types • Simple and easy to understand • Accommodate more than one cloud pattern
• Continue close collaborations with OSG Cloud