
“Simulating a $2M Commercial Server on a $2K PC”

Alaa Alameldeen, Milo Martin, Carl Mauer, Kevin Moore, Min Xu, Daniel Sorin, Mark Hill and David Wood. Computer Sciences Department, University of Wisconsin-Madison & Department of Electrical and Computer Engineering, Duke University.


Presentation Transcript


  1. “Simulating a $2M Commercial Server on a $2K PC” Alaa Alameldeen, Milo Martin, Carl Mauer, Kevin Moore, Min Xu, Daniel Sorin, Mark Hill and David Wood • Computer Sciences Department, University of Wisconsin-Madison & Department of Electrical and Computer Engineering, Duke University • Presented by Deepak Srinivasan

  2. Motivation • DBMSes and web servers are commercial workloads, and they are growing in popularity • Many services rely on them, so it is important to validate server designs against them • Validation options: execution-driven simulators, hardware prototypes, analytic models • Multi-million-dollar multiprocessor servers carry too high a price tag for research groups that want to study these workloads

  3. Details • The idea is to run the simulations on a $2,000 PC instead • 1st Goal: A good approximation of the workloads • Scaled and tuned the benchmarks, producing the Wisconsin Commercial Workload Suite • Online Transaction Processing (OLTP), Java Middleware, Static Web Content Server, Dynamic Web Content Server • Several improvements to the OLTP setup improved its performance by 12x • This lets a smaller PC reach throughput similar to that of the servers

  4. More Details • 2nd Goal: Reasonable simulation times • Cutting down warm-up times – Simics and checkpoints • Measurement interval and how to determine performance – cycles per transaction • Short-simulation variability – average multiple trials with randomly perturbed memory latencies (see the sketch below)
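The variability-handling step on this slide can be illustrated with a small sketch. This is not the paper's actual tooling; the `run_trial` helper, the `simulate` callback, and the latency numbers are hypothetical stand-ins, assuming a simulator that reports total cycles and completed transactions for a measurement interval starting from a warmed-up checkpoint.

```python
import random
import statistics

def run_trial(simulate, base_latency_ns=80.0, jitter_ns=5.0, seed=0):
    """One short simulation run with a randomly perturbed memory latency.

    `simulate` is assumed to return (total_cycles, transactions_completed)
    for the measurement interval that follows checkpoint-based warm-up.
    """
    rng = random.Random(seed)
    latency = base_latency_ns + rng.uniform(-jitter_ns, jitter_ns)
    cycles, transactions = simulate(memory_latency_ns=latency)
    return cycles / transactions  # performance metric: cycles per transaction

def average_cycles_per_transaction(simulate, trials=10):
    """Average several randomized trials to smooth out short-simulation variability."""
    samples = [run_trial(simulate, seed=s) for s in range(trials)]
    return statistics.mean(samples), statistics.stdev(samples)
```

Reporting the mean (and spread) of cycles per transaction across perturbed trials, rather than a single run, is what makes short measurement intervals trustworthy despite run-to-run nondeterminism.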

  5. Even More Details • 3rd Goal: Enough timing detail • Multiprocessor systems are hard to simulate – every piece of the architecture must be modeled • Extended Simics with two timing simulators • Memory simulator – time-consuming components are simulated exactly, others are approximated • Detailed processor timing – timing-first simulation, implemented by pairing a timing simulator with a functional simulator (see the sketch below)
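The timing-first organization can be sketched as a simple retire-and-verify loop. The class and the `step()`, `state()`, and `load_state()` methods below are illustrative assumptions, not Simics's real API: the detailed timing model runs ahead, the trusted functional simulator executes the same instruction, and any divergence is repaired by reloading architectural state from the functional side.

```python
class TimingFirstHarness:
    """Minimal sketch of timing-first simulation: a detailed timing model
    checked against a trusted functional simulator (the role Simics plays
    in the paper). Both are assumed to expose step(), state(), load_state().
    """

    def __init__(self, timing_sim, functional_sim):
        self.timing = timing_sim
        self.functional = functional_sim
        self.divergences = 0

    def step_one_instruction(self):
        # 1. The timing simulator executes and retires one instruction,
        #    modeling pipeline and memory-system delays.
        self.timing.step()
        # 2. The functional simulator executes the same instruction and
        #    produces the architecturally correct result.
        self.functional.step()
        # 3. Verify: on a mismatch in architectural state, count the
        #    divergence and reload the timing model from the functional side.
        if self.timing.state() != self.functional.state():
            self.divergences += 1
            self.timing.load_state(self.functional.state())
```

The appeal of this split is that the timing model may be approximate or even buggy without corrupting program results: shortcuts show up as counted divergences rather than wrong answers.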

  6. Evaluation – Case Study Bash

  7. Conclusions • Adapting simulation of commercial-server workloads to run on small PCs is hard but possible • The authors ran into several difficulties, including: • Scaling the workloads while keeping their behavior representative • Measuring performance for a multiprocessor system • Keeping appropriate timing detail • The need to focus on the most relevant workloads

  8. Questions • Is this evaluation technique credible without a comparison between the full benchmarks running on real hardware and these condensed single-PC simulations? • Is a set of only 4 benchmarks in the Wisconsin Commercial Workload Suite enough? • Could cloud computing and cheap clusters be used as an alternative to this small-PC simulation approach? Which approach do you think research groups would prefer?
