Microarchitecture Design Space Exploration
Lecture 4
John Cavazos
Dept. of Computer & Information Sciences, University of Delaware
www.cis.udel.edu/~cavazos/cisc879
Recent ARM Processor
• Increasingly large number of interesting design points
Architecture Simulation
• Cycle-accurate simulation
  • Accurately captures trends in the design space
  • Estimates various metrics (e.g., power, performance)
• Challenges with simulation
  • Accurate simulation is very slow
  • The number of simulations grows very quickly with the number of parameters considered (e.g., cache size, issue width)
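To see why the space blows up, here is a minimal sketch (with made-up parameter names and values, not taken from the lecture) that counts the configurations implied by just a handful of parameters:

```python
# Hypothetical design parameters and the values considered for each.
from itertools import product

design_space = {
    "issue_width": [2, 4, 8],
    "l1_cache_kb": [8, 16, 32, 64],
    "l2_cache_kb": [256, 512, 1024, 2048],
    "rob_entries": [32, 64, 128],
    "branch_pred": ["bimodal", "gshare", "tournament"],
}

# Every combination of parameter values is one configuration to simulate.
configs = list(product(*design_space.values()))
print(len(configs))  # 3 * 4 * 4 * 3 * 3 = 432 cycle-accurate simulations for only 5 parameters
```

Adding one more parameter multiplies the count again, which is why exhaustive simulation quickly becomes infeasible.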
Why Use Predictive Modeling?
• Exploring architectural design spaces is hard
  • Accurate simulation is very slow
  • The number of simulations grows very quickly with the number of parameters considered (e.g., cache size, issue width)
• With predictive modeling
  • A small number of simulations trains a model; the rest of the space is predicted
  • Even fewer simulations are needed with cross-program prediction!
Speeding Up Simulation
• Reduce input sizes
  • Smaller inputs reduce the cost of each simulation
• Reduce instructions simulated
  • Sample instructions ("hot code")
  • Use sampled traces from program phases
• Reduce simulated configurations
  • Sample a small number of points from the design space
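A minimal sketch of the last point (reducing simulated configurations): draw a small random sample of the design space and hand only that sample to the simulator. The parameter values and the sampling rate are illustrative assumptions.

```python
import random
from itertools import product

# Hypothetical design space, as in the earlier sketch.
design_space = {
    "issue_width": [2, 4, 8],
    "l1_cache_kb": [8, 16, 32, 64],
    "l2_cache_kb": [256, 512, 1024, 2048],
}
configs = list(product(*design_space.values()))

random.seed(0)
sample_size = max(1, len(configs) // 20)             # simulate roughly 5% of the space
training_configs = random.sample(configs, sample_size)

# Only training_configs go to the slow cycle-accurate simulator;
# the remaining configurations will be predicted by a model.
```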
Predictive Modeling
• Effectively uses a sparsely sampled simulated design space
• Uses the simulated parts of the space as training data
• Models predict the metric of interest (e.g., performance, energy)
Digression into Regression
Suppose you have a set of data points (x_i, y_i) and you want to see whether a linear relationship exists between x and y:

y = mx + b
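A minimal sketch of one-variable linear regression with ordinary least squares; the data points are made up for illustration.

```python
import numpy as np

# Made-up (x_i, y_i) observations that look roughly linear.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit y = m*x + b by least squares; polyfit returns the slope first.
m, b = np.polyfit(x, y, deg=1)
print(f"y ~ {m:.2f}x + {b:.2f}")
```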
Regression with 1 variable Source: http://en.wikipedia.org/wiki/Linear_regression
Linear Regression Source: http://www.stanford.edu/~bcclee/documents/lee2006-asplos-slides.pdf
Applying Predictive Models
• Inputs
  • Architecture configuration
• Outputs
  • Metric to predict
  • E.g., performance relative to a "baseline" configuration
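A minimal sketch of this input/output interface: encode each architecture configuration as a numeric feature vector, train a regression model on the simulated samples, and ask it for the metric of an unsimulated configuration. The feature encoding, the scikit-learn linear model, and all numbers are illustrative assumptions, not the exact models from the Lee & Brooks slides.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Simulated training samples: (issue_width, L1 KB, L2 KB) -> performance vs. baseline.
X_train = np.array([
    [2,  8,  256],
    [4, 16,  512],
    [4, 32, 1024],
    [8, 64, 2048],
])
y_train = np.array([1.00, 1.18, 1.31, 1.52])   # made-up relative performance

model = LinearRegression().fit(X_train, y_train)

# Predict the metric for a configuration that was never simulated.
print(model.predict(np.array([[8, 32, 1024]])))
```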
Inputs Source: http://www.stanford.edu/~bcclee/documents/lee2006-asplos-slides.pdf
Experimental Methodology Source: http://www.stanford.edu/~bcclee/documents/lee2006-asplos-slides.pdf
Model Validation Source: http://www.stanford.edu/~bcclee/documents/lee2006-asplos-slides.pdf
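A minimal sketch of validation: hold out simulated configurations that were not used for training and compare the model's predictions against the simulator. Mean absolute percentage error is a common choice of error metric; the slides' exact methodology may differ.

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error over held-out configurations."""
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

# Made-up simulated (true) and predicted values for held-out points.
simulated = [1.05, 1.22, 1.40, 1.10]
predicted = [1.01, 1.30, 1.35, 1.15]
print(f"validation error: {mape(simulated, predicted):.1f}%")
```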
Regional Sampling Source: http://www.stanford.edu/~bcclee/documents/lee2006-asplos-slides.pdf
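A minimal sketch, assuming regional sampling here means building the training set from the sampled configurations closest to the one being predicted rather than from the whole sample; the distance metric and region size k are illustrative choices.

```python
import numpy as np

def regional_train_set(X_sampled, y_sampled, query, k=30):
    """Return the k sampled configurations nearest to the query configuration.

    X_sampled: (n, d) array of encoded, simulated configurations
    y_sampled: (n,) array of their simulated metric values
    query:     (d,) encoded configuration we want to predict
    """
    distances = np.linalg.norm(X_sampled - query, axis=1)
    nearest = np.argsort(distances)[:k]
    return X_sampled[nearest], y_sampled[nearest]

# A model trained on this regional subset is then used to predict the query point.
```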
Performance Prediction Source: http://www.stanford.edu/~bcclee/documents/lee2006-asplos-slides.pdf
Power Prediction Source: http://www.stanford.edu/~bcclee/documents/lee2006-asplos-slides.pdf
Tools Available
• CORE :: Comprehensive Optimization via Regression Estimates
  • Architecture DSE data sets
  • Statistical scripts to perform the analysis
  • http://www.stanford.edu/~bcclee/software.html
Tools Available (cont'd)
• Fusion Predictive Modeling Tools
  • Tools for application performance prediction
  • Available upon request
  • http://fusion.csl.cornell.edu/tools/fpmt.html
Conclusions Source: http://www.stanford.edu/~bcclee/documents/lee2006-asplos-slides.pdf