Validation & Verification Chapter 10
VALIDATION & VERIFICATION • Very Difficult • Very Important • Conceptually distinct, but performed simultaneously • You must be sure your model is correct • Your client must be confident that your model is correct • Should be an integral part of model building
VALIDATION • Goals • Produce a model that represents system behavior closely enough to be a substitute for the system for experimentation • Analyzing & predicting performance • Increase credibility of model to managers & decision makers
DEFINITIONS • Verification • Ensures that the model developed is correctly implemented in the software • Validation • Ensures that the model accurately represents the real-world system
Validation & Verification Process • An integral part of model design & implementation process – not separate • Most methods are informal or heuristic in nature • Experience in model development, simulation programming, & the system are essential
MODEL BUILDING • Observation of real system • Collect data • Talk to people who work in the system • Construct conceptual model • Assumptions about components & structure of system • Hypotheses about values of input parameters • Implement operational model • Usually using simulation software **Not a linear process – iterative!
SUGGESTIONS FOR VERIFICATION • Operational model checked by someone other than the developer, ideally an expert in the simulation software • Flow diagram for each possible action • Examine output for reasonableness under various inputs – use wide variety of output statistics • Print input parameters at end of run to ensure they have not been changed inadvertently • DOCUMENTATION!!! • Ensure animation of model is correct
SUGGESTIONS FOR VERIFICATION • Use a debugger or interactive run controller (IRC) – advantages • Can monitor simulation progress & display results • Can focus on a single line or process • Can observe model components & variables • Can pause; reassign values • GUI – always recommended ** Basic Software Engineering Principles
Suggestion 3 • Examine output for reasonableness • Calculate expected results • Vary input • Ask users to review • Examples: With multiple servers, looking only at total throughput may hide an imbalance – too many customers routed to one server & too few to another. With a priority queue, check that entities are actually processed in priority order.
Suggestion 3 (cont’d.) • Utilize standard output from simulation environments • Current Count & Total Count are important variables • Consider predictions • Mathematical (e.g. Utilization) • Experts • Perform a Trace • Snapshots • By hand
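One concrete way to apply Suggestion 3 is to compare a statistic the model reports against a value you can predict mathematically, such as server utilization. The sketch below is a minimal, self-contained illustration (all names are hypothetical, not from any particular simulation package); it assumes a simple single-server queue with Poisson arrivals and exponential service, where the analytic utilization is ρ = λ/μ.

```python
# Minimal sketch: verify a single-server queue model by checking its
# reported utilization against the analytic M/M/1 value rho = lambda/mu.
# All names here are illustrative, not from any particular package.
import random

def simulate_mm1_utilization(arrival_rate, service_rate, num_customers, seed=42):
    """Run a hand-rolled M/M/1 queue and return observed server utilization."""
    rng = random.Random(seed)
    clock = 0.0           # arrival clock
    server_free_at = 0.0  # time the server finishes its current job
    busy_time = 0.0
    for _ in range(num_customers):
        clock += rng.expovariate(arrival_rate)        # next arrival time
        start = max(clock, server_free_at)            # service starts when server is free
        service = rng.expovariate(service_rate)
        server_free_at = start + service
        busy_time += service
    return busy_time / server_free_at                 # fraction of time server was busy

lam, mu = 0.8, 1.0
observed = simulate_mm1_utilization(lam, mu, num_customers=100_000)
expected = lam / mu                                    # analytic M/M/1 utilization
print(f"simulated rho = {observed:.3f}, analytic rho = {expected:.3f}")
# A large discrepancy here points to a coding error, not to random noise.
```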
CALIBRATION & VALIDATION • Validation – comparing model to system • Calibration – iterative process of comparing model to real system & adjusting the model – repeat! • Comparisons • Subjective – experts review • Objective – use of data & results
VALIDATION • Never truly completely validated • Model never totally represents the real system • Be sure model is not “fit” to one set of data
3-Step Approach to Validation Naylor & Finger [1967] • Build a model with high face validity • Validate model assumptions • Compare model I/O transformations to corresponding I/O transformations for the real system
Step 1. FACE VALIDITY • Construct a model that seems valid to users/experts knowledgeable about the system • Include users in calibration – builds perceived credibility • Sensitivity Analysis – change one or more input values & examine change in results – Are results consistent with real system? • Choose most critical variables to reduce cost of experimentation
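A sensitivity analysis can be scripted as a simple loop over perturbed inputs. The sketch below is only an illustration of the idea: `run_model` is a hypothetical stand-in for your own simulation entry point (here it returns the analytic M/M/1 mean time in system so the example runs on its own).

```python
# Minimal sensitivity-analysis sketch: vary one critical input (arrival rate)
# and check that the output statistic moves in the direction the real system
# would.  `run_model` is a stand-in for your own simulation entry point.
def run_model(arrival_rate, service_rate=1.0):
    # Placeholder: analytic M/M/1 mean time in system, used here only so the
    # sketch runs on its own; substitute a call to your simulation model.
    rho = arrival_rate / service_rate
    return 1.0 / (service_rate - arrival_rate) if rho < 1 else float("inf")

baseline = 0.7
for rate in (baseline * 0.9, baseline, baseline * 1.1):
    print(f"arrival rate {rate:.2f} -> mean time in system {run_model(rate):.2f}")
# Expect mean time in system to increase with arrival rate; if it does not,
# revisit the model's assumptions before trusting any other results.
```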
Step 2. Validation of Assumptions • 2 categories of assumptions • Structural assumptions • Data Assumptions • Structural Assumptions • Involves how system operates • Includes simplifications & abstractions of reality • Data Assumptions • Based on data collection & statistical analysis
Step 2. Validation of Assumptions (cont’d) • Review – Analysis of Data • Identify probability distribution • Estimate parameters of distribution • Perform goodness-of-fit test • Chi-square, Kolmogorov-Smirnov tests • Test homogeneity of data • Do two independently collected data sets come from the same parent population?
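The data-assumption checks above can be run with standard SciPy routines. The sketch below uses real SciPy functions (`expon.fit`, `kstest`, `ks_2samp`) on synthetic, illustrative data; your own collected samples would take their place.

```python
# Hedged sketch of the data-assumption checks: fit a distribution, run a
# goodness-of-fit test, and test homogeneity of two independent samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample_a = rng.exponential(scale=2.0, size=200)   # e.g. service times, week 1
sample_b = rng.exponential(scale=2.0, size=150)   # e.g. service times, week 2

# 1. Identify a candidate distribution and estimate its parameters.
loc, scale = stats.expon.fit(sample_a, floc=0)    # fix location at 0

# 2. Goodness-of-fit: Kolmogorov-Smirnov test against the fitted exponential.
#    (Estimating parameters from the same data makes this test conservative;
#    a chi-square test on binned counts is a common alternative.)
ks_stat, ks_p = stats.kstest(sample_a, "expon", args=(loc, scale))

# 3. Homogeneity: do the two independently collected samples appear to come
#    from the same parent population?  Two-sample K-S test.
h_stat, h_p = stats.ks_2samp(sample_a, sample_b)

print(f"fitted mean = {scale:.2f}, K-S goodness-of-fit p = {ks_p:.3f}")
print(f"homogeneity (two-sample K-S) p = {h_p:.3f}")
```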
Step 3. Validating I/O Transformations • Ultimate Test of a Model • Ability to predict the future behavior of the real system • Model viewed as an I/O Transformation • Can also use historical data to test prediction ability • Note: If main purpose of simulation changes, model should be revalidated
Step 3. Validating (cont’d) • Models are often used for comparing alternate system designs • Minor changes in parameters • Interarrival rate, # servers • Minor change in statistical distribution • Major change in logical structure of subsystem • Queue discipline • Major design change • Manual to automated system
Using Historical Input Data • An alternative to randomly generated data – don’t mix different data sets • File, Spreadsheet, or Database • Interarrival times {A1, A2,…,An} & service times {S1, S2,…,Sn} • Feed data into the FEL • Compare output to what happened in the real system • May be able to use technology to collect historical data
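A trace-driven run feeds the recorded {A_i} and {S_i} directly into the model instead of generating random variates, so the model's output can be compared with what the real system did under exactly the same inputs. The sketch below is a minimal single-server illustration; the data values and the observed system delay are placeholders, not real measurements.

```python
# Minimal sketch of a trace-driven run: historical interarrival times {A_i}
# and service times {S_i} drive the model directly (no random-variate
# generation).  Data values are illustrative.
historical_interarrivals = [2.1, 0.4, 3.3, 1.0, 0.7, 2.6]   # A_1 ... A_n
historical_services      = [1.5, 2.0, 0.9, 1.8, 1.2, 1.1]   # S_1 ... S_n

clock = 0.0
server_free_at = 0.0
delays = []
for a, s in zip(historical_interarrivals, historical_services):
    clock += a                          # arrival time of this customer
    start = max(clock, server_free_at)  # wait if the server is still busy
    delays.append(start - clock)        # delay in queue for this customer
    server_free_at = start + s

model_avg_delay = sum(delays) / len(delays)
observed_avg_delay = 0.9                # what the real system recorded (illustrative)
print(f"model avg delay = {model_avg_delay:.2f}, system avg delay = {observed_avg_delay:.2f}")
```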
I/O Validation – Turing Test • What is the Turing Test? • Generate 5 “fake” reports from simulation & mix with 5 real reports; ask experts if they can distinguish fake from real • If they cannot, the model passes the Turing Test!
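The pass/fail judgment can be made less subjective by scoring the experts' classifications against chance. The sketch below is one hedged way to do it with `scipy.stats.binomtest` (SciPy ≥ 1.7); the counts shown are illustrative only.

```python
# Hedged sketch of scoring a Turing test: an expert labels each report as
# "real" or "simulated"; if accuracy is not significantly better than
# chance, the simulated reports pass.
from scipy.stats import binomtest

num_reports = 10          # 5 real + 5 simulated reports shown to the expert
correct_calls = 6         # how many the expert identified correctly (illustrative)

result = binomtest(correct_calls, num_reports, p=0.5, alternative="greater")
print(f"p-value = {result.pvalue:.3f}")
# A large p-value means the expert did no better than guessing, so the
# simulated reports are indistinguishable from the real ones (test passed).
```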
Validation Techniques In order of cost-to-value ratio – Van Horn (1969, 1971) 1. Develop model with high face validity by including experts, previous research, studies, observation, experience 2. Test input data for homogeneity, randomness, goodness-of-fit 3. Turing test – use knowledgeable people
Validation Techniques (cont’d) 4. Compare model & system output using statistical tests 5. After model development, collect new data & repeat steps 2 to 4 6. Build new system or redesign old one, collect data on new system & use to validate model (not recommended) 7. Do little or no validation. Implement. (not recommended)
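For technique 4, one common choice is a two-sample t-test between replicated model output and observations of the real system. The sketch below is an illustration with placeholder numbers; `scipy.stats.ttest_ind` with `equal_var=False` gives Welch's t-test.

```python
# Hedged sketch of technique 4: compare replicated model output with
# observations of the real system using a two-sample t-test.
# The numbers below are illustrative placeholders.
from scipy import stats

model_replications  = [4.2, 3.9, 4.5, 4.1, 4.3, 4.0, 4.4, 3.8]   # e.g. avg delay per run
system_observations = [4.4, 4.0, 4.6, 4.2, 4.5]                   # e.g. avg delay per day

t_stat, p_value = stats.ttest_ind(model_replications, system_observations,
                                  equal_var=False)                # Welch's t-test
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# Failing to reject the null hypothesis (large p) is consistent with the model
# reproducing the system's mean behavior; it does not prove the model is valid.
```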
Conclusion • Do not become obsessed with validation & verification to the point of impeding progress or incurring excessive cost • Modeler should select techniques most valuable & appropriate to the particular system being modeled and the goals of the project • Validate & Verify to ensure Accuracy & Credibility