Issues in the Validation of Battle Models. Presented at 19 ISMOR. David Frankis, ‘The Barbican’, East Street, Farnham, Surrey GU9 7TB, 01252 738500, www.Advantage-Business.co.uk. August 2002.
Acknowledgements • Dstl • This work was carried out under contract to Dstl by Advantage Technical Consulting • RMCS
Today’s Presentation • Why validation • CLARION • What was done • Issues raised • Questions
Why Validation? • UK Government decision-making must pass the test of independent scrutiny • Making a logical case based on credible information is key to this • Operational Analysis (OA) claims to support this by quantifying key aspects objectively • The validity of this quantification is therefore crucial • A new version of CLARION required an update to its validation status
CLARION General • A Land-Air campaign model • Object Oriented C++ implementation • Functionality is based on the concept of missions: • Each entity (e.g. a division) has a mission • Subordinate units are tasked with missions based on the superior’s mission • Defined set of mission types • Generally Brigade level and above
CLARION Functionality • Movement and Attrition • Command • Communications • Sensing • Close combat, Arty, Recce, Helo • Some Air aspects • CBW • EW • No logistics in version examined (V3.0)
What is Validation? • The model is realistic? • The representation of internal processes is correct? • Known effects are covered? • Sufficient detail is included? • The results are plausible? • Conclusions drawn are substantiated?
Other Validation Issues • Scope of validation • Model only, or ancillary tools • Status of any comparison • Danger of mutually-supporting invalid models
Validation Activities • Prioritisation of Requirement • Selection of Comparison Method • Generation of Scenario • Comparison Activity • Analysis and Reporting
Prioritisation • CLARION has a wide scope of functions and contexts • Key stakeholders were consulted for their views • A formal method was used to prioritise • Main outcome: focus on mainstream uses, not less-used functions (Air, EW, CBW)
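The slide does not specify the formal method used. A common choice for this kind of stakeholder-driven prioritisation is a weighted-scoring matrix; the sketch below is purely illustrative (the criteria, weights, and scores are assumptions, not from the study):

```python
# Hypothetical weighted-scoring prioritisation of model functions.
# Criteria, weights, and 1-5 stakeholder scores are illustrative only.
WEIGHTS = {"frequency_of_use": 0.5, "impact_on_results": 0.3, "uncertainty": 0.2}

functions = {
    "Close combat": {"frequency_of_use": 5, "impact_on_results": 5, "uncertainty": 3},
    "Artillery":    {"frequency_of_use": 5, "impact_on_results": 4, "uncertainty": 3},
    "Air":          {"frequency_of_use": 2, "impact_on_results": 3, "uncertainty": 4},
    "EW":           {"frequency_of_use": 1, "impact_on_results": 2, "uncertainty": 5},
}

def priority(scores):
    """Weighted sum of the stakeholder scores for one function."""
    return sum(WEIGHTS[criterion] * score for criterion, score in scores.items())

ranked = sorted(functions, key=lambda f: priority(functions[f]), reverse=True)
print(ranked)  # mainstream functions rank above Air and EW
```

Under these (made-up) weights, the mainstream combat functions outrank Air and EW, matching the outcome the slide reports.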
Selection of Method • Possible comparison approaches • Historical Analysis • Trials and Exercises • Other models • Military (and analytical) Judgement • Wargame • These are not mutually exclusive • Wargame was selected as the best approach at a workshop • Dstl staff selected the most appropriate (commercial) game
Scenario Generation • Workshop held with scientific and military analysts • Fictitious scenario overlaid on a map • Outline scheme of manoeuvre developed
Comparison and Analysis • Scenario entered in CLARION • Adjusted with military input • Then into wargame and played • Further CLARION adjustment to reflect military intentions in wargame • Comparison of outputs • Some practical difficulties arising from wargame limitations
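The comparison step above can be quantified in several ways; the source does not say which metrics were used, so the following is a minimal illustrative sketch (the outcome measure, values, and 20% tolerance are all assumptions):

```python
# Compare an outcome measure between CLARION and the wargame.
# The measure, the values, and the tolerance are illustrative assumptions.
def relative_difference(model_value, wargame_value):
    """Relative discrepancy of the model against the wargame result."""
    return abs(model_value - wargame_value) / max(abs(wargame_value), 1e-9)

clarion_attrition = 0.34   # e.g. fraction of Blue force lost (made-up value)
wargame_attrition = 0.29

diff = relative_difference(clarion_attrition, wargame_attrition)
print(f"relative difference: {diff:.1%}")
within_tolerance = diff <= 0.20  # agreed-in-advance acceptance threshold
```

Agreeing the outcome measures and tolerances before the comparison is run helps keep the subsequent adjustment of CLARION inputs from degenerating into curve-fitting.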
Findings • Validation as part of study process • Data adjustment • User interface issues • Extraneous effects
Validation as Part of Study Process • Ideally, the data and the way the model is used require (re-)validation in each study • Validation is an iterative process • How much iteration is enough? • What if the iteration doesn’t converge? • In exceptional cases, could have independent teams of analysts
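The iterative process described above can be pictured as a loop with a stopping rule and an escape hatch for non-convergence. This is only a sketch; no such loop is specified in the source, and all names here are illustrative:

```python
# Sketch of the iterative (re-)validation loop described above.
# Function names and the convergence criterion are illustrative assumptions.
def iterative_validation(run_model, review_and_adjust, acceptable, max_rounds=5):
    """Repeat run -> review -> adjust until results are 'valid enough'.

    If max_rounds is exceeded the iteration has not converged, which the
    slide suggests may call for an independent team of analysts.
    """
    for round_no in range(1, max_rounds + 1):
        results = run_model()
        if acceptable(results):
            return round_no, results
        review_and_adjust(results)
    raise RuntimeError("did not converge; escalate to independent review")

# Toy demonstration: each review round halves a notional model/reality discrepancy.
state = {"error": 0.5}
rounds, final = iterative_validation(
    run_model=lambda: state["error"],
    review_and_adjust=lambda r: state.update(error=state["error"] / 2),
    acceptable=lambda r: r < 0.1,
)
print(rounds, final)  # converges after a few rounds
```

The `max_rounds` cap makes the "what if it doesn’t converge?" question explicit: a persistent discrepancy is a finding in itself, not something to be adjusted away indefinitely.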
Process Elements (wargame used to validate each step) • Selection of Scenario → Scheme of manoeuvre → CLARION input → Exercise in CLARION → Interpret outputs → Study conclusions
User Interface Issues • If the user interface is unfriendly or unintuitive, analysts will lack confidence • Longer learning curve for new analysts and scrutineers • The result is reduced confidence in results, through both uncertainty and a smaller effective validation effort
Data Adjustment • In order to capture effects not explicit in the model, analysts adjust the input data • Acceptable as long as analysts doing the adjustment are doing the reporting • Legacy effects • Unpredictable interactions when done more than once
Extraneous Effects • CLARION scenarios are acknowledged to develop much more quickly than reality • As long as all processes (movement, attrition, communication) are accelerated equally for both sides, this does not matter for many study purposes • BUT study results become easy to rubbish, because the accelerated tempo makes them seem less credible
Conclusions: General • Model unlikely to be the limiting factor on confidence in study results • The use of a good model cannot compensate for a poor process or the use of insufficiently skilled analysts • Where studies focus on scenarios, they, and their data, should be validated for that study • Consider use of a wargame tool to support the development of a scheme of manoeuvre in campaign studies
Conclusions: Process Elements • Treat input data collection and refinement as integral to the study, not a necessary evil • Iterate the review of input data, output results, and the use of adjunct tools to converge on a ‘valid enough’ solution • Ensure the military plan remains valid when conducting sensitivity excursions • For major studies, consider some parallel working • Use different experts at different stages to ensure freshness of perspective