Review of Effectiveness Measures Evaluation Task
Dan Ingold, USC
USC-CSSE Annual Research Review
March 16, 2009
EM Task Statement of Work
• Develop measures to monitor and predict systems engineering (SysE) effectiveness for DoD Major Defense Acquisition Programs
  • Define SysE effectiveness
  • Develop measurement methods for contractors; DoD program managers and PEOs; and oversight organizations
  • Cover weapons platforms, systems of systems (SoSs), and net-centric services
• Recommend a continuous process improvement approach
• Identify a DoD SysE outreach strategy
• Consider the full range of data sources
  • Journals, technical reports, organizations (INCOSE, NDIA), DoD studies
  • Partial examples cited: GAO, SEI, INCOSE, Stevens/IBM
• GFI: Excel version of the SADB
• Deliverables: report and presentation covering approach, sources, measures, examples, results, and recommendations
Target EM Task Benefits for DoD
• Identification of the best available EMs for DoD use
  • Across 3 domains, 3 review levels, and both planning and execution
  • Early warning rather than late discovery of SysE effectiveness problems
• Identification of current EM capability gaps
  • Recommendations for the most cost-effective enhancements and for research on new EM approaches
  • Ways to combine EM strengths and avoid EM weaknesses
• Foundation for continuous improvement of DoD SysE effectiveness measurement
  • Knowledge base of evolving EM cost-effectiveness
  • Improved data for evaluating SysE ROI
Measuring SysE Effectiveness (and Measuring SysE Effectiveness Measures)
• Good SysE correlates with project success
  • INCOSE defines systems engineering as "an interdisciplinary approach and means to enable the realization of successful systems"
• Good SysE is not a perfect predictor of project success
  • A project may do bad SysE but get lucky at the last minute, finding a new COTS solution that produces a great success
  • A project may do great SysE, but poor managers and developers can turn it into a disaster
• The goodness of a candidate SysE effectiveness measure (EM) is whether it can detect when a project's SysE is leading the project more toward success than toward failure
• Heuristic for evaluating a proposed SysE EM: role-play as an underbudgeted, short-tenure project manager and ask, "How little can I do and still get a positive rating on this EM?"
Candidate Measurement Methods
• NRC Pre-Milestone A & Early-Phase SysE top-20 checklist
• Army Probability of Program Success (PoPS) framework
• INCOSE/LMCO/MIT Leading Indicators
• Stevens Leading Indicators (new; using SADB root causes)
• USC Anchor Point feasibility evidence progress
• UAH teaming theories
• NDIA/SEI capability/challenge criteria
• SISAIG Early Warning Indicators / USC Macro Risk Tool
EM Evaluation
• First cut is to complete the "45 x 8" evaluation: 45 criteria rated against the 8 candidate EMs (see the coverage sketch below)
• The evaluation identifies the key criteria and EMs
• Preliminary coverage and commonality evaluation:
  • Four EMs cover more than half of the criteria
  • Each of the top 20 criteria is mentioned by at least 5 EMs
• Results will be revised after the team evaluations are done
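As an illustration of the "45 x 8" coverage and commonality counts above, the sketch below treats the evaluation as a boolean matrix of 45 criteria by 8 EMs. The matrix contents are random placeholders rather than the team's actual ratings; only the counting logic mirrors the bullets above.

```python
import random

# Minimal sketch of the "45 x 8" evaluation: rows are the 45 criteria,
# columns the 8 candidate EMs. Placeholder data only; the team's actual
# ratings are not reproduced here.
NUM_CRITERIA = 45
EMS = [
    "NRC checklist", "Army PoPS", "INCOSE/LMCO/MIT LIs", "Stevens LIs",
    "USC AP feasibility evidence", "UAH teaming", "NDIA/SEI criteria",
    "SISAIG/USC Macro Risk",
]

random.seed(0)
# coverage[c][e] is True if EM e addresses criterion c (random placeholder).
coverage = [[random.random() < 0.4 for _ in EMS] for _ in range(NUM_CRITERIA)]

# Coverage: EMs that address more than half of the criteria.
broad_ems = [
    em for e, em in enumerate(EMS)
    if sum(coverage[c][e] for c in range(NUM_CRITERIA)) > NUM_CRITERIA / 2
]

# Commonality: criteria addressed by at least 5 of the 8 EMs.
common_criteria = [c for c in range(NUM_CRITERIA) if sum(coverage[c]) >= 5]

print("Broad-coverage EMs:", broad_ems)
print("Criteria covered by at least 5 EMs:", common_criteria)
```

Substituting the team's real ratings for the placeholder matrix would reproduce the "four EMs cover more than half the criteria" and "top-20 criteria" counts directly.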
New Evaluation Framework
• Synthesized from the workshop crosswalks
• Simpler than the SEPP, SISAIG, and "45 x 8" frameworks
• Four major categories, with four to five elements per category
• Provides a taxonomy of the existing frameworks, and its coverage spans the previous frameworks (see the sketch below)
• Could be the basis for a new Macro Risk-like tool
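To show how the synthesized framework could serve as a taxonomy over the existing frameworks, here is a minimal sketch. Every name in it (the categories, elements, and framework criteria) is an invented placeholder, since the slide gives only the shape: four major categories with four to five elements each.

```python
# Hypothetical taxonomy skeleton: four categories, four to five elements
# each. All names below are invented placeholders.
framework = {
    "Category 1": ["1a", "1b", "1c", "1d"],
    "Category 2": ["2a", "2b", "2c", "2d", "2e"],
    "Category 3": ["3a", "3b", "3c", "3d"],
    "Category 4": ["4a", "4b", "4c", "4d", "4e"],
}

# Crosswalk: each existing framework's criteria mapped onto taxonomy
# elements (placeholder criterion names).
crosswalk = {
    "SEPP": {"sepp-crit-1": "1a", "sepp-crit-2": "2c"},
    "SISAIG": {"ewi-1": "1b", "ewi-2": "4e"},
}

# Coverage check: which taxonomy elements the existing frameworks touch,
# and which remain as candidate gaps.
all_elements = {e for elems in framework.values() for e in elems}
covered = {e for mapping in crosswalk.values() for e in mapping.values()}
print("Covered elements:", sorted(covered))
print("Candidate gaps:", sorted(all_elements - covered))
```

A filled-in crosswalk of this shape is what would support the claim that the new framework's coverage spans the previous ones.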
SE Competencies
• Developed by ODNI
• Comprehensive survey of core competencies:
  • 10 candidate work activities
  • 173 candidate knowledge, skills, and abilities (KSAs)
• To our knowledge, not yet validated
• Approved for limited release within SERC