
Review of Effectiveness Measures Evaluation Task



Presentation Transcript


  1. Review of Effectiveness Measures Evaluation Task. Dan Ingold, USC. USC-CSSE Annual Research Review, March 16, 2009

  2. EM Task Statement of Work
  • Develop measures to monitor and predict systems engineering effectiveness for DoD Major Defense Acquisition Programs
  • Define SysE effectiveness
  • Develop measurement methods for contractors, DoD program managers and PEOs, and oversight organizations
  • For weapons platforms, SoSs, and Net-centric services
  • Recommend a continuous process improvement approach
  • Identify a DoD SysE outreach strategy
  • Consider the full range of data sources
    • Journals, tech reports, organizations (INCOSE, NDIA), DoD studies
    • Partial examples cited: GAO, SEI, INCOSE, Stevens/IBM
  • GFI: Excel version of the SADB
  • Deliverables: report and presentation
    • Approach, sources, measures, examples, results, recommendations

  3. Target EM Task Benefits for DoD
  • Identification of the best available EMs for DoD use
    • Across 3 domains, 3 review levels, and both planning and execution
    • Early warning vs. late discovery of SysE effectiveness problems
  • Identification of current EM capability gaps
    • Recommendations for the most cost-effective enhancements and for research on new EM approaches
    • Ways to combine EM strengths and avoid weaknesses
  • Foundation for continuous improvement of DoD SysE effectiveness measurement
    • Knowledge base of evolving EM cost-effectiveness
    • Improved data for evaluating SysE ROI

  4. Measuring SysE Effectiveness, and Measuring SysE Effectiveness Measures
  • Good SysE correlates with project success
    • INCOSE definition of systems engineering: "An interdisciplinary approach and means to enable the realization of successful systems"
  • Good SysE is not a perfect predictor of project success
    • A project does poor SysE, but gets lucky at the last minute, finds a new COTS solution, and produces a great success
    • A project does great SysE, but poor managers and developers turn it into a disaster
  • Goodness of a candidate SysE effectiveness measure (EM)
    • Whether it can detect when a project's SysE is leading the project more toward success than toward failure
  • Heuristic for evaluating a proposed SysE EM
    • Role-play as an underbudgeted, short-tenure project manager
    • Ask "How little can I do and still get a positive rating on this EM?"

  5. Candidate Measurement Methods
  • NRC Pre-Milestone A & Early-Phase SysE top-20 checklist
  • Army Probability of Program Success (PoPS) Framework
  • INCOSE/LMCO/MIT Leading Indicators
  • Stevens Leading Indicators (new; using SADB root causes)
  • USC Anchor Point Feasibility Evidence progress
  • UAH teaming theories
  • NDIA/SEI capability/challenge criteria
  • SISAIG Early Warning Indicators / USC Macro Risk Tool

  6. Independent EM Evaluations and Resolution

  7. Candidate EM Coverage Matrix

  8. EM Evaluation
  • First cut is to complete the "45 x 8" evaluation
    • Evaluation identifies key criteria / EMs
  • Preliminary coverage & commonality evaluation
    • Four EMs cover more than half of the criteria
    • Top-20 criteria are mentioned by at least 5 EMs
  • Revise results after the team evaluations are done
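The coverage and commonality tallies above amount to simple counts over a criteria-by-EM matrix. Below is a minimal Python sketch of that arithmetic, assuming such a matrix as input; the criterion names and cell contents are hypothetical placeholders, not the actual 45 x 8 evaluation data.

    # Illustrative only: criterion names and coverage sets are placeholders, not the real 45 x 8 data.
    from typing import Dict, Set

    # Map each evaluation criterion to the set of candidate EMs that address it.
    matrix: Dict[str, Set[str]] = {
        "Criterion 1": {"NRC checklist", "PoPS", "INCOSE LIs", "NDIA/SEI", "Macro Risk"},
        "Criterion 2": {"Anchor Point evidence", "INCOSE LIs", "Macro Risk"},
        "Criterion 3": {"UAH teaming", "PoPS", "NDIA/SEI", "NRC checklist", "Stevens LIs"},
        # ... remaining criteria omitted
    }

    all_ems = set().union(*matrix.values())

    # EM coverage: fraction of the criteria each EM addresses.
    coverage = {em: sum(em in ems for ems in matrix.values()) / len(matrix) for em in all_ems}

    # Criterion commonality: how many EMs mention each criterion
    # (the "Most Mentioned (8/7/6/5)" slides rank criteria by this count).
    commonality = {crit: len(ems) for crit, ems in matrix.items()}

    broad_ems = [em for em, frac in coverage.items() if frac > 0.5]
    top_criteria = [crit for crit, n in commonality.items() if n >= 5]

    print("EMs covering more than half the criteria:", sorted(broad_ems))
    print("Criteria mentioned by at least 5 EMs:", sorted(top_criteria))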

  9. New SE Evaluation Forms

  10. EM Coverage

  11. EM Commonality

  12. Most Mentioned (8)

  13. Most Mentioned (7)

  14. Most Mentioned (6)

  15. Most Mentioned (5)

  16. Test of EM Evaluation Process

  17. New Evaluation Framework
  • Synthesized from workshop crosswalks
  • Simpler than SEPP / SISAIG / 45 x 8
  • Four major categories
  • Four to five elements per category
  • Provides a taxonomy of existing frameworks
  • Coverage spans previous frameworks
  • Could be the basis for a new Macro Risk-like tool
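As a sketch of how such a category/element taxonomy might back a Macro Risk-like tool, the snippet below maps categories to elements and existing frameworks to the elements they cover, then checks that the taxonomy spans them. All category, element, and framework-coverage entries here are hypothetical placeholders, not the framework's actual criteria (those are the subject of the Evaluation Framework Criteria slide).

    # Illustrative only: categories, elements, and coverage sets are placeholders.
    from typing import Dict, List, Set

    # Four major categories, each with four to five elements (names made up here).
    taxonomy: Dict[str, List[str]] = {
        "Category 1": ["1a", "1b", "1c", "1d"],
        "Category 2": ["2a", "2b", "2c", "2d", "2e"],
        "Category 3": ["3a", "3b", "3c", "3d"],
        "Category 4": ["4a", "4b", "4c", "4d", "4e"],
    }

    # Elements of the taxonomy that each existing framework touches (also illustrative).
    frameworks: Dict[str, Set[str]] = {
        "NRC checklist": {"1a", "1b", "2a", "3c"},
        "PoPS": {"1c", "2b", "2c", "4a"},
        "INCOSE LIs": {"1a", "2d", "3a", "3b", "4d"},
    }

    all_elements = {e for elems in taxonomy.values() for e in elems}

    # "Coverage spans previous frameworks": every element any framework touches
    # must map somewhere in the new taxonomy.
    touched = set().union(*frameworks.values())
    assert touched <= all_elements, f"unmapped items: {touched - all_elements}"

    # Taxonomy view of one existing framework: group its covered elements by category.
    def classify(name: str) -> Dict[str, List[str]]:
        covered = frameworks[name]
        return {cat: [e for e in elems if e in covered] for cat, elems in taxonomy.items()}

    print(classify("INCOSE LIs"))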

  18. Evaluation Framework Criteria

  19. SE Competencies
  • Developed by ODNI
  • Comprehensive survey of core competencies
  • 10 candidate work activities
  • 173 candidate knowledge, skills & abilities (KSAs)
  • To our knowledge, not yet validated
  • Approved for limited release within SERC

  20. SE Competency Sample

  21. Q & A
