1. CMMI 1 Capability Maturity Model® Integration (CMMI℠)
Presentation to Omaha SPIN
Dec 2, 1999
Bob Rassa, Raytheon
Deputy Chair, CMMI Project
2. CMMI 2 Objectives Present the background and current status of CMM Integration
Discuss structure and sample content of the new maturity models
Present timeline for public release of the models and pilot assessments
Discuss transition from the current maturity models and assessment methods
3. CMMI 3 Agenda Introduction
Background
Design Approach
Comparison to SW-CMM v1.1
Comparison to EIA IS 731 (SECM)
Assessment Methodology
Comment Process
Transition Process
Discussion
4. CMMI 4 Background Objectives
Review CMMI objectives
Review key requirements and source material
Discuss CMMI project
Compare and contrast current maturity models
CMM for Software, SECM, IPD-CMM
Staged, continuous
Introduce CMMI terminology
5. CMMI 5 What are Capability Maturity Models? Organized collections of best practices
Based on work by Crosby, Deming, Juran, Humphrey...
Systematic, ordered approach to process improvement.
Means of measuring organizational maturity.
Have proven to bring significant return on investment in productivity and quality.
To start off, a very brief CMM-101: CMMs grew out of the quality improvement work of the '70s and '80s.
First and foremost, CMMs are used for a systematic, ordered…
Systematic because there is an experience-based rationale for improvement.
Ordered in that the improvement follows well defined steps, with initial practices that provide the foundation for advanced practices and so forth.
Because of this ordering, the CMM provides a means of measuring the maturity of an organization.
The significant ROI is a key issue - DoD acquisition’s “stick and carrot” approach is not the only reason CMM-based improvement has been so widespread.
6. CMMI 6 How are CMMs used? Process Improvement
Process Definition
Competency Assessment
Risk Management
Communication
PI is the primary focus of CMMs
New organizations, or organizations that are re-engineering their processes can use CMMs to define new processes
CMMs provide guidance in developing competency areas (FAA)
CMMs can help identify strengths and weaknesses in organizations for risk management in subcontracting, outsourcing, acquisitions, and teaming arrangements.
7. CMMI 7 The Current Situation - every silver lining has a dark cloud Explosion of CMMs and CMM-like models
Multiple models within an organization
<Transition> Basically, we have been victims of the CMM's success. The idea of an ordered way of improving processes is an appealing next step after TQM or other quality-focused improvement strategies, and everyone seems to have decided that they could use a CMM for their particular discipline, so…
some <need latest number> CMMs or CMM-like models have been created over the past few years. This leads to a concern in organizations that have multiple disciplines using multiple - and perhaps incompatible - models. This requires multiple assessments using different criteria or methods, multiple training courses to address different models or approaches, and, most importantly, higher costs because of the duplication of effort.
Currently, software and systems engineering seem to be the disciplines most frequently combined, but other situations have been documented and the problem seems to be getting worse.
<transition> Now, given that all of these models are based on the same concepts…
8. CMMI 8 Another view of the situation is shown here in a representation of all the various frameworks and how they relate - or don't.
9. CMMI 9 Why is This a Problem? Similar process improvement concepts, but...
Different model representations (e.g. staged, continuous, questionnaire, hybrid)
Different terminology
Different content
Different conclusions
Different appraisal methods
It is true that the models all use similar process improvement concepts, but…
Different model representations are the biggest issue - as we will see later.
<transition> So let's review how CMM-based improvement works…
10. CMMI 10 Improvement in any discipline is a function of performing:
Implementing practices that reflect the fundamentals of a particular topic (e.g. configuration management)
Institutionalizing practices that lead to sustainment and improvement of an implementation
The Silver Lining - each model shares a common basis for process improvement
Implementing practices are what you do to produce your product or service and what you do to manage and control your organization.
Institutionalizing practices are things that mature your performance of the implementing practices; that is, they increase predictability, quality, and flexibility.
11. CMMI 11 Thus all CMMI source models contain: Implementing practices grouped by affinity
Institutionalizing practices that vary from model to model; however, all models specify levels that describe increasing capability to perform
But, although the models share these same components, the institutionalizing practices are organized and presented differently in the different models.
12. CMMI 12 Improvement Levels
And all models have a five-step improvement path with the same characteristics for each step; however, they apply the steps differently.
13. CMMI 13 The CMMI Project DoD and NDIA sponsored
Collaborative endeavor
Industry (NDIA Systems Engineering Committee)
Government (OSD plus Services)
SEI
Over 100 people involved
So, based on the problem and an understanding of the fundamental similarities of CMMs and CMM-based improvement methods, the CMMI project was born!
14. CMMI 14
15. CMMI 15 The CMMI Development Team U.S. Air Force
U.S. Navy
Federal Aviation Administration
National Security Agency
Software Engineering Institute (SEI)
ADP, Inc.
Boeing
Computer Sciences Corp.
Ericsson Canada
General Dynamics
Honeywell
Litton
Lockheed Martin
Marconi
Northrop Grumman
Pacific Bell
Raytheon
Rockwell Collins
Thomson CSF
TRW
The CMMI Development Team consists of model and domain experts from U.S. and international defense industries, commercial business firms, international telecommunications companies, and U.S. DoD and other federal agencies.
16. CMMI 16 Integrate the models, eliminate inconsistencies, reduce duplication
Reduce the cost of implementing model-based process improvement
Increase clarity and understanding
Common terminology
Consistent style
Uniform construction rules
Common components
Assure consistency with ISO 15504
Be sensitive to impact on legacy efforts
17. CMMI 17 Benefits Efficient, effective assessment and improvement across multiple process disciplines in an organization
Reduced training and assessment costs
A common, integrated vision of improvement for all elements of an organization
A means of representing new discipline-specific information in a standard, proven process improvement context
18. CMMI 18 The Challenge Extract the common or best features from the source models
Provide users the ability to produce single- or multiple-discipline models, both continuous and staged, tailored to their organizational needs.
Provide users the ability to assess and train based on these models.
19. CMMI 19 CMMI Source Models Capability Maturity Model for Software V2, draft C (SW-CMM V2C)
EIA Interim Standard 731, System Engineering Capability Model (SECM)
Integrated Product Development Capability Maturity Model, draft V0.98 (IPD-CMM)
20. CMMI 20 Source Model Terminology Although the source models contain many of the same components, the terminology and organization were different. The next slides describe the characteristics of Staged and Continuous representations and present the terminology chosen by the CMMI Project for each CMMI model output representation.
In EIA IS 731, Advanced Practices are Specific Practices at higher Capability Levels. These are not present in the SW-CMM, nor are Generic Attributes, which are used to rate the effectiveness of the process in use.
21. CMMI 21 Staged Representations Key Process Areas are grouped in the stages (levels) from 2 to 5
Each Key Process Area contains implementing practices (activities) to achieve the purpose of the process area.
For a Key Process Area at a given stage, institutionalizing practices are integral to the process area.
Since a Staged model groups KPAs in stages, an organization’s Maturity Level can be determined by measuring satisfaction of all KPAs in a given stage.
Each KPA contains both implementation practices (Activities) and institutionalization practices (practices in Commitment to Perform, Ability to Perform, Measurement and Analysis, and Verifying Implementation Common Features).
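As an aside, the staged rating logic described above can be pictured with a small, purely illustrative sketch. This is not the official CBA IPI rating method; the KPA names, the data structure, and the rule that an organization is rated at the highest stage for which it and every lower stage are fully satisfied are simplifying assumptions made only for the example.

```python
# Illustrative sketch only: a simplified reading of staged-representation rating.
# KPA names and the rating rule are examples, not the official assessment method.

def maturity_level(kpas_by_stage: dict[int, dict[str, bool]]) -> int:
    """Return the highest maturity level whose KPAs (and those of every
    lower stage) are all satisfied; Level 1 is the default floor."""
    level = 1
    for stage in sorted(kpas_by_stage):          # stages 2..5
        if all(kpas_by_stage[stage].values()):
            level = stage
        else:
            break                                # a gap blocks all higher stages
    return level

example = {
    2: {"Project Planning": True, "Configuration Management": True},
    3: {"Organizational Process Focus": False, "Risk Management": True},
}
print(maturity_level(example))  # -> 2 (a Level 3 KPA is not yet satisfied)
```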
22. CMMI 22 Staged Model
SW-CMM V2.0 draft C
23. CMMI 23 Continuous Representations A process area contains Specific Practices to achieve the purpose of the Process Area. Some of these practices may reside at higher Capability Levels (Advanced Practices)
Generic Practices are grouped to define Capability Levels
Generic practices are added to the Specific Practices of each Process Area to attain a Capability Level for the Process Area.
The order in which Process Areas are addressed can follow a recommended staging.
In a continuous representation, process areas contain the implementation practices (Specific Practices). Institutionalizing practices (Generic Practices) are then used to mature the process area, or focus area. Thus each process area can mature from Capability Level 1 to 5 and is rated independently.
24. CMMI 24 Each process area can be rated from CL 1 to CL 5. CL 1 means the implementing practices are performed, while CL 2 through 5 reflect institutionalization via Generic Practices.
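By contrast, a continuous-representation result is a capability profile, with one rating per process area. The sketch below is again illustrative only; the process area names and the simplified rule (CL 1 when the Specific Practices are performed, higher levels as the Generic Practices for each successive level are added) are assumptions for the example, not the defined rating method.

```python
# Illustrative sketch only: a simplified continuous-representation profile.
# CL 1 = implementing (Specific) practices performed; CL 2-5 add the
# institutionalizing (Generic) practices for each successive level.

def capability_level(specific_performed: bool, generic_met: set[int]) -> int:
    if not specific_performed:
        return 0                       # "not yet performed" marker for this sketch
    level = 1
    for gl in (2, 3, 4, 5):
        if gl not in generic_met:
            break
        level = gl
    return level

profile = {
    "Requirements Management": capability_level(True, {2, 3}),
    "Technical Solution":      capability_level(True, {2}),
    "Risk Management":         capability_level(True, set()),
}
print(profile)  # {'Requirements Management': 3, 'Technical Solution': 2, 'Risk Management': 1}
```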
25. CMMI 25 CMMI Model Terminology
26. CMMI 26 Assessment Methods CBA IPI Method
Rating of goals
Single digit rating
Full goal satisfaction
More strict data validation requirement
SECM Assessment Method
Rating of practices
Granularity options
Partial credit options
Less strict data validation requirement
27. CMMI 27 The CMMI Challenge Integrate three source models that have many differences
Provide consistency with ISO 15504
Maintain support from user communities
Develop framework to allow growth to other disciplines
28. CMMI 28 Design Approach Objectives
Review design goals
Discuss framework for CMMI
Describe CMMI components
Outline CMMI products
Discuss CMMI Schedule and current issues
29. CMMI 29
30. CMMI 30 The CMMI Product Line
31. CMMI 31 CMMI Product Suite
32. CMMI 32 Framework Components
Construction rules
Conceptual architecture
33. CMMI 33 The CMMI Framework
34. CMMI 34 CMMI V0.2 Staged Process Areas - Maturity Level 2
Process Management Core:
Project Planning
Project Monitoring and Control
Configuration Management
Process & Product Quality Assurance
Supplier Agreement Management
Data Management
Measurement & Analysis
Engineering (SE & SW):
Requirements Management
35. CMMI 35 CMMI V0.2 Staged Process Areas - Maturity Level 3
Process Management Core:
Organizational Process Focus
Organizational Process Definition
Organizational Training
Integrated Project Management
Risk Management
Decision Analysis & Resolution
Engineering (SE & SW):
Customer & Product Requirements
Technical Solution
Product Integration
Product Verification
Validation
36. CMMI 36 CMMI V0.2 Staged Process Areas - Maturity Levels 4 & 5
Process Management Core:
Quantitative Management of Quality and Process
Organizational Process Performance
Causal Analysis and Resolution
Organizational Process Technology Innovation
Process Innovation Deployment
37. CMMI 37 CMMI Products CMMI Models
Assessment Material
Training Material
Model Developer Material
38. CMMI 38 Assessment Material Assessment requirements
Assessment methodology
Assessment data collection methods and tools (e.g., questionnaires, interviews)
Assessment Team qualifications
39. CMMI 39 Training Material Model Training
Assessment Training
Team Training
Lead Assessor Training
40. CMMI 40 Model Developer Material Glossary
Framework and model content criteria
Framework Training
41. CMMI 41 CMMI Schedule August 31, 1999 Release CMMI-SE/SW V0.2 for public review.
Dec 15, 1999 Release CMMI-SE/SW/IPPD for public review
Nov 1999-May 2000 Pilot assessments
Jun-Aug 2000 Publish models V1.0
42. CMMI 42 CMMI-SE/SW compared to SW-CMM v1.1 Objectives
Describe Background
Discuss Model Component Comparison
Process Areas
Common Features
43. CMMI 43 Background - 1 SEI had completed updates to the SW-CMM when the CMMI project was started
SW-CMM v2 Draft C was used as the source model for CMMI
Adapted for compatibility with SE
Most of the community is currently using SW-CMM v1.1
Detailed traceability matrices are being developed
44. CMMI 44 Background - 2 CMMI- SE/SW staged representation is similar to SW-CMM v1.1
Maturity Levels composed of Process Areas
Goals are required; implemented & institutionalized
Practices are expected; alternative practices are acceptable if effective at meeting the goals
All else is informative
CMMI- SE/SW continuous representation reflects the same info in a SPICE-like structure
45. CMMI 45 SW-CMM v1.1 CMMI
46. Software Product Engineering - SW-CMM v1.1 Activities
1 Appropriate software engineering methods and tools are integrated into the project's defined software process.
2 The software requirements are developed, maintained, documented, and verified by systematically analyzing the allocated requirements according to the project's defined software process.
3 The software design is developed, maintained, documented, and verified, according to the project's defined software process, to accommodate the software requirements and to form the framework for coding.
4 The software code is developed, maintained, documented, and verified, according to the project's defined software process, to implement the software requirements and software design.
5 Software testing is performed according to the project's defined software process.
47. Software Product Engineering - SW-CMM v1.1 Activities (continued)
6 Integration testing of the software is planned and performed according to the project's defined software process.
7 System and acceptance testing of the software are planned and performed to demonstrate that the software satisfies its requirements.
8 The documentation that will be used to operate and maintain the software is developed and maintained according to the project's defined software process.
9 Data on defects identified in peer reviews and testing are collected and analyzed according to the project's defined software process.
10 Consistency is maintained across software work products, including the software plans, process descriptions, allocated requirements, software requirements, software design, code, test plans, and test procedures.
48. CMMI 48 Common Feature Comparison Differences in the Common Features include:
Process planning moved from an Activity to the Ability to Perform Common Feature, since it is generic to all Process Areas.
New Common Feature - Directing Implementation - contains a new practice, Manage Configurations, since this practice should be generic to all Process Areas. Details on this practice are found in the Configuration Management Process Area. This new Common Feature also contains the practice Monitor and Control the Process, which was an Activity in SW-CMM v1.1.
The result is that Activities Performed contains only activities, to match Specific Practices in the continuous representation.
Measurement and Analysis moved from a Common Feature to its own Process Area.
Verifying Implementation simplified to have only one management review practice.
49. CMMI 49 Conclusions Organizations using SW-CMM v1.1 should be able to smoothly transition to CMMI, accommodating the following changes:
Measurement and Analysis & Data Mgmt at L2
Risk Management & Decision Analysis and Resolution at L3
Expansion of Software Product Engineering
Configuration Management for all Process Areas
50. CMMI 50 Comparing CMMI-SE/SW to EIA IS 731-SECM Objectives
Background
Process Area Comparison
Planned IPPD Extensions
51. CMMI 51 Background EIA 731 was created as a merger of the SE-CMM and INCOSE SECM models
Used as a source model for CMMI
CMMI-SE/SW merges software ideas
Staged representation of SE available
Continuous representation with “equivalent staging”
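One way to picture the "equivalent staging" mentioned above is as a rule for deriving a maturity level from a continuous capability profile. The sketch below is purely hypothetical: the grouping of process areas by level and the threshold rule are illustrative assumptions, since the actual equivalence rules were still being worked out at the time of this presentation.

```python
# Purely hypothetical sketch of an "equivalent staging" rule: derive a
# maturity level from a continuous capability profile. The process area
# grouping and the threshold rule are illustrative only.

pa_level = {"Project Planning": 2, "Configuration Management": 2, "Risk Management": 3}

def equivalent_maturity_level(profile: dict[str, int]) -> int:
    """Hypothetical rule: maturity level N is reached when every process
    area assigned to level N or below sits at capability level N or higher."""
    level = 1
    for target in (2, 3, 4, 5):
        required = [pa for pa, lvl in pa_level.items() if lvl <= target]
        if required and all(profile.get(pa, 0) >= target for pa in required):
            level = target
        else:
            break
    return level

profile = {"Project Planning": 3, "Configuration Management": 2, "Risk Management": 3}
print(equivalent_maturity_level(profile))  # -> 2: Configuration Management is only CL 2
```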
52. CMMI 52 Comparison of Elements
53. CMMI 53 Comparison of Elements (cont’d)
54. CMMI 54 Comparison of Elements (cont’d)
55. CMMI 55 Conclusions EIA 731 users should be able to smoothly transition to the CMMI-SE/SW model
Continuous representation (+ “equivalent” staged representation)
Some lower level differences
Integrated Product and Process Development (IPPD) will be added
Based on IPD-CMM and other practices
56. CMMI 56 Summary - Inherited Features from Source Models SW CMM
Goals normative
Informative material in Vol II
731
Practices/activities mapped to Goals
Advanced practices (technical PAs only):
Added practices to goals as levels increase
Generic Attributes (optional in pilots):
Effectiveness of processes
Value of products
57. CMMI 57 Assessment Methodology Objectives
Assessment approach
Assessment Requirements for CMMI (ARC)
SCAMPI assessment method
Lead Assessor program, transition plan
58. CMMI 58 Assessment Methods CBA IPI Method
Rating of goals
Single digit rating
Full goal satisfaction
More strict data validation requirement
SECM Method
Rating of practices
Granularity options
Partial credit options
Less strict data validation requirement
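The practical difference between the two rating styles can be sketched as follows. This is only an illustration; the 0-3 practice scale, the averaging rule, and the example data are hypothetical and are not taken from either the CBA IPI or the SECM method.

```python
# Illustrative sketch only: full goal satisfaction vs. practice-level partial credit.
# The 0-3 scale and the averaging rule below are hypothetical examples.

def goal_satisfied(practices_fully_met: list[bool]) -> bool:
    """CBA IPI style: a goal counts only if every associated practice is fully met."""
    return all(practices_fully_met)

def practice_score(practice_ratings: list[int], max_rating: int = 3) -> float:
    """SECM style: rate each practice individually and keep partial credit."""
    return sum(practice_ratings) / (len(practice_ratings) * max_rating)

ratings = [3, 3, 2, 1]                                  # hypothetical 0-3 ratings
print(goal_satisfied([r == 3 for r in ratings]))        # False: one shortfall fails the goal
print(round(practice_score(ratings), 2))                # 0.75: partial progress stays visible
```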
59. CMMI 59 Assessment Requirements for CMMI (ARC) Similar to the current CMM Appraisal Framework (CAF) V1.0
Specifies the minimum requirements for full, comprehensive assessment methods, e.g., SCAMPI
Other assessment methods may be defined for situations not requiring a comprehensive assessment
initial assessment, quick-look, process improvement monitoring, etc.
60. CMMI 60 Standard CMMI Assessment Method for Process Improvement (SCAMPI) Similar to CBA IPI method
Led by authorized Lead Assessor
Tailorable to organization and model scope
Artifacts:
SCAMPI Method description document
Maturity questionnaire, work aids, templates
Current activities
Merger of SECM appraisal method features
61. CMMI 61 CMMI Lead Assessor Program Similar to existing SEI Lead Assessor and Lead Evaluator programs
Grandfather current Lead Assessors
Under consideration
Delineate by discipline, e.g., SW Lead Assessors, SE Lead Assessors?
Details of transition process for current Lead Assessors and other assessment leaders
Required training in CMMI models
62. CMMI 62 Comment Process Release CMMI-SE/SW v0.2 August 31
Available at http://www.sei.cmu.edu
Public comments due November 30
Release CMMI-SE/SW/IPPD December 1
Comments due February 28
Hold Focus Group discussions
SEI Transition
Assessors for both communities
SPINs
63. CMMI 63 Nine initial assessments (desired)
Supported by 3 Product Development Team (PDT) members,
Covering all CMMI models, staged and continuous representations
Product Development Team (PDT) member roles
CMMI Product Suite Training
Coaching and structured observation
Structured feedback from assessment participants
Assessors and Sponsors, and
Participating organization members
Pilot Assessments
Full scale assessments?
Similar to current CBA IPI
Other assessment types to follow -- anticipated method development by others than PDT, using CMMI assessment method framework
Model coverage? TBD
May limit to organizations with prior CMM experience -- to mitigate risk of subjecting novices to not-yet-fully-cooked models; and to ensure availability of authorized lead assessors
Depending on applications, may focus first/more on a subset of the six models -- not necessarily two for each of the six models
64. CMMI 64 CMMI Transition Plan Development Phase
Development of CMMI products
Verification and Validation of CMMI products
Transition Phase
Approval of a CMMI Product for public release
Evidence of sufficient use
Transition planning to help organizations use CMMI products
Sustainment Phase
Upkeep & continuous improvement of the product suite
Additional evidence of adoption and use
65. CMMI 65 Transitioning to Use of CMMI Understand how models are used:
Steps to enterprise-wide process improvement
Apply Lessons Learned in transitioning from single-discipline models
Federal Aviation Administration’s experiences with iCMM
US Air Force experiences with transitioning between models
Others…
Perform gap analysis between current processes and CMMI
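A gap analysis of this kind can be organized very simply: list the CMMI process areas in scope and record how each is covered by the organization's current processes. The sketch below only illustrates the bookkeeping; the process area names and the coverage categories are examples, not anything prescribed by CMMI.

```python
# Illustrative sketch only: bookkeeping for a CMMI gap analysis.
# Process area names and coverage categories are examples.

current_coverage = {
    "Project Planning": "covered",          # documented process, consistently followed
    "Configuration Management": "partial",  # tool in place, procedure still informal
}

cmmi_process_areas = ["Project Planning", "Configuration Management", "Risk Management"]

gaps = {pa: current_coverage.get(pa, "missing") for pa in cmmi_process_areas}
for pa, status in gaps.items():
    print(f"{pa}: {status}")
# Project Planning: covered
# Configuration Management: partial
# Risk Management: missing
```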
66. CMMI 66 Steps to Enterprise-wide Organizational Maturity
67. CMMI 67 CMMI Benefits CMMI product users can expect to:
Efficiently and effectively improve and assess multiple disciplines across their organization
Reduce costs (including training) associated with improving and assessing processes
Deploy a common, integrated vision of process improvement that can be used as a basis for enterprise-wide process improvement efforts.
68. CMMI 68 CMMI team is working to assure the CMMI Product Suite addresses needs of software and systems engineering communities of practice
Use of an integrated model to guide enterprise process improvement promises to be one of the more sustainable & profitable initiatives that any organization might pursue
The promise...
69. CMMI 69 Current Status
70. CMMI 70 CMMI Steering Group Phil Babel, USAF - Chair (retiring Dec 30; Ajmel Dulai, replacement)
Bob Rassa, Raytheon - Deputy Chair
Tom Parry, OSD
Clyde Chittister, SEI
Bill Peterson, SEI
Bob Lentz, General Dynamics
Hal Wilson, Litton-PRC
Joan Weszka, Lockheed Martin
Mike Devine, USA
Dave McConnell, USN
Linda Ibrahim, FAA
Motorola Representative (for commercial industry)
71. CMMI 71 And finally, a few statistics---
72. CMMI 72 And Food For Thought SW-CMM and EIA/IS-731 will be “sunset” 2 years after final release of CMMI (now scheduled for August 2000)
Time to start planning for CMMI!
CMMI Training Courses have begun at the SEI
1st one conducted November 1999
Scheduled quarterly through 2000
There is still room on the Development Team
CMMI will transition to Sustainment Mode after Summer 2000
SEI providing User Support as they did for SW-CMM
We are expanding the ranks of “Transition Partners”
Seven (7) used on SW-CMM, many more for CMMI
73. CMMI 73 Quote from Pilot Training attendee “There is a lot more positive than negative in this model. We don’t need to wait 2-3 years to adopt. There is an amazing amount of value. Just the added scope is great. We mapped different models against the organization’s processes. SW-CMM covers 1/3 of it. SE covers 75% and IPPD 85%. The difference in scope is amazing. We need to endorse this model. What we say is what will happen.”