Capers Jones & Associates LLC
SOFTWARE PROCESS ASSESSMENTS: HISTORY AND USAGE
Capers Jones, President and Chief Scientist Emeritus of Software Productivity Research LLC
Quality seminar: talk 3
Web: www.SPR.com   Email: Capers.Jones3@gmail.com
June 11, 2011
ORIGINS OF SOFTWARE ASSESSMENTS
• IBM began internal software assessments circa 1970
  • Watts Humphrey - IBM East Coast
  • Ron Radice - IBM East Coast
  • Mike Fagan - IBM East Coast
  • James H. Frame - IBM West Coast
  • Capers Jones - IBM West Coast
• IBM assessment experts moved to other companies circa 1975
  • ITT adopted IBM assessments circa 1978
  • James H. Frame - IBM to ITT Vice President
  • Capers Jones - IBM to ITT to SPR
  • Dr. Bill Curtis - ITT to SEI
  • Watts Humphrey - IBM to SEI
• SEI assessments derived from IBM East Coast methods
• SPR assessments derived from IBM West Coast methods
ORIGINS OF SOFTWARE ASSESSMENTS
• IBM East Coast assessments circa 1975:
  • Used for large operating systems (MVS)
  • Triggered by IBM Chairman directive for shorter schedules
  • Process oriented
  • Main focus: software organizations
  • Qualitative analysis
• IBM West Coast assessments circa 1975:
  • Used for large data base products (IMS and CICS)
  • Triggered by IBM Chairman directive for higher quality
  • Project oriented
  • Main focus: specific software projects
  • Quantitative and qualitative analysis
SIGNIFICANT IBM SOFTWARE INNOVATIONS
• Software defect severity scale              1956
• Automated change management tools           1967
• Software process assessments                1970
• Design and code inspections                 1972
• Automated cost and quality estimation       1973
• Joint Application Design (JAD)              1974
• Function point metrics                      1975
• Backfiring LOC to function points           1978
• Orthogonal defect classification            1995
COMMON FEATURES OF SEI AND SPR METHODS
• Process improvements derive from assessment results
• Large software systems are difficult
• Quality is a key factor for controlling large systems
• Process planning is a key factor for controlling large systems
• Project planning and cost estimating are key factors
• Project management is a key factor
• Change management is a key factor
• Milestone tracking is a key factor
• Measurements of quality and productivity are key factors
SOFTWARE ASSESSMENT METHOD ORIGINS
• IBM internal assessments        1970
• ITT internal assessments        1978
• SPR assessments                 1984
• SEI assessments                 1985
• ISO 9000-9004 audits            1992
• TickIT assessments              1993
• SPICE assessments               1994
SOFTWARE ASSESSMENT METHOD ORIGINS (cont.)
• Six-Sigma assessments           1995
• Gartner Group assessments       1995
• David Group assessments         1996
• CMMI assessments                2001
• Sarbanes-Oxley assessments      2004
• Security assessments            2005
• Cloud and SOA assessments       2009
SPR AND SEI - SIMILARITIES
• SPR and SEI both founded in 1984 (SPR in late 1983)
• Both SPR and SEI assessments started before 1985
• Both based on IBM's assessment methods
• Watts Humphrey and Capers Jones both came from IBM
• SPR assessment results first published in 1986
• SEI assessment results first published in 1989
• Both make process improvement recommendations
SPR AND SEI - DIFFERENCES

SPR                                    SEI
Project level                          Organization level
Multiple-choice answers                Binary answers
Measures against industry average      Measures from initial starting point
Focuses on:                            Focuses on:
 - Solutions                            - High-level view
 - Improvement                          - Current status
 - Qualitative data                     - Qualitative data
 - Quantitative data
SPR AND SEI QUESTION EXAMPLES
SEI approach uses binary questions:
  Do you have a formal software quality assurance (SQA) program?
SPR AND SEI QUESTION EXAMPLES
SPR approach uses multiple-choice questions:
  Software Quality Assurance (SQA) program: _______
  1. Formal SQA group with adequate resources (1 to 10 ratio)
  2. Formal SQA group, but understaffed (< 1 to 20 ratio)
  3. SQA role assigned to development personnel
  4. Informal or intermittent SQA tasks done by development personnel
  5. No SQA function exists for the project
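To make the contrast concrete, the sketch below is a hypothetical illustration (the dictionary layout and helper function are this document's own, not actual SPR or SEI tooling). It encodes the binary SEI answer and the five-point SPR answer for the same SQA question; the SPR form preserves far more information about how the SQA function is actually staffed.

```python
# Hypothetical illustration of the two question styles; not actual SPR or SEI tooling.

# SEI-style binary question: the answer is simply yes or no.
sei_answer = {"question": "Do you have a formal SQA program?", "answer": True}

# SPR-style multiple-choice question: the answer is a point on a 1-5 scale,
# where lower numbers are better (1 = formal, adequately staffed SQA group).
SPR_SQA_CHOICES = {
    1: "Formal SQA group with adequate resources (1 to 10 ratio)",
    2: "Formal SQA group, but understaffed (< 1 to 20 ratio)",
    3: "SQA role assigned to development personnel",
    4: "Informal or intermittent SQA tasks done by development personnel",
    5: "No SQA function exists for the project",
}

def describe_spr_answer(choice: int) -> str:
    """Return the selected SPR answer as it might appear in an assessment report."""
    return f"SQA score {choice}: {SPR_SQA_CHOICES[choice]}"

print(sei_answer["answer"])       # True -- the staffing nuance is lost
print(describe_spr_answer(2))     # a formal but understaffed SQA group
```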
USING SPR AND SEI TOGETHER
• SEI - Top-Down View
  • Organization structure
  • Engineering process
  • Measures development process maturity
  • Current status
• SPR - Bottom-Up View
  • Productivity levels
  • Quality levels
  • Industry benchmark comparisons
  • Improvement recommendations
SPR AND SEI SCALES

SPR Excellence Scale      SEI Maturity Level
1. Excellent              1. Initial
2. Good                   2. Repeatable
3. Average                3. Defined
4. Below Average          4. Managed
5. Poor                   5. Optimizing
SEI CMM SCORING SYSTEM

Definition        Frequency
1 = Initial         75.0%
2 = Repeatable      15.0%
3 = Defined          8.0%
4 = Managed          1.5%
5 = Optimizing       0.5%
SPR SCORING SYSTEM

Definition        Frequency    Military     Systems      MIS
                  (Overall)    Frequency    Frequency    Frequency
1 = Excellent       2.0%         1.0%         3.0%        1.0%
2 = Good           18.0%        13.0%        26.0%       12.0%
3 = Average        56.0%        57.0%        50.0%       65.0%
4 = Poor           20.0%        24.0%        20.0%       19.0%
5 = Very Poor       4.0%         5.0%         2.0%        3.0%
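The frequency tables above can be summarized as weighted average scores. The short sketch below is a minimal illustration, assuming only the percentages shown on the two preceding slides; note that the scales run in opposite directions (higher SEI levels are better, lower SPR scores are better), so the two averages are not directly comparable.

```python
# Minimal sketch: weighted average scores from the frequency tables above.
# The percentages are the ones shown on the slides; the helper itself is illustrative.

def mean_score(frequencies: dict) -> float:
    """Weighted average of a 1-5 scale, given percentage frequencies that sum to 100."""
    return sum(score * pct for score, pct in frequencies.items()) / 100.0

SEI_CMM_LEVELS = {1: 75.0, 2: 15.0, 3: 8.0, 4: 1.5, 5: 0.5}    # higher level = better
SPR_OVERALL = {1: 2.0, 2: 18.0, 3: 56.0, 4: 20.0, 5: 4.0}      # lower score = better

print(mean_score(SEI_CMM_LEVELS))   # 1.375 -- most organizations sit at Level 1
print(mean_score(SPR_OVERALL))      # 3.06  -- very close to the SPR "Average" midpoint
```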
DISTRIBUTION OF SPR ASSESSMENTS
[Figure: bell-shaped curve of SPR assessment results, centered on Average (3) and ranging from Excellent to Poor]
THE CMMI STRUCTURE
• Maturity Levels indicate Process Capability and contain Key Process Areas
• Key Process Areas achieve Goals and are organized by Common Features
• Common Features address Implementation or Institutionalization and contain Key Practices
• Key Practices describe Infrastructure or Activities
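The structure above can also be read as a simple containment hierarchy. The dataclass sketch below is purely illustrative (the class and field names are this document's own, not an SEI-defined model); it mirrors the relationships shown: maturity levels contain key process areas, which are organized by common features, which in turn contain key practices.

```python
# Illustrative-only sketch of the containment hierarchy shown above;
# the class and field names are this document's own, not an SEI-defined model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class KeyPractice:
    description: str                      # describes infrastructure or activities

@dataclass
class CommonFeature:
    addresses: str                        # implementation or institutionalization
    key_practices: List[KeyPractice] = field(default_factory=list)

@dataclass
class KeyProcessArea:
    name: str
    goals: List[str] = field(default_factory=list)             # KPAs achieve goals
    common_features: List[CommonFeature] = field(default_factory=list)

@dataclass
class MaturityLevel:
    number: int                           # 1 (Initial) through 5 (Optimizing)
    name: str                             # indicates process capability
    key_process_areas: List[KeyProcessArea] = field(default_factory=list)

# Example: Level 2 contains the Repeatable key process areas.
level_2 = MaturityLevel(2, "Repeatable", [KeyProcessArea("Software Project Planning")])
```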
THE KEY PROCESS AREAS BY MATURITY LEVEL
• Optimizing (5)
  • Process Change Management
  • Technology Change Management
  • Defect Prevention
• Managed (4)
  • Software Quality Management
  • Quantitative Process Management
• Defined (3)
  • Peer Reviews
  • Intergroup Coordination
  • Software Product Engineering
  • Integrated Software Management
  • Training Program
  • Organization Process Definition/Focus
• Repeatable (2)
  • Software Configuration Management
  • Software Quality Assurance
  • Software Subcontractor Management
  • Software Project Tracking and Oversight
  • Software Project Planning
  • Requirements Management
• Initial (1)
LEADING EDGE, MEDIAN, AND TRAILING EDGE RESULTS FROM THE SPR SURVEYS

                           Leading Edge    Median       Trailing Edge
                           Companies       Companies    Companies
Above Average Responses        30%            15%            5%
Average Responses              65%            70%           65%
Below Average Responses         5%            15%           30%
RATES OF PROCESS IMPROVEMENT CORRELATED TO INITIAL RANKING
[Figure: improvement curves over Year 1 through Year 5, plotted by initial ranking: 1 Excellent, 2 Good, 3 Average, 4 Mediocre, 5 Poor]
ASSESSMENTS IN SIX SOFTWARE CLASSES
• Systems software - controls physical devices
• Military software - follows DoD standards
• Information systems software - in-house MIS projects
• Contract/outsourced software - legal contract obligations
• Commercial software - many diverse customers
• End-user software - personal applications
SYSTEMS SOFTWARE ASSESSMENTS
• COMPANIES BUILDING SYSTEMS SOFTWARE INCLUDE:
  • AT&T, Boeing, Ford, HP, IBM, Microsoft, Siemens, UNISYS
• SYSTEMS SOFTWARE EXAMPLES:
  • IBM MVS operating system
  • AT&T 5ESS switching system
  • Ford Motors fuel injection controls
  • Microsoft’s Windows 98
SYSTEMS SOFTWARE ASSESSMENTS
• SYSTEMS SOFTWARE STRENGTHS (ABOVE AVERAGE)
  • Software quality control
  • Software quality estimation
  • Software quality measurement
  • Most effective use of complexity analysis
  • Most likely to use formal inspections
  • Most effective processes > 10,000 function points
  • Superior in-house training
  • Most effective use of specialists
SYSTEMS SOFTWARE ASSESSMENTS
• SYSTEMS SOFTWARE WEAKNESSES (BELOW AVERAGE)
  • Lags in use of function point metrics
  • Lags in use of automated cost estimation tools
  • Lags in productivity measurements
  • Below average productivity levels
  • Creates enormous volumes of paperwork
  • May be troubled by downsizings and layoffs
SYSTEMS SOFTWARE FACTORS

Success Factors                        Failure Factors
• Excellent quality                    • Low quality
• High reliability                     • Low reliability
• Competitive feature set              • Uncompetitive feature set
• Predictable schedules (5%)           • Schedules out of control
• Predictable costs (5%)               • Costs out of control
• Stable requirements (5%)             • Requirements unstable
• Meets industry standards             • Violates industry standards
• Upwards compatibility                • Compatibility problems
• Plug and play connectivity           • Non-standard interfaces
MILITARY SOFTWARE ASSESSMENTS
• ORGANIZATIONS BUILDING MILITARY SOFTWARE INCLUDE:
  • Air Force, Army, Grumman, IBM, Litton, Navy, Raytheon
• MILITARY SOFTWARE EXAMPLES:
  • Navy’s Tomahawk navigation package
  • Raytheon’s Patriot target acquisition package
  • Worldwide Military Command and Control System (WWMCCS)
MILITARY SOFTWARE ASSESSMENTS
• MILITARY SOFTWARE STRENGTHS (ABOVE AVERAGE)
  • Software process assessments using the SEI approach
  • Extensive use of macro cost estimation tools (COCOMO)
  • Effective software reuse programs
  • Effective software COTS programs
  • Most complete requirements specifications
  • Effective change control of all deliverables
  • Very good software quality
MILITARY SOFTWARE ASSESSMENTS
• MILITARY SOFTWARE WEAKNESSES (BELOW AVERAGE)
  • Lags in use of function point metrics
  • Lags in use of activity-level cost estimation tools
  • Lags in quality estimation
  • Lags in productivity measurements
  • Lowest productivity levels of any industry
  • Creates the largest volumes of paperwork ever recorded
  • Often troubled by downsizings and layoffs
  • Frequent litigation
MILITARY SOFTWARE FACTORS

Success Factors                        Failure Factors
• Contract let without litigation      • Contract challenged in court
• Project meets DoD standards          • Project violates DoD standards
• Project uses best practices          • Project uses worst practices
• Project meets requirements           • Project omits requirements
• Stable requirements (15%)            • Requirements unstable
• Predictable schedules (10%)          • Schedules out of control
• Predictable costs (10%)              • Costs out of control
• Project passes CDR                   • Project fails CDR
• Project actually used                • Project is not used
MIS SOFTWARE ASSESSMENTS
• COMPANIES BUILDING MIS SOFTWARE INCLUDE:
  • AT&T, Bank of Boston, CIGNA, Nielsen, State governments
• MIS SOFTWARE EXAMPLES:
  • Corporate billing systems
  • Insurance claims handling systems
  • State government pension management systems
  • Human resource systems
MIS SOFTWARE ASSESSMENTS
• MIS SOFTWARE STRENGTHS (ABOVE AVERAGE)
  • Highest productivity levels < 1,000 function points
  • Leads in use of function point metrics
  • Best productivity measurement systems and tools
  • Most effective sizing technologies
  • Use of JAD for requirements gathering
  • Use of geriatric tools for legacy applications
MIS SOFTWARE ASSESSMENTS
• MIS SOFTWARE WEAKNESSES (BELOW AVERAGE)
  • Lags in use of formal design and code inspections
  • Lags in use of testing specialists
  • Lags in use of software quality assurance (SQA)
  • Lags in quality measurements
  • Below average quality levels
  • Highest rate of failure > 10,000 function points
MIS FACTORS

Success Factors                        Failure Factors
• Adds new business capability         • Degrades business capability
• Returns positive value               • Returns null or negative value
• Improves operating speed             • Degrades operating speed
• Reduces operating costs              • Raises operating costs
• Enhances competitiveness             • Degrades competitiveness
• Meets all user requirements          • Omits major requirements
• High user satisfaction               • Low user satisfaction
• Stable requirements (5%)             • Unstable requirements
• Predictable schedules (5%)           • Schedules out of control
• Predictable costs (5%)               • Costs out of control
OUTSOURCE SOFTWARE ASSESSMENTS
• MAJOR U.S. OUTSOURCE COMPANIES INCLUDE:
  • AMS, Andersen, CSC, EDS, IBM, Lockheed, UNISYS
• OUTSOURCE SOFTWARE EXAMPLES:
  • Maintenance contracts for entire corporations
  • Year 2000 repairs for entire corporations
  • State government child support systems
  • State government motor vehicle registration systems
OUTSOURCE SOFTWARE ASSESSMENTS
• OUTSOURCE SOFTWARE STRENGTHS (ABOVE AVERAGE)
  • High productivity levels > 10,000 function points
  • Leader in use of function point metrics for contracts
  • Leader in project management tool usage
  • Leader in cost estimating tool usage
  • Leader in software reusability programs
  • Good productivity measurement systems and tools
  • Good internal training for managers and staff
OUTSOURCE SOFTWARE ASSESSMENTS
• OUTSOURCE SOFTWARE WEAKNESSES (BELOW AVERAGE)
  • Creeping requirements are very troublesome
  • Contract baselines are often ambiguous and troublesome
  • Quality levels may slip if schedule pressures are severe
  • More unpaid overtime than any other industry
  • May have high staff turnover rates
  • Problems mount as subcontractors increase
  • Litigation may occur due to overruns
OUTSOURCE SOFTWARE FACTORS

Success Factors                        Failure Factors
• Contract ends without litigation     • Contract ends with litigation
• Contract benefits both parties       • Contract harmful to one or both
• Contract costs are acceptable        • Contract costs excessive
• Software works well at delivery      • Software delivery problems
• Software is maintainable             • Software unmaintainable
• Meets all user requirements          • Major requirements left out
• Stable requirements (5%)             • Unstable requirements
• Predictable costs (5%)               • Costs out of control
• Predictable schedules (5%)           • Schedules out of control
• High quality (> 95% removal)         • Poor quality (< 85% removal)
COMMERCIAL SOFTWARE ASSESSMENTS
• COMPANIES BUILDING COMMERCIAL SOFTWARE INCLUDE:
  • Artemis, Computer Associates, IBM, Microsoft, Oracle, SAP
• COMMERCIAL SOFTWARE EXAMPLES:
  • Microsoft Office 97
  • SAP R/3
  • SPR KnowledgePlan
  • CA Unicenter
COMMERCIAL SOFTWARE ASSESSMENTS
• COMMERCIAL SOFTWARE STRENGTHS (ABOVE AVERAGE)
  • Leader in software change control
  • Leader in software user satisfaction measurements
  • Leader in software testing
  • Most extensive libraries of test cases
  • Most extensive suites of test tools
  • Most extensive suites of development tools
  • Most sophisticated nationalization tools and methods
COMMERCIAL SOFTWARE ASSESSMENTS
• COMMERCIAL SOFTWARE WEAKNESSES (BELOW AVERAGE)
  • Lags in use of function point metrics
  • Lags in use of cost estimation tools
  • Lags in use of quality estimation tools
  • Lags in productivity measurements
  • Lags in quality measurements
  • Lags in use of design and code inspections
  • Massive amounts of unpaid overtime
  • Significant turnover rates and morale issues
COMMERCIAL SOFTWARE FACTORS

Success Factors                        Failure Factors
• Product has high market share        • Product has low market share
• Product has high user scores         • Product gets low user scores
• Product is profitable                • Product loses money
• Product wins in litigation           • Product loses in litigation
• Product features protected           • Product readily copied
• Features beat competition            • Features lag competition
• Short time to market                 • Long time to market
• High quality levels                  • Poor quality levels
• Leads to follow-on business          • No follow-on business
• Excellent customer support           • Poor customer support
• Excellent customer references        • Poor customer references
END-USER SOFTWARE ASSESSMENTS
• COMPANIES BUILDING END-USER SOFTWARE INCLUDE:
  • All corporations and most government agencies
• END-USER SOFTWARE EXAMPLES:
  • Excel spreadsheets for financial analysis
  • Visual Basic applications for competitive analysis
  • SQL query strings for personnel data
END-USER SOFTWARE ASSESSMENTS
• END-USER SOFTWARE STRENGTHS (ABOVE AVERAGE)
  • Highest productivity levels < 100 function points
END-USER SOFTWARE ASSESSMENTS
• END-USER SOFTWARE WEAKNESSES (BELOW AVERAGE)
  • Lags in use of formal design and code inspections
  • Lags in testing tools and methods
  • Worst quality levels of any form of software
  • Lacks written requirements, specifications, and plans
  • Maintenance may be impossible
  • Ownership of end-user software may be uncertain
END-USER FACTORS

Success Factors                        Failure Factors
• Adds new business capability         • Degrades business capability
• Returns positive value               • Returns null or negative value
• Improves operating speed             • Degrades operating speed
• Reduces operating costs              • Raises operating costs
• Does not harm other packages         • Damages other software packages
• Does not corrupt data bases          • Damages data bases
FACTORS THAT INFLUENCE SOFTWARE
• MANAGEMENT FACTORS
• SOCIAL FACTORS
• TECHNOLOGY FACTORS
• COMBINATIONS OF ALL FACTORS
MANAGEMENT FACTORS

Worst-case scenario                     Probability of Selected Outcomes
                                        Cancel   Delays   On time   Early
1) Manual estimates                       40%      45%      15%       0%
   Manual plans
   Informal tracking
   Minimal quality control
MANAGEMENT FACTORS (cont.)

Single-factor scenarios                 Probability of Selected Outcomes
                                        Cancel   Delays   On time   Early
2) Manual estimates                       37%      42%      20%       1%
   Automated plans
   Informal tracking
   Minimal quality control

3) Manual estimates                       35%      39%      24%       2%
   Manual plans
   Formal tracking
   Minimal quality control

4) Automated estimates                    33%      36%      28%       3%
   Manual plans
   Informal tracking
   Minimal quality control

5) Manual estimates                       30%      32%      34%       4%
   Manual plans
   Informal tracking
   Optimal quality control
MANAGEMENT FACTORS (cont.)

Two-factor scenarios                    Probability of Selected Outcomes
                                        Cancel   Delays   On time   Early
6) Manual estimates                       27%      28%      40%       5%
   Automated plans
   Formal tracking
   Minimal quality control

7) Automated estimates                    23%      26%      45%       6%
   Automated plans
   Informal tracking
   Minimal quality control

8) Automated estimates                    20%      23%      50%       7%
   Manual plans
   Formal tracking
   Minimal quality control
MANAGEMENT FACTORS (cont.)

Two-factor scenarios                    Probability of Selected Outcomes
                                        Cancel   Delays   On time   Early
9) Manual estimates                       18%      20%      54%       8%
   Automated plans
   Informal tracking
   Optimal quality control

10) Manual estimates                      16%      17%      58%       9%
    Manual plans
    Formal tracking
    Optimal quality control

11) Automated estimates                   13%      15%      62%      10%
    Manual plans
    Informal tracking
    Optimal quality control
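Read as data, the eleven scenarios form a lookup from management practices to outcome probabilities. The sketch below is illustrative only (the dictionary layout and function name are this document's invention); the percentages are copied from scenarios 1 through 5 above, and the two-factor scenarios 6 through 11 would follow the same pattern.

```python
# Illustrative sketch of the scenario tables above as a lookup structure.
# Percentages are copied from scenarios 1-5; the data layout is this document's own.

SCENARIOS = {
    # (estimates, plans, tracking, quality control) -> outcome probabilities (%)
    ("manual", "manual", "informal", "minimal"):    {"cancel": 40, "delays": 45, "on_time": 15, "early": 0},
    ("manual", "automated", "informal", "minimal"): {"cancel": 37, "delays": 42, "on_time": 20, "early": 1},
    ("manual", "manual", "formal", "minimal"):      {"cancel": 35, "delays": 39, "on_time": 24, "early": 2},
    ("automated", "manual", "informal", "minimal"): {"cancel": 33, "delays": 36, "on_time": 28, "early": 3},
    ("manual", "manual", "informal", "optimal"):    {"cancel": 30, "delays": 32, "on_time": 34, "early": 4},
}

def outcome_odds(estimates: str, plans: str, tracking: str, quality: str) -> dict:
    """Return the slide's outcome probabilities for one combination of practices."""
    return SCENARIOS[(estimates, plans, tracking, quality)]

# Optimal quality control alone (scenario 5) raises the on-time odds from 15% to 34%.
print(outcome_odds("manual", "manual", "informal", "optimal"))
```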