CANADIAN PATROL FRIGATE SOFTWARE MAINTENANCE TESTING BY LCDR T.L. WILLIAMS
CPF SOFTWARE • 1,100,000 SLOC • 223 MODULES • 10 YEAR DEVELOPMENT • TESTED AS PART OF SIX YEAR AT SEA TRIALS PROGRAM • BUILD 36 FINAL CPF CONTRACT BUILD
CPF MAINTENANCE CONTRACT ISSUES • GOVERNMENT OWNED CONTRACTOR OPERATED (GOCO) • MOVE PGC TO HALIFAX • CM REMAINED IN MONTREAL • NAVY VERSION 1.0 PROMISE • DND REORGANIZATION/STAFFING • TRANSITION OF CPF SOFTWARE TO IN-SERVICE AUTHORITY
HSSF ORGANIZATION (org chart): NAVAL COMMUNITY TECHNICAL AUTHORITY • SOFTWARE MANAGER • ADMIN ASSISTANT • SMaRT • QA • CM • SENSORS SUPPORT • TEWA • COMMS • COMMAND SUPPORT • ASW • PROGRAM GENERATION CENTER • TRACK MANAGEMENT
TESTING CPF MAINTENANCE CONTRACT STATED: “TESTING SHALL BE AT THE SAME LEVEL AS THE CPF PRIME CONTRACT”
NAVY VERSION 1.0 PROBLEMS • REQUIREMENTS • WORK SPLIT BETWEEN SITES • POOR PLANNING • FACILITY SCHEDULING • VERSION RELEASE DOCUMENT
NAVY VERSION 1.0 PROBLEMS (CONT) • RELEASING AUTHORITY • CONTRACTOR TESTING • AT SEA TESTING • NUMEROUS DEFECTS • STAFFING/EXPERIENCE LEVELS • NO MEANINGFUL METRICS COLLECTED
SUCCESS OR FAILURE? • POLITICALLY FACILITY WAS JUDGED TO BE A SUCCESS • STANDING UP THE FACILITY • SUCCESS • PRODUCING THE FIRST IN-SERVICE BUILD • FAILURE
CHANGES TO FACILITY • REQUIREMENTS GROUP STOOD UP • CONTINUOUS IMPROVEMENT GROUP ESTABLISHED • INDEPENDENT TESTING GROUP ESTABLISHED • INCREASED STAFFING FOR SMaRT
CHANGES TO FACILITY (CONTINUED) • MINI-SYSTEM INSTALLED • AUTOMATED CONFIGURATION MANAGEMENT • CLOSER TIES TO REQUIREMENTS GROUP DEVELOPED • TRANSITION OF SOFTWARE TO IN-SERVICE AUTHORITY • FACILITY REORGANIZED
HSSF(H) REORGANIZATION (org chart): NAVAL COMMUNITY TECHNICAL AUTHORITY • CSST(H) • SOFTWARE MANAGER • SOFTWARE SYSTEM ENGINEER • ADMIN ASSISTANT • SOFTWARE DEVELOPMENT MANAGER • SPECIAL PROJECTS MANAGER • INTEGRATED SUPPORT MANAGER • QUALITY ASSURANCE • CONFIGURATION MANAGEMENT • TESTING • INFORMATION TECHNOLOGY • ASW/COMM/TM • TEWA/SENSOR • LINK SUPPORT • COMMAND SUPPORT
SOFTWARE SYSTEMS ENGINEER RESPONSIBILITIES • ASSESS IMPACT OF SOFTWARE AND SYSTEM CHANGES • COORDINATE SYSTEM STUDIES AND INVESTIGATIONS • REVIEW TEST PLANS AND TEST PROCEDURES • OBSERVE ALL FORMAL TESTS
QUALITY ASSURANCE • QUALITY MANAGEMENT • DOCUMENT, V & V SOFTWARE PROCESS • INCREASE PROCESS CAPABILITY AND MATURITY • RAISE QUALITY AWARENESS AND PROVIDE TRAINING • CONTINUOUS IMPROVEMENT
QUALITY ASSURANCE (CONTINUED) • QUALITY CONTROL • VERIFY ALL DEFECTS AND ENHANCEMENTS ARE DOCUMENTED AND TRACKED • WITNESS SPR CLOSURES, DELTA, AND SYSTEM TESTS • MONITOR SOFTWARE BUILDS AND DELIVERY PREPARATION
CONTINUOUS IMPROVEMENT HIGHLIGHTS • REQUIREMENTS REVIEW • PEER REVIEWS • TRAINING • 247 HOURS PROGRAMMERS • 173 HOURS TESTERS • NEW VERSION RELEASE PROCESS • INCREMENTAL BUILDS • NEW TESTING PROCESS
SOFTWARE TEST PROCESS (flow diagram): DEVELOPMENT TESTING (PROGRAM DEBUG, SPR CLOSURE, DELTA TEST, SYSTEM TEST) • VERSION TESTING (VERSION SURVIVABILITY TEST, BETA SITE TESTING) • OPERATIONAL EVALUATION (SHIPBOARD ALONGSIDE TESTING, SHIPBOARD AT SEA TESTING)
BETA TESTING • OCCURS ON A CONTINUOUS BASIS • CFNOS AND CFNES INVOLVED • DEFECTS ARE DOCUMENTED • ATTEMPT TO REPRODUCE ON PREVIOUS VERSION • FREE PLAY
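The triage rule on this slide (attempt to reproduce each beta defect on the previous version) can be sketched as a small classifier. This is an illustrative sketch only; the function and labels are assumptions, not CPF program artifacts.

```python
# Sketch of the beta-test triage rule: a defect that also reproduces
# on the previous software version is pre-existing, not a regression
# introduced by the new build. Names are illustrative.

def classify_defect(repro_on_current: bool, repro_on_previous: bool) -> str:
    """Classify a beta-site defect report."""
    if not repro_on_current:
        return "cannot reproduce"
    return "pre-existing" if repro_on_previous else "introduced in new version"

print(classify_defect(True, True))   # pre-existing
print(classify_defect(True, False))  # introduced in new version
```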
VERSION TESTING • VERSION SURVIVABILITY TEST • COMPARES SYSTEM PERFORMANCE WITH PREVIOUS VERSION PERFORMANCE • FULLY AUTOMATED • FOCUSES ON MISSILE THREAT PROFILES, TEWA PROCESSING • REPORT PRODUCED AND REVIEWED
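An automated survivability test that compares system performance against the previous version could take roughly this shape. A minimal sketch, assuming per-scenario performance scores; the scenario names and tolerance are hypothetical, not drawn from the CPF test suite.

```python
# Hypothetical survivability-style regression check: compare per-scenario
# scores from the current build against the previous build and flag any
# degradation beyond a tolerance. All data here is illustrative.

TOLERANCE = 0.05  # allow up to 5% degradation before flagging

def compare_versions(previous, current, tolerance=TOLERANCE):
    """Return (scenario, detail) pairs where performance regressed."""
    regressions = []
    for scenario, prev_score in previous.items():
        curr_score = current.get(scenario)
        if curr_score is None:
            regressions.append((scenario, "missing in current build"))
        elif curr_score < prev_score * (1 - tolerance):
            regressions.append((scenario, f"{prev_score} -> {curr_score}"))
    return regressions

previous = {"sea_skimmer_profile": 0.92, "tewa_full_load": 0.88}
current = {"sea_skimmer_profile": 0.93, "tewa_full_load": 0.70}
print(compare_versions(previous, current))
# flags tewa_full_load as a regression
```

The report this produces is the kind of artifact the slide says is "produced and reviewed" after each run.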
DELTA TEST • INTEGRATION TESTING • WHITE BOX FOCUS • DEMONSTRATES ALL CHANGES FUNCTION CORRECTLY IN THE SAME BUILD • TEST REPORT PRODUCED/REVIEWED
SYSTEM TEST • NORMALLY STATIC • FOUR PHASES • EW/ASW (8 HRS) • AWW/RADAR (8 HRS) • FULL LOAD TRACKING (2 HRS) • OVERALL (6 HRS) • REPORT PRODUCED/REVIEWED
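The four phases above add up to a full day of formal system testing per build, which can be tallied directly. The phase names and durations come from the slide; the code itself is only an illustration.

```python
# Tally the four system-test phases listed on the slide.
phases = {
    "EW/ASW": 8,
    "AWW/RADAR": 8,
    "FULL LOAD TRACKING": 2,
    "OVERALL": 6,
}

total_hours = sum(phases.values())
print(total_hours)  # 24 hours of formal system testing per build
```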
TESTING IMPROVEMENTS • TEST CELL ENHANCEMENT STUDY • FULLY DOCUMENT PROCESS • DEVELOPMENT AND USE OF METRICS • FORMALIZATION OF PEER REVIEWS • AUTOMATED TESTING • DEFECT REPORTING • DEFECT DATA COLLECTION • TESTING/PROGRAMMING TOOLS
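The defect-data-collection item in the study calls for metrics that can be reviewed over time. One minimal sketch, assuming software problem reports (SPRs) recorded as simple records, is a per-module tally of open defects; the field names and modules here are assumptions.

```python
# Illustrative defect-data collection: count open software problem
# reports (SPRs) per module so defect trends can be reviewed.
from collections import Counter

def defect_density(sprs):
    """Count open SPRs per module from a list of report records."""
    return Counter(spr["module"] for spr in sprs if spr["status"] == "open")

sprs = [
    {"module": "TEWA", "status": "open"},
    {"module": "TEWA", "status": "closed"},
    {"module": "TRACK MANAGEMENT", "status": "open"},
]
print(defect_density(sprs))
```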
CONCLUSION • INITIAL TESTING POOR • TRYING TO DO TOO MUCH AT ONCE • TESTING PROCESS NOW IN PLACE • OBTAINING REPEATABLE RESULTS • FINE TUNING PROCESS STAGE