Six Sigma CL Testing Quality Delivery Rerun Reduction Project. Final Deck. Submitted by: Dave Jarvas & Jennifer Nortier, January 31, 2007
Six Sigma Rerun Reduction DMAEC: Define
Project Charter: CL QD Test Execution Rerun Reduction
Prepared by: Jennifer Nortier / Dave Jarvas. Champion: Greg Geppert. Key Stakeholders: Quality Delivery, Kanbay offshore testers, HTS, P&C
Problem Statement & Goal
• Quality Delivery utilizes Kanbay offshore testers to execute test plans. Test plan volumes are currently overstated by 35% to account for rework in the process, driving the need for more resources. The majority of this 35% is attributable to test plans that must be rerun during test execution; reruns require offshore testers to re-execute the same test plan more than once instead of moving on to unexecuted test plans.
• Reduce the Kanbay rerun rate by 40% by the December 2006 release. (February baseline: 47%. To achieve this goal, the rerun rate must be 28% or below.)
• Reduce Kanbay offshore expense by 19% per month. (Assumptions: 56 Kanbay testers/month at $2,000/tester/month, with an average of 500 test plans/month, giving an average cost per test plan of $224.)
Business Case
Quality Delivery is challenged to do more testing with fewer resources without sacrificing quality. Reducing reruns will increase productivity, leading to a reduction in offshore expense. With the increased productivity, Quality Delivery would have the bandwidth to introduce more projects into production to satisfy customer needs.
Project Milestones
• DEFINE: Project Charter signed off by 5/17/2006; tollgate 7/31/2006
• MEASURE: Tollgate 8/28/2006
• ANALYZE: 10/30/2006
• ENGINEER: 12/15/2006
• CONTROL: Beginning with the February 2007 release
Metrics
• Detailed rerun rates, including the last rerun reason, from all releases beginning with February 2006 to present
• Offshore chargeback figures
• Offshore productivity measures
• Onshore & offshore project resource allocations
• Time Tracker for onshore TC time
Project Scope & Background
• Scope: The focus of this project is reducing rerun rates for QD test plans executed by the Kanbay testers. All Test Sets are included. Solutions will focus on areas within QD's control.
• Resources:
• Quality Delivery Manager: Craig Manning (5%)
• Quality Delivery Test Coordinators: Valerie Forbes & Moe Olia (5%)
• Quality Delivery Process Engineering: Traci Keiner (10%)
• HTS: Beth Young (3%, ad hoc member)
• Offshore: Suresh Ganesh (5%)
• P&C: Kimsi Hazariani (3%, ad hoc member)
• Barriers: Core members not available to allocate time to the project.
Six Sigma Rerun Reduction DMAEC: Measure
Data Collection: ReRun Trend. Data collection focused on rerun data, including volume and rerun reasons. This data was available from Quality Center beginning with the February 2006 release. The graph below shows the rerun volume and the percentage of test plans that were rerun in each release.
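As an illustrative sketch, the rerun percentage plotted in the trend graph is simply reruns divided by total executed test plans for each release. The per-release counts below are hypothetical placeholders, except that the February figure is chosen to match the 47% baseline stated in the charter.

```python
# Sketch of the per-release rerun-rate metric behind the trend graph.
# Counts are hypothetical examples; Feb 2006 matches the 47% charter baseline.
releases = {
    # release: (total test plans executed, test plans rerun)
    "Feb 2006": (500, 235),
    "Mar 2006": (480, 210),
    "May 2006": (520, 200),
}

for release, (total, rerun) in releases.items():
    rate = rerun / total * 100  # rerun % = reruns / total executed
    print(f"{release}: {rerun}/{total} test plans rerun ({rate:.0f}%)")
```

The same ratio is what the daily Quality Center reports would roll up per release.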
Data Collection- ReRun Reason Stats The table below is an example of our daily ReRun Execution Report that displays the percentage of ReRun reasons for each project in a release. February, March, and May data were collected and used for the ReRun Reason Pareto chart (shown in the Analyze section of this deck).
Six Sigma Rerun Reduction DMAEC: Analyze
ReRun Reason Pareto Chart. A Pareto chart was used to identify the specific rerun reasons to drill into for root causes. System Defects are outside of our control, so we focused on Execution Error, Invalid Test Data, and Environmental Error. (Includes the February, March, and May 2006 releases.)
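The Pareto ordering behind the chart can be sketched as follows. The reason categories come from the deck, but the counts are hypothetical, since the chart's underlying numbers are not given here.

```python
# Sketch of a Pareto breakdown of rerun reasons (counts are hypothetical).
reason_counts = {
    "System Defect": 180,
    "Execution Error": 120,
    "Invalid Test Data": 90,
    "Environmental Error": 60,
    "Other": 30,
}

# Pareto ordering: sort reasons by frequency, highest first,
# then report each reason's share and the cumulative share.
ranked = sorted(reason_counts.items(), key=lambda kv: kv[1], reverse=True)
total = sum(reason_counts.values())
cumulative = 0
for reason, count in ranked:
    cumulative += count
    print(f"{reason:20s} {count:4d}  {count/total:6.1%}  cum {cumulative/total:6.1%}")
```

With numbers like these, the top few reasons dominate the cumulative share, which is exactly why the project drilled into the largest controllable categories.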
ReRun Reason 5 Why’s: Execution Error
The 5-Why diagram traces Execution Error back through the following causal chains (Why 1 through Why 5):
• Functional spec omission/change, driven by functionality changes, scope changes (lumped into another project), PCRs missing requirements, and lack of clarity in the BRD; deeper causes include requirements based on outdated functionality, lack of knowledge on the part of the requirements author, lack of communication across the board, and not enough training.
• User did not follow process, because the user was unclear on the process and lacked software knowledge; root cause: not enough training.
• Steps not clear and the tester made assumptions, due to lack of application knowledge and the tester not fully understanding the app/system; root cause: not enough training.
ReRun Reason 5 Why’s: Invalid Test Data
The 5-Why diagram traces Invalid Test Data back through two causal chains (Why 1 through Why 5):
• Wrong conditions on the test beds, due to lack of product knowledge; products change frequently, driven by strategic decisions and business needs.
• Program codes & law records not set up properly, because P&C timelines do not match up to QD timelines; behind this are competing priorities and no single voice on priorities (communication is needed at the release level).
ReRun Reason 5 Why’s: Environmental Error
The 5-Why diagram traces Environmental Error back through three causal chains (Why 1 through Why 5):
• Application response time, due to inaccurate capacity planning driven by unexpected growth and budget constraints on network bandwidth.
• Test environment unavailable, because a server goes down due to hardware failure or maintenance/upgrades.
• Batch job not run, either because the Test Coordinator did not realize they needed to ask for a batch job to be run (inefficient communication between Test Coordinator, Release Coordinator, and HTS; lack of process/system knowledge) or because of user error in submitting the batch job (lack of communication, not knowing the right contacts); root cause in both cases: not enough training.
Root Causes
Root causes for the three rerun reasons can be summarized as follows:
• Not enough training on systems, processes, and the business
• Lack of (or not enough) communication between areas
Solutions will center around these root causes.
Root causes identified as outside of our control (and thus outside the scope of this project) came out of the Environmental Error reason:
• Budget
• Network capacity planning
• Unexpected growth
• Hardware failure
• Maintenance/upgrades
Six Sigma Rerun Reduction DMAEC: Engineer
ReRun Reduction Proposed Solutions
Execution Errors (root causes: not enough training & lack of communication)
• Expand the SOP to include specific definitions behind each rerun reason and when a rerun is warranted. Provide specific examples.
• Review offshore training material and confirm process documentation matches the SOP.
• Conduct a Test Coordinator knowledge transfer after each release, focused on insights gained on products, workflow, and applications.
• Create a consistent process for offshore communication on when to continue executing the current run and when to rerun.
Invalid Test Data (root causes: lack of product knowledge & no single voice on priorities)
• Add a PM milestone for project team members to share all product information impacted by the project with the entire team.
• Utilize the new P&C tool "CARS" for looking up existing product information.
• Have the QD Release Coordinator gather all P&C requirements and send a prioritized list with dates to a designated P&C coordinator.
• Create a program code / law record checklist for P&C (Mary Gerred is currently working on this).
• Conduct a refresher course for QD and offshore on SAFE & DCTs (how to look up items for their project and double-check that setup was done).
• Keep the Project Manager and Test Coordinator on the same page regarding expectations and timelines, with clear and consistent communication.
Environmental Errors (root causes: not enough training & documentation on batch processes)
• Confirm QD has a release dependency matrix that outlines the batch jobs that need to be run for each release.
• Add a PM milestone for HTS to confirm which batch jobs are required for the specific project.
• Obtain a list from HTS of the "On Request Jobs" with a short description of what each is used for.
• Create a batch job / project matrix to be used within QD, updated by the Test Coordinators and stored in the QD Asset Library for continual reference.
• Use the daily 3:00pm release call with HTS to specifically remind the team (HTS & PPD) of any batch jobs that are to be run that night.
Prioritizing Solutions
What can be implemented the fastest and provide the biggest impact?
• All items outlined under Execution Errors, except the Test Coordinator knowledge transfer, could be done within 30 days.
• Use HTS & PPD release calls for reminders of batch job runs expected that night.
What will require time to create and set up?
• Training classes for "CARS" and SAFE/DCTs.
• Batch job / project matrix.
• List of HTS "On Request Jobs".
What requires ongoing discussion (longer-term solutions)?
• Adding new PM milestones.
• A designated P&C coordinator.
Please note that the process flow did not change; therefore the "As Is" process is the same as the "Should Be".
Financial Benefit Analysis
By reducing the QD rerun rate by 40% (moving from 47% to 28%), we expect to reduce Kanbay offshore expense by 19% per month.
Assumptions:
• 56 Kanbay testers/month at $2,000/tester/month
• Average of 500 test plans/month
• Average cost per test plan = $224
Calculation:
• 500 × 1.47 (Feb rerun rate) × $224 = $164,640 cost/month
• 500 × 1.28 (goal rerun rate) × $224 = $143,360 new cost/month
Benefit: $164,640 - $143,360 = $21,280 savings/month
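The arithmetic above can be reproduced directly from the deck's stated assumptions. One reading consistent with the numbers: the $224 per test plan is the $112,000 monthly offshore expense (56 testers × $2,000) spread over 500 test plans, and the 19% savings is the $21,280 benefit relative to that $112,000 baseline.

```python
# Reproducing the deck's financial benefit arithmetic (all inputs from the deck).
testers = 56
cost_per_tester = 2000            # $/tester/month
plans_per_month = 500
cost_per_plan = testers * cost_per_tester / plans_per_month  # = $224

baseline_rerun = 0.47             # February baseline rerun rate
goal_rerun = 0.28                 # 40% reduction target

# Each rerun re-executes a test plan, so effective volume = plans * (1 + rerun rate)
baseline_cost = plans_per_month * (1 + baseline_rerun) * cost_per_plan  # $164,640
goal_cost = plans_per_month * (1 + goal_rerun) * cost_per_plan          # $143,360
savings = baseline_cost - goal_cost                                     # $21,280

offshore_expense = testers * cost_per_tester                            # $112,000
print(f"Monthly savings: ${savings:,.0f} "
      f"({savings / offshore_expense:.0%} of offshore expense)")
```

Running this confirms the $21,280/month figure and that it equals 19% of the $112,000 baseline offshore expense.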
Six Sigma Rerun Reduction DMAEC: Control
Next Steps
Continue Implementation
• Focus on the outstanding items to strengthen the test execution process and minimize reruns.
• Review long-term solutions and ensure they remain viable as the organization changes.
• Ensure lessons learned and process improvements are implemented consistently with our new GSC team. (The scope of the project included only the Kanbay testing team, which will be gone by the end of 2007.)
Continue Release Metrics
• Quality Center reports are produced daily and analyzed by QD Management, the Process Support Team, and Offshore Management. Continued focus will be on daily rerun review.
• Continue reviewing metrics broken out by testing team, allowing us to compare rerun stats for GSC versus Kanbay.
Evaluate Performance
• Halfway through our ReRun Reduction project, Quality Delivery's direction changed from utilizing Kanbay to GSC. Both GSC and Kanbay performance will be analyzed with every release; the release reporting package shows trends on several key performance indicators, including reruns.
• Include reruns in the PIR (Post Implementation Review) completed after each release, which focuses on what went well and what needs to change. This forum will be used to identify additional areas of opportunity to reduce re-execution.