
Team Software Project (TSP) June 26, 2006 System Test


Presentation Transcript


  1. Team Software Project (TSP) June 26, 2006 System Test SE 652- 2007_6_26_TestResults_PSMp1.ppt

  2. Outline
     • Remaining Session Plan & Discussion
     • System Test Plan Discussion
     • Mythical Man Month
     • System Test Plan Recap
     • Metrics Presentations
     • More on Measurement
     • Next Phases
       – Cycle 1 Test
       – Cycle 1 Post-Mortem & Presentations
       – Cycle 2 Plan & Strategy

  3. Due Today
     • Key Metrics Presentation (10-15 minutes)
     • All Implementation Quality Records (LOGD, CCRs, etc.)
     • Final code (source & executable)
     • Updated Products (code components, SRS, HLD, User Documentation)
     • Intermediate Products (e.g. Unit Test Plans)
     • Configuration Management Plan
     • Release CD
       – Application
       – User Guide
       – Release Letter
     • No class on July 3

  4. Project Performance Discussion

  5. Remaining Lectures Plan/Discussion
     • July 10 – Cycle 1 Test Complete & Post-Mortem
       – Cycle 1 Results Presentation & Discussion
       – Cycle 1 Reports & Post-Mortem
       – Measurement
       – Team audit
     • July 17 – Cycle 2 Launch
       – Cycle 2 Launch, Project & Measurement Planning
       – Peopleware Topics: Management, Teams, Open Kimono, Quality, Hiring/Morale, …
     • July 24 – Cycle 2 Requirements Complete
       – Cycle 2 Requirements
       – Death March Projects
     • July 31 – Cycle 2 Implementation Complete
       – System Test Plan Baselined
       – Cycle 2 Design & Implementation
       – Process topics – CMMI, TL-9000, ISO
     • August 7 – Cycle 2 Test Complete
       – Cycle 2 Test Complete
       – Cycle 2 Post-Mortem Complete
     • August 14 – Course Review
       – Course Review
       – Class exercise
       – Final

  6. Remaining Course Topics Discussion

  7. System Test Schedule
     Note: assumes the system has already passed Integration Test
     • Full feature to system test and instructor by COB June 25, including:
       – Test environment
       – Executable
       – User documentation (note: CCRs can be filed against user documentation)
       – Source code
     • Tester generates CCRs for all finds & fills out LOGTEST
       – Email to instructor when generated (see below)
     • Development team updates LOGD referencing CCRs
     • Required turn-around times for fixes
       – 80% within 24 hours
       – 99% within 48 hours
     • Required test coverage short of blocking issues
       – 80% First Pass Test Complete by June 28
       – 100% First Pass Test Complete by July 1
       – Regression Test Complete by July 3
     • Daily test reports to instructor detailing test cases executed, results & CCRs

  8. System Test Plan Recap
     Areas to cover:
     • Installation
     • Start-up
     • All required functions available & working as specified
     • Diabolical (e.g. power failures, corner cases, incorrect handling)
     • Performance
     • Usability
     Includes:
     • Test cases you plan to run (numbered / named)
     • Expected results
     • Ordering of testing & dependencies
     • Supporting materials needed
     • Traceability to requirements

  9. Release “Letters”
     Purpose
     What’s in it?
     • Version Information
     • Release contents
       Examples:
       – All functionality defined in Change Counter Requirements v0.6 except GUI
       – Phase 1 features as defined in project plan x.y
       – Feature 1, Feature 2, Feature 3 as defined by …
     • Known Problems
       – Change Request IDs w/ brief customer-oriented description
     • Fixed Problems
     • Upgrade Information
     • Other?

  10. Implementation Status
     • Implementation experience
     • Unit/Integration experience
     • Problems / Rework?
     • PIP forms

  11. Team Presentation

  12. Project Measurement
      Source: Practical Software Measurement, John McGarry et al.

  13. Measurement
      “If you can’t measure it, you can’t manage it” – Tom DeMarco

  14. Fundamentals
     • Don’t try to measure everything
     • Align measures with:
       – Project goals & risks (basic survival mode)
       – Process improvement areas (continual improvement mode)
     • Define the measurement program up front
     • Monitor continuously & take action where needed

  15. Applications
     • Improve accuracy of size & cost estimates
     • Improve quality
     • Understand project status
     • Produce more predictable schedules
     • Improve organizational communication
     • Faster, better informed management decisions
     • Improve software processes

  16. Basic In-Process Measurement Examples
     • Schedule
       – Earned Value vs. Planned Value
       – Schedule Variance
     • Development Task completion
       – Actual code completed vs. planned
     • Project End Game
       – Defect Creation vs. Closure (variations: severity)
     • System Test
       – % Testing Complete (variations: passed, failed, blocked)
       – Test Time / Defect
       – Test Coverage (vs. requirements, white box code coverage)
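The schedule measures above fall out of the team's task plan with a few lines of arithmetic. A minimal sketch, not from the slides: the field names (planned_hours, planned_finish, done) and the task values are hypothetical stand-ins for whatever the team's task and schedule forms actually record.

```python
# Illustrative sketch only: earned value, planned value, and schedule variance
# computed from a task list. Field names and values are hypothetical.

def earned_value(tasks):
    """Planned hours of the tasks actually completed so far."""
    return sum(t["planned_hours"] for t in tasks if t["done"])

def planned_value(tasks, as_of):
    """Planned hours of the tasks scheduled to finish by the status date."""
    return sum(t["planned_hours"] for t in tasks if t["planned_finish"] <= as_of)

def schedule_variance(tasks, as_of):
    """EV - PV; a negative value means the team is behind schedule."""
    return earned_value(tasks) - planned_value(tasks, as_of)

tasks = [
    {"planned_hours": 8,  "planned_finish": "2006-06-20", "done": True},
    {"planned_hours": 12, "planned_finish": "2006-06-24", "done": False},
    {"planned_hours": 6,  "planned_finish": "2006-06-26", "done": True},
]
print(schedule_variance(tasks, as_of="2006-06-26"))  # 14 - 26 = -12 hours behind plan
```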

  17. Process Improvement Measurement Examples
     • Quality
       – Defect density
       – Post Deployment defect density
     • Inspection Effectiveness
       – Defects / inspection hour
     • Estimation Accuracy
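Each of these is just a ratio of two base measures. A hedged sketch for illustration; every number below is invented and does not come from the project data.

```python
# Hedged sketch: defect density, inspection effectiveness, and estimation
# accuracy computed from example counts. All values are made up.
defects_found = 23        # defects logged during the cycle (example)
size_kloc = 4.6           # new/changed KLOC (example)
inspection_defects = 15   # defects found during inspections (example)
inspection_hours = 18.0   # total inspection effort in hours (example)
estimated_hours = 400.0   # planned effort (example)
actual_hours = 460.0      # actual effort (example)

defect_density = defects_found / size_kloc                         # defects per KLOC
inspection_effectiveness = inspection_defects / inspection_hours   # defects per inspection hour
estimation_accuracy = actual_hours / estimated_hours               # > 1.0 means underestimated

print(f"Defect density:           {defect_density:.1f} defects/KLOC")
print(f"Inspection effectiveness: {inspection_effectiveness:.2f} defects/hour")
print(f"Estimation accuracy:      {estimation_accuracy:.2f}x of estimate")
```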

  18. Why Measure?
     • Support short & long term decision making
     • Mature software organizations (CMMI level?) use measurement to:
       – Plan & evaluate proposed projects
       – Objectively track actual performance against plan
       – Guide process improvement decisions
       – Assess business & technical performance
     • Organizations need the right kind of information, at the right time, to make the right decisions

  19. Measurement in Software Lifecycle
     • Plan
     • Do – carry out change
     • Check – observe effects of change
     • Act – decide on additional areas for improvement
     • Repeat
     Considerations: cost, schedule, capability, quality

  20. Measurement Psychological Effects
     • Measurement as measures of individual performance
     • Hawthorne Effect
     • Measurement Errors
       – Conscious: rounding, pencil whipping (i.e. false data entry)
       – Unintentional: inadvertent, or technique-related (i.e. consistent)

  21. Use of Measures
     • Process Measures – time oriented; includes defect levels, events & cost elements; used to improve the software development & maintenance process
     • Product Measures – deliverables & artifacts such as documents; includes size, complexity, design features, performance & quality levels
     • Project Measures – project characteristics and execution; includes # of developers, cost, schedule, productivity
     • Resource Measures – resource utilization; includes training, costs, speed & ergonomic data

  22. Glossary
     • Entity – object or event (e.g. personnel, materials, tools & methods)
     • Attribute – feature of an entity (e.g. # LOC inspected, # defects found, inspection time)
     • Measurement – numbers and symbols assigned to attributes to describe them
     • Measure – quantitative assessment of a product/process attribute (e.g. defect density, test pass rate, cyclomatic complexity)
     • Measurement Reliability – consistency of measurements assuming no change to method/subject
     • Software validity – proof that the software is trouble free & functions correctly (i.e. high quality)
     • Predictive validity – accuracy of model estimates
     • Measurement errors – systematic (associated with validity) & random (associated w/ reliability)
     • Software Metrics – approach to measuring some attribute
     • Defect – product anomaly
     • Failure – termination of the product’s ability to perform a required function

  23. PSM Measurement Process
     • Measurement Plan
       – Information need – e.g. What is the quality of the product? Are we on schedule? Are we within budget? How productive is the team?
       – Measurable Concept – measured entities to satisfy the need (abstract level, e.g. productivity)
       – Measurement Construct – what will be measured? How will data be combined? (e.g. size, effort)
       – Measurement Procedure – defines the mechanics for collecting and organizing data
     • Perform Measurement
     • Evaluate Measurement

  24. Measurement Construct
     [Diagram: attributes are quantified by measurement methods into base measures; a measurement function combines base measures into derived measures; an analysis model plus decision criteria turn the measures into an indicator]

  25. Attributes
     • Attribute – distinguishable property or characteristic of a software entity (entities: processes, products, projects and resources)
     • Qualitative or quantitative measure

  26. Base Measure
     • Base Measure – measure of an attribute (one-to-one relationship)
     • Measurement method – attribute quantification with respect to a scale
       – Method type: Subjective (e.g. high, medium, low), Objective (e.g. KLOC)
       – Scale: Ratio, Interval, Ordinal, Nominal
       – Unit of measurement: e.g. hours, pages, KLOC

  27. Derived Measure / Indicator
     • Derived Measure – function of 2 or more base measures
       – Measurement Function – algorithm for deriving the data (e.g. productivity = KLOC / developer hours)
     • Indicator – estimate or evaluation
       – Analysis Model – algorithm / calculation using 2 or more base &/or derived measures
       – Decision Criteria – numerical thresholds, targets, limits, etc., used to determine the need for action or further investigation
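To make the chain concrete, here is a minimal sketch of the slide's example measurement function (productivity = KLOC / developer hours) wrapped with a decision criterion. The threshold and the input values are invented for illustration; they are not course or PSM figures.

```python
# Sketch: two base measures -> measurement function -> derived measure,
# then a decision criterion turns the result into an indicator.
size_kloc = 3.2        # base measure: size, from a KLOC counter (example value)
effort_hours = 410.0   # base measure: effort, total developer hours (example value)

productivity = size_kloc / effort_hours   # measurement function: KLOC per developer hour

LOWER_LIMIT = 0.005    # decision criterion (hypothetical threshold, KLOC/hour)
if productivity < LOWER_LIMIT:
    print(f"Indicator: productivity {productivity:.4f} KLOC/hr is below the limit - investigate")
else:
    print(f"Indicator: productivity {productivity:.4f} KLOC/hr is within expectations")
```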

  28. Measurement Construct Examples
     • Productivity
       – Attributes: Hours, KLOC
       – Base Measures: Effort (count total hrs), Size (KLOC counter)
       – Derived Measure: Size / Effort = Productivity
       – Analysis Model: compute mean, compute standard deviation
       – Indicator: Productivity – mean w/ 2σ confidence limits
     • Quality
       – Attributes: Defects, KLOC
       – Base Measures: # Defects (count defects), Size (KLOC counter)
       – Derived Measures: # Defects / Size = Defect Rate
       – Indicator: Defect rate control – baseline mean, control limits & measured defect rate
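Both constructs above reduce to a few lines of arithmetic. A hedged sketch using only the Python standard library; the per-component productivities, defect counts, baseline mean, and control limit are all made-up illustration values.

```python
from statistics import mean, stdev

# Productivity indicator: mean with 2-sigma confidence limits
productivity = [0.006, 0.009, 0.007, 0.011, 0.008]   # KLOC/hour per component (example data)
m, s = mean(productivity), stdev(productivity)
print(f"Productivity: {m:.4f} KLOC/hr, 2-sigma limits {m - 2*s:.4f} .. {m + 2*s:.4f}")

# Defect rate indicator: measured rate compared to a baseline mean and control limit
defects, size_kloc = 23, 4.6
defect_rate = defects / size_kloc          # derived measure: defects per KLOC
baseline_mean, control_limit = 4.0, 6.0    # hypothetical baseline values
status = "out of control" if defect_rate > control_limit else "within limits"
print(f"Defect rate: {defect_rate:.1f}/KLOC vs baseline {baseline_mean:.1f} ({status})")
```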

  29. More Measurement Construct Examples
     • Coding
       – Base Measure: Schedule (w.r.t. coded units)
       – Derived Measure: Planned units, actual units
       – Analysis Model: Subtract units completed from planned units
       – Indicator: Planned versus actual units complete + variance
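A short worked version of this coding construct (and a possible starting point for the class exercise on the next slide); the unit counts are placeholders, not project data.

```python
# Sketch: planned-versus-actual coding progress with a simple variance.
planned_units = 14   # units planned to be code-complete by the status date (example)
actual_units = 11    # units actually code-complete (example)

variance = actual_units - planned_units   # analysis model: negative means behind plan
print(f"Coding progress: {actual_units}/{planned_units} units complete, variance {variance}")
```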

  30. Class Measurement Construct Examples
     • Coding
       – Base Measure:
       – Derived Measure:
       – Analysis Model:
       – Indicator:

  31. Measurement Planning
     • Identify Candidate Information Needs
       – Project Objectives: cost, schedule, quality, capability
       – Risks
     • Prioritize
       – One approach: probability of occurrence x project impact = project exposure
       – e.g. Schedule, Budget, Reliability, Dependencies, Product Volatility
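The prioritization rule on this slide is easy to script once each risk has a probability and an impact score. In the sketch below the probabilities and impact values are invented placeholders, not ratings from the course.

```python
# Hedged sketch: rank candidate risks by exposure = probability x impact.
risks = {
    "Schedule":           (0.6, 8),   # (probability of occurrence, impact on a 1-10 scale)
    "Budget":             (0.3, 6),
    "Reliability":        (0.4, 9),
    "Dependencies":       (0.5, 5),
    "Product Volatility": (0.7, 4),
}

exposure = {name: p * impact for name, (p, impact) in risks.items()}
for name, value in sorted(exposure.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:20s} exposure = {value:.1f}")
```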

  32. PSM Common Information Categories
     • Schedule & Progress
     • Resources & Cost
     • Product Size & Stability
     • Product Quality
     • Process Performance
     • Technology Effectiveness
     • Customer Satisfaction

  33. PSM Common Information Categories – Measurement Concepts
     • Schedule & Progress – milestone dates/completion, EV/PV
     • Resources & Cost – staff level, effort, budget, expenditures
     • Product Size & Stability – KLOC/FP, # requirements, # interfaces
     • Product Quality – defects, defect age, MTBF, complexity
     • Process Performance – productivity, rework effort, yield
     • Technology Effectiveness – requirements coverage
     • Customer Satisfaction – customer feedback, satisfaction ratings, support requests, support time, willingness to repurchase

  34. Select & Specify Measures
     Considerations:
     • Utilize existing data collection mechanisms
     • As invisible as possible
     • Limit categories & choices
     • Use automated methods over manual
     • Beware of accuracy issues (e.g. timecards)
     • Frequency needs to be enough to support ongoing decision making (alternative: gate processes)

  35. Measurement Construct

  36. Project Measurement Plan Template (from PSM figure 3-10, p 56)
     • Introduction
     • Project Description
     • Measurement Roles, Responsibilities & Communications
     • Description of Project Information Needs
     • Measurement Specifications (i.e. constructs)
     • Project Aggregation Structures
     • Reporting Mechanisms & Periodicity

  37. Team Project Postmortem
     • Why
       – Insanity
       – Continuous improvement
       – Mechanism to learn & improve
       – Improve by changing processes or better following current processes
     • Tracking process improvements during the project: Process Improvement Proposals (PIP)
     • Post-Mortem
     • Areas to consider
       – Better personal practices
       – Improved tools
       – Process changes

  38. Cycle 2 Measurement Plan
     • Identify cycle 2 risks & information needs
     • Review & revise measures & create measurement constructs
     • Document in a measurement plan

  39. Postmortem process
     • Team discussion of project data
     • Review & critique of roles

  40. Postmortem process
     • Review Process Data
       – Review of cycle data including SUMP & SUMQ forms
       – Examine data on team & team member activities & accomplishments
       – Identify where the process worked & where it didn’t
     • Quality Review
       – Analysis of the team’s defect data
       – Actual performance vs. plan
       – Lessons learned
       – Opportunities for improvement
       – Problems to be corrected in future
       – PIP forms for all improvement suggestions
     • Role Evaluations
       – What worked? Problems? Improvement areas?
       – Improvement goals for next cycle / project?

  41. Cycle Report
     • Table of contents
     • Summary
     • Role Reports
       – Leadership – leadership perspective: motivational & commitment issues, meeting facilitation, req’d instructor support
       – Development – effectiveness of development strategy, design & implementation issues
       – Planning – team’s performance vs. plan, improvements to planning process
       – Quality / Process – process discipline, adherence, documentation, PIPs & analysis, inspections; cross-team system testing planning & execution
       – Support – facilities, CM & Change Control, change activity data & change handling, ITL
     • Engineer Reports – individual assessments

  42. Role Evaluations & Peer Forms
     • Consider & fill out PEER forms
       – Ratings (1-5) on work, team & project performance, roles & team members
     • Additional role evaluation suggestions
       – Constructive feedback
       – Discuss behaviors or the product, not the person
     • Team leaders fill out the TEAM EVALUATION form

  43. Cycle 1 Project Notebook Update
     • Updated Requirements & Design documents: Conceptual Design, SRS, SDS, System Test Plan, User Documentation*
     • Updated Process descriptions: baseline processes, continuous process improvement, CM
     • Tracking forms: ITL, LOGD, Inspection forms, LOGTEST
     • Planning & actual performance: Team Task, Schedule, SUMP, SUMQ, SUMS, SUMTASK, CCR*

  44. Due July 10 Class
     • Cycle 1 Reports / Post-Mortem
     • Cycle 1 Results Presentation
     • Cycle 2 Project Plan
     • Cycle 2 Measurement Plan

  45. Cycle 1 Audit
