
Peer Reviews

Department of Software and IT Engineering. Peer Reviews. Claude Y. Laporte, Professor, Department of Software and IT Engineering. Rev 1 (2006.04.10).


Presentation Transcript


  1. Department of Software and IT Engineering. Peer Reviews. Claude Y. Laporte, Professor, Department of Software and IT Engineering. Rev 1 (2006.04.10)

  2. ‘Regardless of what we discover, we understand and truly believe that everyone did the best job they could, given what they knew at the time, their skills and abilities, the resources available, and the situation at hand.’ Norm Kerth, Project Retrospectives: A Handbook for Team Reviews, Dorset House Publishing, 2001. http://www.retrospectives.com/

  3. Session Objectives • Objectives: • Understand the Business Rationale for Implementing Peer Reviews • Understand the Types of Reviews: • Desk Check, • Walkthrough, • Inspection, • Quality Estimation. • Prerequisites: • Basic understanding of Software Development, or involvement in Software Development Projects

  4. First to Market Software Development. The Demands: Faster! Better Quality! Cheaper! The Challenge: No Delays! More Productivity! No Cost Overruns!

  5. Understand the Business Rationale for Implementing Peer Reviews. • What Does Management Want? • Predictability: • Predictable Content and Quality • Predictable Cost • Predictable Schedule • Benefits of Inspections to Management: • Provide real-time hard data to help in decision-making • Completion of products (vs. the "90% completion" syndrome) • Measure of the quality of products • Indicators of difficulties and of potential improvements to software processes and products

  6. Chaos Reports 1994–2002, Number of Projects. Type 1: CQFC OK (Cost, Quality, Functions, Calendar). Type 2: Projects completed, but failed CQFC. Type 3: Projects terminated!

  7. Defect Injection and Detection. [Chart: defects/KLOC (0–25) injected and detected across the activities START, REQ, HLD, LLD, CODE, UT, IT, ST, SHIP; the gap between defects injected and defects detected is much smaller with inspections than without.] Source: Ron Radice, ‘Software Inspections: Past, Present, and Future’, Software Technology Conference, Salt Lake City, Utah, May 2001

  8. Cost of Defect Removal (Low Maturity CMM 1, Mature CMM 3, Very Mature CMM 5). Removal cost per defect found rises through the life cycle: $25, $105, $385, $620, $1,150, $6,500. Source: B. Boehm, 1981 and C. Jones, 1990. Notes: 1. A defect found at the requirements phase costs $25 to fix; if the same defect is found at Unit Testing, the cost will be $620. 2. UT = Unit Test; SIT/SAT = System Integration & Test / System Acceptance Test
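The escalation in the slide's cost figures can be made concrete with a short calculation. The slide's note pins down only Requirements ($25) and Unit Test ($620); the other phase labels below are assumptions added for illustration.

```python
# Removal cost per defect, from the slide (Boehm 1981 / Jones 1990).
# Only "Requirements" and "Unit Test" labels are confirmed by the slide's
# note; the remaining phase names are illustrative assumptions.
removal_cost = {
    "Requirements": 25,
    "Design": 105,
    "Code": 385,
    "Unit Test": 620,
    "SIT/SAT": 1150,
    "Field": 6500,
}

baseline = removal_cost["Requirements"]
for phase, cost in removal_cost.items():
    # Express each phase's cost as a multiple of the requirements-phase cost.
    print(f"{phase:12} ${cost:>5}  ({cost / baseline:.1f}x requirements cost)")
```

A defect that escapes all the way to the field costs 260 times what it would have cost to remove at requirements time, which is the economic argument for early peer review.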

  9. Benefit Ratio of Implementing Inspection. [Chart: rework effort by phase (Req., Design, Code, Test, Post-Release) before and after implementing formal review/inspection; values shown include 19%, 12%, 8%, 4%, 3%, and 1%, for a 31% overall reduction in rework.] Implementing formal review/inspection increased design effort by 4% and decreased rework effort by 31%. Cost : benefit ratio is 4% : 31%, or 1 : 7.75. Source: Vu, J., ‘Software Process Improvement Journey’, 8th Software Engineering Process Group Conference, San Jose, California, March 1997.
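The slide's 1 : 7.75 ratio follows directly from the two effort figures it quotes; a two-line check, using only numbers from the slide:

```python
# Figures from the slide: formal inspection added 4% (of project effort)
# up front in design, and removed 31% (of project effort) in rework.
extra_design_effort = 4.0   # percent of total project effort invested
rework_reduction = 31.0     # percent of total project effort saved

ratio = rework_reduction / extra_design_effort
print(f"Cost : benefit = 1 : {ratio:.2f}")   # 1 : 7.75
```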

  10. Percentage of Rework on Projects: TRW 30% (Boehm, 1987); NASA-SEL 40% (McGarry, 1987); Hewlett-Packard 33% (Duncker, 1992); Raytheon 41% (Dion, 1993). Note: Rework = Waste or Scrap

  11. CMMI – Staged Representation. Focus and Process Areas by Level (quality and productivity rise with maturity; risk and rework fall):
  • Level 5 Optimizing. Focus: Continuous Process Improvement. Process Areas: Organizational Innovation and Deployment; Causal Analysis and Resolution.
  • Level 4 Quantitatively Managed. Focus: Quantitative Management. Process Areas: Organizational Process Performance; Quantitative Project Management.
  • Level 3 Defined. Focus: Process Standardization. Process Areas: Requirements Development; Technical Solution; Product Integration; Verification; Validation; Organizational Process Focus; Organizational Process Definition; Organizational Training; Integrated Project Management for IPPD; Risk Management; Integrated Teaming; Integrated Supplier Management; Decision Analysis and Resolution; Organizational Environment for Integration.
  • Level 2 Managed. Focus: Basic Project Management. Process Areas: Requirements Management; Project Planning; Project Monitoring and Control; Supplier Agreement Management; Measurement and Analysis; Process and Product Quality Assurance; Configuration Management.
  • Level 1 Initial.
  Source: Software Engineering Institute, Carnegie Mellon University

  12. CMMI – Staged Representation (same diagram as slide 11), with a callout at Level 2: Reviews provide data for assessing quality and progress. Source: Software Engineering Institute, Carnegie Mellon University

  13. CMMI – Staged Representation (same diagram as slide 11), with a callout at the Level 3 Verification process area: As shown in previous slides, reviews are effective for identifying defects and issues. Source: Software Engineering Institute, Carnegie Mellon University

  14. Objectives • Understand the Business Rationale for Implementing Peer Reviews. • Understand the Types of Reviews: • Desk Check, • Walkthrough, • Inspection, • Quality Estimation. Formality Spectrum (least to most formal): Ad hoc, Desk Check, Walkthrough, Inspection

  15. Review Differences (IEEE 1028, Standard for Software Reviews):
  • Objective: Management Review: ensure progress; Technical Review: evaluate; Inspection: find anomalies; Walkthrough: find anomalies, evaluate conformance; Audit: evaluate compliance.
  • Number of members: Management Review: unlimited; Inspection: 3–6; Walkthrough: 2–7; Audit: 1–5.
  • Material size: relatively low for inspection and walkthrough; moderate to high for the others.
  • Leadership: Management Review: manager; Technical Review: lead engineer; Inspection: trained facilitator; Walkthrough: facilitator, often the author; Audit: lead auditor.
  • Management present: Management Review: yes; others: usually not.
  • Output: management report; technical report; defect list (inspection and walkthrough); audit report.

  16. Types of Reviews (IEEE 1028, Standard for Software Reviews) • Walkthrough • A static analysis technique to evaluate a software product. • Major objectives: • Find anomalies • Improve the software product • Consider alternative implementations • Evaluate conformance to standards and specifications • Other objectives: • Exchange of techniques and style variations • Training of the participants. • Defined roles: • Leader, recorder, author, team member • Individuals in a management position over participants shall not participate

  17. Types of Reviews (IEEE 1028, Standard for Software Reviews) • Inspection • A visual examination of a software product to detect and identify anomalies, including errors and deviations from standards and specifications. • Examined by peers • Led by an impartial, trained facilitator • Determination of remedial or investigative action for an anomaly is mandatory • Solutions are not determined during the inspection meeting • The author shall not act as inspection leader • The author should not act as reader or recorder • Individuals holding management positions over any member of the inspection team shall not participate

  18. Objectives • Understand the Business Rationale for Implementing Peer Reviews. • Understand the types of Reviews • Desk Check* • Walkthrough, • Inspection, • Quality Estimation. * Source: Wallace, D., Ippolito, L., Cuthill, B., Reference Information for the Software Verification and Validation Process, National Institute of Standards and Technology, U.S.A, Special Publication 500-234, 1996.

  19. Desk Check Review • Goal: To pass around a document and request peers to make comments and identify issues. • Author • Completes and distributes to the selected reviewer(s) the document to be reviewed, along with: • Desk Check Review Form(s) • Checklist(s) (optional) • Reviewer(s) • Verify the document using the checklist(s) selected by the author • Fill in the Review Form (i.e. comments and effort taken to review the document) • Forward the Review Form to the author. • Author • Reviews the comment(s) documented on the Review Form(s) • If the author agrees with all the comment(s), he incorporates them in the document. • If the author does not agree with some comments, or if he thinks that comments have a major impact on the document: • Calls a review meeting. • Conducts the review meeting to finalize the disposition of comments as follows: • Incorporate as is • Not incorporate • Incorporate with modifications • Author incorporates comments in the document. • Author logs on the Desk Check Review Form the effort required to review and correct the document.
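The author's decision logic above (incorporate directly, or call a review meeting) can be sketched in a few lines. The three disposition codes come from the Desk Check Review Form; the comment fields (`disputed`, `major_impact`) are an illustrative schema of my own, not part of the form.

```python
from enum import Enum

# Disposition codes as used on the Desk Check Review Form.
class Disposition(Enum):
    INC = "Incorporate as is"
    NOT = "Not incorporate"
    MOD = "Incorporate with modifications"

def needs_review_meeting(comments):
    """The author calls a review meeting only if some comment is disputed
    or is judged to have a major impact on the document."""
    return any(c["disputed"] or c["major_impact"] for c in comments)

comments = [
    {"text": "Typo in section 2",       "disputed": False, "major_impact": False},
    {"text": "Interface change needed", "disputed": True,  "major_impact": True},
]
print(needs_review_meeting(comments))   # True: the second comment forces a meeting
```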

  20. Desk Check Review Form. Effort to review document: _________ Effort to correct document: _________ * Disposition codes: INC: Incorporate as is; NOT: Not incorporate; MOD: Incorporate with modifications

  21. Verification (spanning design, development, and test). Purpose: To ensure that selected work products meet their specified requirements. Specific practices: Prepare for Verification; Perform Peer Reviews; Verify Selected Work Products. Source: Software Engineering Institute, Carnegie Mellon University

  22. Verification – Introduction • A proven mechanism for effective defect removal. • Develops a better understanding of the work products and of the processes that produced them: • Defects can be prevented • Process-improvement opportunities can be identified. • Methodical examination of work products by the producers' peers to identify defects and other changes that are needed. • Examples of peer review methods include the following: • Structured walkthrough • Inspection (e.g. Gilb, Fagan)

  23. Objectives • Understand the Business Rationale for Implementing Peer Reviews. • Understand the types of Reviews • Desk Check, • Walkthrough, • Inspection, • Quality Estimation.

  24. Walkthrough • Find defects, omissions, and contradictions; • Improve the product; • Consider alternative implementations. • Exchange of techniques, style variations, and education of the participants. • Point out • Efficiency and readability issues in the code, • Modularity issues in design specifications • Requirements testability issues. • Defect data is systematically collected and stored.

  25. Walkthrough – Overview. Inputs: product, checklists, request, source documents. Process steps: 100 Plan the Walkthrough; 110 Conduct Kickoff Meeting; 120 Conduct Document Checking; 130 Conduct Logging Meeting; 140 Edit Document; 150 Complete Follow-up, Exit & Release. Outputs: edited document, change requests, process improvements.

  26. Walkthrough • Responsibilities. • Author: • Selects the peers to participate in the review, • Presents the product during the walkthrough meeting. • Walkthrough Team: • Reviews any input material prior to the walkthrough meeting • Participates during the walkthrough meeting to ensure that it meets its objective. • Recorder: • Records all comments made during the walkthrough meeting: • Errors found, questions of style, omissions, contradictions, suggestions for improvement, or alternative approaches.

  27. Walkthrough Activities • WT 100. Plan the Walkthrough Meeting • The author completes the Walkthrough Form: • Request section and Planning section • Checking section: names of reviewers • WT 110. Conduct a Kickoff Meeting (optional) • The author: • Determines if an overview of the product needs to be given to educate the reviewers about the product. • If an overview is held: fills in the Kick-off section • Distributes to the reviewers the product and all material • WT 120. Conduct Document Checking • Reviewers: • Prepare for the walkthrough meeting • Examine the product • Mark redlines on the product prior to the walkthrough meeting • Prepare to discuss their comments, recommendations, and questions.

  28. Walkthrough Activities • WT 130. Conduct the Logging Meeting • The author walks through the product • The author and the reviewers resolve defects discovered in the walkthrough • The author documents on the Walkthrough Form that a walkthrough was conducted on the product: • Checking and Logging sections • WT 140. Edit (Rework) the Product • The author reworks the product as recorded during the walkthrough • Completes the Editing section (i.e. effort and number of errors) • WT 150. Complete Follow-up and Exit • Author: • Ensures issues and action items are tracked to closure • Completes the Walkthrough Form: • Editing and Closing sections (e.g. effort to make corrections)

  29. Walkthrough - Exit • Exit Criteria • Product reworked • Walkthrough Form Completed • Output. • Product. • Walkthrough Form

  30. Objectives • Understand the Business Rationale for Implementing Peer Reviews. • Understand the types of Reviews • Desk Check, • Walkthrough, • Inspection, • Quality Estimation.

  31. Document Production Process • Using source documents, such as standards, specifications, internal processes, procedures, and templates, aided by checklists, to generate a product document (deliverable).

  32. Document Production Process to Produce Specifications. Example: system specifications, together with rules & standards (ISO 9000, IEEE standards, EN 50128), are used to produce the software specifications (the product).

  33. Inspection Process. Inputs: request for inspection; product (e.g. SW specs); source documents (contract, system specs); rules & standards (ISO 9000, IEEE standards, EN 50128). Outputs: list of issues, edited product, change requests. • Peers check the product, using checklists, to verify that the author correctly used the source documents to generate the product.

  34. Exercise • Can you identify any ‘defects’ in this requirement statement? • “The objective is to get higher maintainability using product XYZ”

  35. Exercise. “The objective is to get higher maintainability using product XYZ” • GD 1. They should be unambiguously clear to the intended reader. • GD 2. They shall break down complex concepts into a set of measurable concepts. • GD 3. To define 'relative' terms like 'higher' they shall specify at least two points of reference on the defined SCALE. • GD 4. They shall not mix design ideas in the specification of objectives. Source: Gilb, T., Graham, D., ‘Software Inspection’, Addison Wesley, 1993.

  36. Exercise. “The objective is to get higher maintainability using product XYZ” (the statement annotated with the guideline tags GD 1–GD 4 it violates) • GD 1. They should be unambiguously clear to the intended reader. • GD 2. They shall break down complex concepts into a set of measurable concepts. • GD 3. To define 'relative' terms like 'higher' they shall specify at least two points of reference on the defined SCALE. • GD 4. They shall not mix design ideas in the specification of objectives. Source: Gilb, T., Graham, D., ‘Software Inspection’, Addison Wesley, 1993.

  37. Inspections Versus Tests • Tests are dynamic verification of correct performance • They require equipment, a test plan, procedures, and test software • Inspections do not replace tests • Error detection with inspection is about 10X less expensive • Data from IBM: cost at inspection: 1 unit; cost in test: 9X; cost at customer discovery: 117X • Data from Goddard Space Center (NASA): • Using testing only: 5–17 hours per defect • Using inspection: 1.6 hours per defect, i.e. 1.1 hours to detect + 0.5 hours to fix each defect. ‘Inspections find defects, while testing – which usually occurs one or more development phases after the opportunity to inspect has passed – finds only the symptoms.’ Priscilla Fowler, Software Engineering Institute.
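The NASA Goddard figures on the slide imply the "about 10X" claim at the high end of the testing range; a quick check using only the slide's numbers:

```python
# Hours per defect, from the slide's NASA Goddard data.
testing_low, testing_high = 5.0, 17.0   # testing only: 5-17 hours per defect
inspection = 1.1 + 0.5                  # 1.1 h to detect + 0.5 h to fix

print(f"Inspection: {inspection:.1f} h per defect")
print(f"Testing costs {testing_low / inspection:.1f}x to "
      f"{testing_high / inspection:.1f}x more per defect")
```

At 1.6 hours per defect, testing's 17-hour worst case is roughly 10.6 times more expensive, which matches the slide's "about 10X" figure.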

  38. Inspection – Textbook • Also called Formal Inspection or Fagan Inspection. • Gilb, T., Graham, D., ‘Software Inspection’, Addison Wesley, 1993.

  39. ETVX Process Notation. Each process step (step number and step title) is described by its Entry Criteria, Inputs, Activities, Measures, Outputs, and Exit Criteria. ETVX = Entry, Task, Verification, eXit
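A minimal sketch of the ETVX notation as a data structure. The field names are my mapping of the notation's boxes onto a record type, not a standard API, and the sample step values are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessStep:
    """One ETVX-described process step: Entry, Task, Verification, eXit."""
    number: int
    title: str
    entry_criteria: list = field(default_factory=list)  # E: conditions to start
    activities: list = field(default_factory=list)      # T: tasks performed
    measures: list = field(default_factory=list)        # V: verification/measures
    exit_criteria: list = field(default_factory=list)   # X: conditions to finish

# Hypothetical instance modeled on the inspection process's first step.
step = ProcessStep(
    number=100,
    title="Plan Inspection",
    entry_criteria=["Request for Inspection received"],
    activities=["Select inspectors", "Schedule meetings"],
    measures=["Planning effort (hours)"],
    exit_criteria=["Planning section of the Inspection Form completed"],
)
print(step.number, step.title)   # 100 Plan Inspection
```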

  40. Inspection Process – Overview. Inputs: inspection request, product, checklists, rules, source documents. Process steps: 100 Plan Inspection; 110 Conduct Kickoff Meeting; 120 Conduct Document Checking; 130 Conduct Logging Meeting; 140 Edit Document; 150 Complete Follow-up, Exit & Release. Outputs: edited document, change requests, process improvements. Source: Holland, D., ‘Document Inspection as an Agent of Change’, in "Dare to be Excellent", eds. Alka Jarvis and Linda Hayes, Upper Saddle River, NJ: Prentice Hall PTR [ISBN 0-13-081156-4], 1998.

  41. Inspection – Timeframe. The full cycle (steps 100–150) should take at most 2 calendar weeks, with the core checking and logging steps taking about 3–4 calendar days. (Process diagram as on slide 40.) Source: Holland, D., ‘Document Inspection as an Agent of Change’, in "Dare to be Excellent", eds. Alka Jarvis and Linda Hayes, Upper Saddle River, NJ: Prentice Hall PTR [ISBN 0-13-081156-4], 1998.

  42. Inspection is NOT! • A review of the style of a work product • A review of the author • i.e., not a way to collect performance measures about the author • Design optimization • A meeting to fix defects or discuss possible solutions • Approval of the quality of a design • A discussion club • A finger-pointing exercise • ‘Subjective’ error identification. Source: Gilb, T., Graham, D., ‘Software Inspection’, Addison Wesley, 1993.

  43. Candidates for Inspections: Any Work Product • Procedures • Requirements • High-level design specifications • Module-level design specifications • User manuals • Plans • Individual test cases • Change Requests, Problem Reports, Fixes • Electrical designs, Proposals, Contracts, Drafts, etc. Source: Gilb, T., Graham, D., ‘Software Inspection’, Addison Wesley, 1993.

  44. Request for Inspection. An inspection begins with a request for inspection of a product; resulting change requests flow back to the source documents. (Process diagram as on slide 40.) Source: Holland, D., ‘Document Inspection as an Agent of Change’, in "Dare to be Excellent", eds. Alka Jarvis and Linda Hayes, Upper Saddle River, NJ: Prentice Hall PTR [ISBN 0-13-081156-4], 1998.

  45. Inspection Form – Section Filled by Author • Document size conventions: • 300 words = 1 page • 100 LOC = 1 page
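The form's size conventions normalize prose and code to a common "logical page" unit. A small helper sketch using the two conversion factors from the slide (the function name and signature are my own):

```python
# Page-size conventions from the Inspection Form.
WORDS_PER_PAGE = 300   # 300 words = 1 logical page
LOC_PER_PAGE = 100     # 100 lines of code = 1 logical page

def logical_pages(words=0, loc=0):
    """Normalize a document's size to logical pages for inspection planning."""
    return words / WORDS_PER_PAGE + loc / LOC_PER_PAGE

print(logical_pages(words=1500))   # 5.0: a 1500-word document is 5 pages
print(logical_pages(loc=250))      # 2.5: 250 LOC is 2.5 pages
```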

  46. Exercise • Conduct an inspection of the product document titled ‘Conduct Brainstorm’. • Product: Conduct Brainstorm • Rules & Standards: ETVX Process Notation • Checklist: Generic Document • Source: Brainstorming Meeting Description • Fill in the Request section of the Inspection Form

  47. Step 100 – Plan Inspection. (Process diagram as on slide 40.) Source: Holland, D., ‘Document Inspection as an Agent of Change’, in "Dare to be Excellent", eds. Alka Jarvis and Linda Hayes, Upper Saddle River, NJ: Prentice Hall PTR [ISBN 0-13-081156-4], 1998.

  48. Optimum Checking Rate • The most effective individual speed for ‘checking a document against all related documents’. • Not ‘reading’ speed, but correlation-studying speed. • Optimum range = 1 logical page per hour • Logical page = 300 words or 100 LOC • Failure to use it: not effective in identifying and removing major errors. • Major error: one that would probably have significantly (e.g. 10X) increased costs to find and fix later in the development/maintenance process
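A small planning sketch built on the slide's optimum rate of 1 logical page per hour. The tolerance factor for flagging "too fast" checking is my assumption for illustration; the slide only says that checking well above the optimum rate stops finding major errors.

```python
# The slide's optimum individual checking rate.
OPTIMUM_RATE = 1.0   # logical pages per hour

def rate_check(pages, hours):
    """Return the actual checking rate and whether it is close enough to the
    optimum to be credible. The 2x tolerance is an illustrative assumption."""
    rate = pages / hours
    return rate, rate <= 2 * OPTIMUM_RATE

rate, credible = rate_check(pages=12, hours=3)
print(rate, credible)   # 4.0 False: 4 pages/hour is far above the optimum
```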

  49. Relationship Between Code Inspection Rate and Defect Density (Effectiveness). [Chart: defects found per 1000 LOC (0–100) versus LOC inspected per hour (0–1000); detection density falls sharply as the inspection rate increases.] LOC = Lines of Code. Source: Wiegers, Karl, ‘Software Technical Reviews: A Practical Guide’, p. 76, 2001.

  50. Rate Guidelines – After Many Months of Utilisation (Source: Radice, ‘High Quality Low Cost Software Inspections’, 2002):
  Inspection Type      Checking Rate                        Logging Rate
  Architecture         2–3 pages per hour (PPH)             2–3 PPH
  Requirements         2–3 PPH                              2–3 PPH
  Preliminary Design   3–4 PPH                              3–4 PPH
  Detailed Design      3–4 PPH                              3–4 PPH
  Source Code          100–200 lines of code per hour (LPH) 100–200 LPH
  Test Plan            5–7 PPH                              5–7 PPH
  Fixes and Changes    50–75 LPH                            50–75 LPH
  User Documentation   8–20 PPH                             8–20 PPH
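The guideline table lends itself to a simple effort estimator: dividing a work product's size by the high and low ends of its rate range bounds the checking effort. A sketch using Radice's numbers as given (the lookup structure and function are my own):

```python
# Radice's rate guidelines as (low, high) per hour.
# Units are pages/hour, except Source Code and Fixes, which are LOC/hour.
rates = {
    "Architecture":       (2, 3),
    "Requirements":       (2, 3),
    "Preliminary Design": (3, 4),
    "Detailed Design":    (3, 4),
    "Source Code":        (100, 200),   # LOC/hour
    "Test Plan":          (5, 7),
    "Fixes and Changes":  (50, 75),     # LOC/hour
    "User Documentation": (8, 20),
}

def checking_effort(kind, size):
    """(fastest, slowest) hours to check `size` pages (or LOC) of a product."""
    low, high = rates[kind]
    return size / high, size / low

print(checking_effort("Requirements", 12))    # (4.0, 6.0) hours for 12 pages
print(checking_effort("Source Code", 400))    # (2.0, 4.0) hours for 400 LOC
```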
