
SENG 521 Software Reliability & Testing



  1. SRE Deployment (Part 11) SENG 521 Software Reliability & Testing

  2. Contents • Software Quality System (SQS); Software Quality Assurance (SQA) and Software Reliability Engineering (SRE) • Quality, test and data plans • Roles and responsibilities • Effective coordination • Sample quality and test plan • Defect reporting procedure

  3. Software Quality System (SQS) and Software Quality Assurance (SQA) programs Part 11 Section 1

  4. What is Reliable Software? • Reliable software products are those that run correctly and consistently, have few remaining defects, handle abnormal situations properly, and need little installation effort • The remaining defects should not affect the normal behaviour or use of the software: they should do nothing destructive to the system or its hardware/software environment, and should rarely be evident to users • Developing reliable software requires: • Establishing a Software Quality System (SQS) and Software Quality Assurance (SQA) program • Establishing a Software Reliability Engineering (SRE) process

  5. Software Quality System (SQS) Goals: • Building quality into the software from the beginning • Keeping and tracking quality in the software throughout the software life cycle John W. Horch: Practical Guide to Software Quality Management

  6. SQS Concerns • Software quality management is the discipline that maximizes the probability that a software system will conform to its requirements, as those requirements are perceived by the user on an ongoing basis. John W. Horch: Practical Guide to Software Quality Management

  7. Software Quality Assurance (SQA) • Software Quality Assurance (SQA) is a planned and systematic approach to ensuring that both the software process and the software product conform to established standards, processes, and procedures. • The goal of SQA is to improve software quality by monitoring both the software and the development process to ensure full compliance with the established standards and procedures. • Steps to establish an SQA program: • Get top management’s agreement on its goals and support. • Identify SQA issues, write the SQA plan, establish standards and SQA functions, implement the SQA plan, and evaluate the SQA program.

  8. SRE: Process • The SRE process comprises five activities: Define Necessary Reliability, Develop Operational Profile, Prepare for Test, Execute Test, and Apply Failure Data, spanning the Requirements & Architecture, Design & Implementation, and Test phases [process diagram]

  9. SRE: Who is involved? • Senior management • Test coordinator (manager) • Data coordinator (manager) • Customer

  10. SRE: Management Concerns • Perception and specification of a customer’s real needs. • Translation of the specification into a conforming design. • Maintaining that conformity throughout the development processes. • Product and sub-product demonstrations that provide convincing indications that the product and project have met their requirements. • Ensuring that the tests and demonstrations are designed and controlled so as to be both achievable and manageable.

  11. Roles & Responsibilities /1 • Test Coordinator (Manager): • The test coordinator is expected to ensure that every specific statement of intent in the product requirement, specification and design is matched by a well-designed (cost-effective, convincing, self-reporting, etc.) test, measurement or demonstration. • Data Coordinator (Manager): • The data coordinator ensures that the physical and administrative structures for data collection exist and are documented in the quality plan, receives and validates the data during development, and, through analysis and communication, ensures that the meaning of the information is known to all in time for effective application.

  12. Roles & Responsibilities /2 • Customer or User: • Actively encouraging the making and following of detailed quality plans for the products and projects. • Requiring access to previous quality plans and their recorded outcomes before accepting the figures and methods quoted in the new plan. • Enquiring into the sources and validity of synthetics and formulae used in estimating and planning. • Appointing appropriate personnel to provide authoritative responses to queries from the developer and a managed interface to the developer. • Receiving and reviewing reports of significant audits, reviews, tests and demonstrations. • Making any queries and objections in detail and in writing, at the earliest possible time.

  13. SRE: Plans • Across the project timeline, the quality plan, test plans and data plans accompany the SRE activities (Define Necessary Reliability, Develop Operational Profile, Prepare for Test, Execute Test, Apply Failure Data) through the Requirements & Architecture, Design & Implementation and Test phases [diagram] • There may be many test and data (measurement) plans for various parts of the same project

  14. Quality Plans /1 • The most promising mechanisms for gaining and improving predictability and controllability of software qualities are the quality plan and its subsidiary documents, including test plans and data (measurement) plans [diagram: quality plan with subsidiary test plan and data plan]. • The creation of the quality plan can be instrumental in raising project effectiveness and in preventing expensive and time-consuming misunderstandings during the project and at release/acceptance time.

  15. Quality Plan /2 • The quality plan and quality record provide guidelines for carrying out and controlling the following SRE-related activities: • Requirement and specification management. • Development processes. • Documentation management. • Design evaluation. • Product testing. • Data collection and interpretation. • Acceptance and release processes.

  16. Quality Plan /3 • Quality planning should begin at the very earliest point in a project, preferably before a final decision is made on feasibility and before a software development contract is signed. • The quality plan should be devised and agreed between all the concerned parties: senior management, software development management (both administrative and technical), the software development team, customers, and any involved general support functions such as resource management and company-wide quality management.

  17. Data (Measurement) Plan • The data (measurement) plan prescribes: • What should be measured and recorded during a project; • How it should be checked and collated; • How it should be interpreted and applied. • Data may be collected in several ways, within the specific project and beyond it. • Ideally, there should be a higher level of data collection and application into which project data is fed.
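
To make “what should be measured and recorded” concrete, here is a minimal Python sketch of a failure-log record together with one simple interpretation step (observed failure intensity over a recent window of execution time). The field names and the windowing rule are illustrative assumptions, not something the data plan itself prescribes.

```python
from dataclasses import dataclass

@dataclass
class FailureRecord:
    """One entry in a project failure log (illustrative fields)."""
    failure_id: int
    exec_hours: float   # cumulative execution time at the failure
    severity: int       # severity class, e.g. 1 (critical) .. 4 (minor)
    description: str

def failure_intensity(records, window_hours):
    """Observed failures per execution hour over the latest window."""
    if not records:
        return 0.0
    latest = max(r.exec_hours for r in records)
    recent = [r for r in records if r.exec_hours > latest - window_hours]
    return len(recent) / window_hours

log = [
    FailureRecord(1, 12.5, 2, "crash on malformed input"),
    FailureRecord(2, 30.0, 3, "wrong total in report"),
    FailureRecord(3, 41.2, 1, "data corruption on restart"),
]
print(failure_intensity(log, window_hours=20.0))  # 0.1 failures/exec-hour
```

Feeding records like these into a cross-project collection is what enables the “higher level of data collection and application” mentioned above.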

  18. Test Plan /1 • The purpose of the test plan is to ensure that all testing activities (including those used for controlling the development process and for indicating the progress of the project) are expected, are manageable and are managed. • Test plans are created as a subsection of, or as an associated document to, the quality plan. • Test plans become progressively more detailed and expanded during a project. • Each test plan defines its own objectives and scope, and the means and methods by which the objectives are expected to be met.

  19. Test Plan /2 • For the software product, the test plan is usually restricted to the scope of the test: certification, feature and load test. • The plan predicts the resources and means required to reach the required levels of assurance about the end products, and the scheduling of all testing, measuring and demonstration activities. • Tests, measurements and demonstrations are used to establish that the software product satisfies the requirements document, and that each process during development is carried out correctly and results in acceptable outcomes.
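
The certification test mentioned above is commonly run against a reliability demonstration chart, whose accept/continue/reject regions follow from a sequential probability ratio test. The sketch below assumes an exponential failure model; the discrimination ratio gamma and the supplier/consumer risks alpha and beta are parameters a real test plan would have to fix.

```python
import math

def certification_decision(n_failures, t_hours, lam_objective,
                           gamma=2.0, alpha=0.1, beta=0.1):
    """Sequential (SPRT) decision for a reliability demonstration test,
    comparing the failure intensity objective lam_objective against an
    unacceptable intensity gamma * lam_objective."""
    tau = lam_objective * t_hours            # normalized test time
    llr = n_failures * math.log(gamma) - (gamma - 1.0) * tau
    if llr <= math.log(beta / (1.0 - alpha)):
        return "accept"                      # objective demonstrated
    if llr >= math.log((1.0 - beta) / alpha):
        return "reject"                      # objective not met
    return "continue"                        # keep testing

# e.g. 2 failures in 40 h against a 0.05 failures/hour objective
print(certification_decision(2, 40.0, 0.05))  # continue
```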

  20. Effective Coordination • Coordination among the quality plan, test plans and data plans is necessary. • Effective coordination can only be introduced and practiced if the environment and supporting structures exist. • To make the coordination work, all those involved must be prepared to question and evaluate every aspect of what they are doing, and must be ready both to give and to accept suggestions and information outside their normal field of interest and authority.

  21. Coordination of Quality Plans • The coordination of quality plans includes: • Selective reuse of methods and procedures (to reduce reinvention). • Harmonization of goals and measurements. • Provision of support tools and services. • Extraction from project and product records of indications of what works and what should be avoided.

  22. Coordination of Data Plans /1 • Coordinating (or sharing) data plans between projects • A collection of data that covers more than one project and several different development routes makes it possible to: • Compare the means of production (and thus support rational choices between them); • Select standard expectations for performance that can be used in project planning and project control.

  23. Coordination of Data Plans /2 • Coordinating (or sharing) data between organizations • Providing a wider base for evaluation. • Leading to a more general view of what constitutes “good practice”. • Leading to a more general view of connections between working methods and their results.

  24. Coordination of Data Plans /3 • Coordination of data plans improves the quantity and quality of data available for: • Estimation and re-estimation of projects, in both administrative and technical terms; • Management of the project, its products, processes and resources; • Selective re-use of methods and procedures, to reduce reinvention and to benefit from experience; • Harmonization of goals and measurements across projects; • Rationalization of the provision of support tools and services.

  25. Coordination of Test Plans /1 • Uses in the management and planning of resources and environments. • Role of test plans in ensuring the applicability and testability of the design and the code. • Test plans used as a guide for those managing testing. • Test plans used as an input to quality assurance and quality control processes. • Use of test results to decide on an appropriate course of action following a testing activity.

  26. Coordination of Test Plans /2 • Test plans and test results used as an input to project management. • Reuse of the format of the test plan from one project to another. • Use of test results to identify unusual modules. • Use of test results to assess the effectiveness of testing procedures.

  27. Elements of Quality & Test Plan Part 11 Section 2

  28. Sample SQS Plan /1 • 1 Purpose • 2 Reference Documents • 3 Management • 3.1 Organization • 3.2 Tasks • 3.3 Responsibilities Based on IEEE Standard 730.1-1989

  29. Sample SQS Plan (cont’d) /2 • 4 Documentation • 4.1 Purpose • 4.2 Minimum Documentation • 4.2.1 Software Requirements Specification • 4.2.2 Software Design Description • 4.2.3 Software Verification and Validation Plan • 4.2.4 Software Verification and Validation Report • 4.2.5 User Documentation • 4.2.6 Configuration Management Plan • 4.3 Other Documentation Based on IEEE Standard 730.1-1989

  30. Sample SQS Plan (cont’d) /3 • 5 Standards, Practices, Conventions, and Metrics • 5.1 Purpose • 5.2 Documentation, Logic, Coding, and Commentary Standards and Conventions • 5.3 Testing Standards, Conventions, and Practices • 5.4 Metrics Based on IEEE Standard 730.1-1989

  31. Sample SQS Plan (cont’d) /4 • 6 Review and Audits • 6.1 Purpose • 6.2 Minimum Requirements • 6.2.1 Software Requirements Review • 6.2.2 Preliminary Design Review • 6.2.3 Critical Design Review • 6.2.4 Software Verification and Validation Review • 6.2.5 Functional Audit • 6.2.6 Physical Audit • 6.2.7 In-process Reviews • 6.2.8 Managerial Reviews • 6.2.9 Configuration Management Plan Review • 6.2.10 Postmortem Review • 6.3 Other Reviews and Audits Based on IEEE Standard 730.1-1989

  32. Sample SQS Plan (cont’d) /5 • 7 Test • 8 Problem Reporting and Corrective Action • 8.1 Practices and Procedures • 8.2 Organizational Responsibilities • 9 Tools, Techniques, and Methodologies • 10 Code Control • 11 Media Control • 12 Supplier Control • 13 Records Collection, Maintenance, and Retention • 14 Training • 15 Risk Management Based on IEEE Standard 730.1-1989

  33. Sample Test Plan /1 • 1 Test Plan identifier • 2 Introduction • 2.1 Objectives • 2.2 Background • 2.3 Scope • 2.4 References Based on IEEE Standard 829-1983

  34. Sample Test Plan (cont’d) /2 • 3 Test Items • 3.1 Program Modules • 3.2 Job Control Procedures • 3.3 User Procedures • 3.4 Operator Procedures • 4 Features To Be Tested • 5 Features Not To Be Tested Based on IEEE Standard 829-1983

  35. Sample Test Plan (cont’d) /3 • 6 Approach • 6.1 Conversion Testing • 6.2 Job Stream Testing • 6.3 Interface Testing • 6.4 Security Testing • 6.5 Recovery Testing • 6.6 Performance Testing • 6.7 Regression • 6.8 Comprehensiveness • 6.9 Constraints Based on IEEE Standard 829-1983

  36. Sample Test Plan (cont’d) /4 • 7 Item Pass/Fail Criteria • 8 Suspension Criteria and Resumption Requirements • 8.1 Suspension Criteria • 8.2 Resumption Requirements • 9 Test Deliverables • 10 Testing Tasks Based on IEEE Standard 829-1983

  37. Sample Test Plan (cont’d) /5 • 11 Environmental Needs • 11.1 Hardware • 11.2 Software • 11.3 Security • 11.4 Tools • 11.5 Publications • 12 Responsibilities • 12.1 Test Group • 12.2 User Department • 12.3 Development Project Group Based on IEEE Standard 829-1983

  38. Sample Test Plan (cont’d) /6 • 13 Staffing and Training Needs • 13.1 Staffing • 13.2 Training • 14 Schedule • 15 Risks and Contingencies • 16 Approvals Based on IEEE Standard 829-1983

  39. Reporting Procedure • Defect reporting, tracking, and closure procedure [flow diagram] • SCN: software change notice • STR: software trouble report John W. Horch: Practical Guide to Software Quality Management
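
The closure procedure can be pictured as a small state machine over STR states, with an SCN issued when a fix is made. The states and transitions below are an illustrative sketch only; the slide names just the two artifacts, and an actual procedure would be spelled out in the quality plan.

```python
# Illustrative STR (software trouble report) lifecycle.
ALLOWED = {
    "open":     {"analyzed"},
    "analyzed": {"fixed", "rejected"},
    "fixed":    {"verified"},   # the fix is documented via an SCN
    "verified": {"closed"},
    "rejected": {"closed"},
    "closed":   set(),
}

def advance(state, new_state):
    """Move an STR to a new state, enforcing the closure procedure."""
    if new_state not in ALLOWED[state]:
        raise ValueError(f"illegal transition: {state} -> {new_state}")
    return new_state

s = "open"
for step in ("analyzed", "fixed", "verified", "closed"):
    s = advance(s, step)
print(s)  # closed
```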

  40. Best Practice SRE Part 11 Section 3

  41. Practice of SRE /1 • The practice of SRE provides the software engineer or manager with the means to predict, estimate, and measure the rate of failure occurrence in software. • Using SRE in the context of software engineering, one can: • Analyze, manage, and improve the reliability of software products. • Balance customer needs for competitive price, timely delivery, and a reliable product. • Determine when the software is good enough to release to customers, minimizing the risks of releasing software with serious problems. • Avoid excessive time to market due to overtesting.
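
One way to make “good enough to release” quantitative: under a constant failure intensity lambda, reliability over a period t is R(t) = exp(-lambda * t), so a reliability target implies a failure intensity objective. A minimal sketch with illustrative numbers:

```python
import math

def failure_intensity_objective(reliability, period_hours):
    """Failure intensity implied by a reliability target, assuming a
    constant failure intensity: R(t) = exp(-lambda * t)."""
    return -math.log(reliability) / period_hours

# "99% chance of no failure over a 10-hour session"
lam = failure_intensity_objective(0.99, 10.0)
print(f"{lam:.5f} failures/hour")  # 0.00101 failures/hour
```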

  42. Practice of SRE /2 The practice of SRE may be summarized in six steps: • Quantify product usage by specifying how frequently customers will use various features and how frequently various environmental conditions that influence processing will occur. • Define quality quantitatively with the customers by defining failures and failure severities and by specifying the balance among the key quality objectives of reliability, delivery date, and cost. • Employ product usage data and quality objectives to guide design and implementation of the product and to manage resources to maximize productivity (i.e., customer satisfaction per unit cost). • Measure reliability of reused software and acquired software components as an acceptance requirement. • Track reliability and use this information to guide product release. • Monitor reliability in field operation and use results to guide new feature introduction, as well as product and process improvement.
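
Step 1 is typically captured in an operational profile: a set of operations with occurrence probabilities, used to allocate test effort the way customers actually use the product. A minimal sketch; the operation names and probabilities are invented for illustration:

```python
import random

# Illustrative operational profile (probabilities must sum to 1).
operational_profile = {
    "process_order":   0.60,
    "query_status":    0.25,
    "update_account":  0.10,
    "recover_session": 0.05,
}

def next_test_operation(profile, rng=random):
    """Select the next operation to test, weighted by usage frequency."""
    ops, probs = zip(*profile.items())
    return rng.choices(ops, weights=probs, k=1)[0]

print(next_test_operation(operational_profile))
```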

  43. Incremental Implementation • Most projects implement the SRE activities incrementally. • A typical implementation sequence [chart]

  44. Implementing SRE /1 • Feasibility and requirements phase: • Define and classify failures, i.e., failure severity classes. • Identify customer reliability needs. • Determine operational profile. • Conduct trade-off studies (among reliability, time, cost, people, technology). • Set reliability objectives.

  45. Implementing SRE /2 • Design and implementation phase: • Allocate reliability among components, acquired software, hardware and other systems (see the allocation sketch below). • Engineer to meet reliability objectives. • Focus resources based on the operational profile. • Measure reliability of acquired software, hardware and other systems, i.e., certification test. • Manage fault introduction and propagation.
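
For the allocation activity above: if every component must operate for the system to operate, and each fails at a roughly constant rate, component failure intensities add up to the system figure, so a system objective can be apportioned by weights. A sketch under those assumptions (the component names and weights are illustrative):

```python
def allocate_failure_intensity(system_lambda, weights):
    """Apportion a system failure intensity objective among components
    that must all operate (series configuration), so that component
    failure intensities sum to the system objective."""
    total = sum(weights.values())
    return {name: system_lambda * w / total for name, w in weights.items()}

# System objective of 0.002 failures/hour split across three components,
# weighted here by assumed relative size/utilization
print(allocate_failure_intensity(0.002,
      {"ui": 1.0, "engine": 3.0, "acquired_db": 1.0}))
```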

  46. Implementing SRE /3 • System test and field trial phase: • Determine the operational profile used for testing, i.e., the test profile. • Conduct reliability growth testing. • Track testing progress. • Project additional testing needed (see the sketch below). • Certify that reliability objectives and release criteria are met.
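
Projecting the additional testing needed is usually done with a reliability growth model. Under the basic execution time model, with initial failure intensity lambda_0 and total expected failures nu_0, the extra execution time to move from a present intensity lambda_P to the objective lambda_F is (nu_0 / lambda_0) * ln(lambda_P / lambda_F). A sketch with illustrative parameter values:

```python
import math

def additional_test_time(lam_present, lam_objective, lam0, nu0):
    """Extra execution time to reach a failure intensity objective
    under the basic execution time model:
        delta_tau = (nu0 / lam0) * ln(lam_present / lam_objective)"""
    return (nu0 / lam0) * math.log(lam_present / lam_objective)

# currently 0.05 failures/h, objective 0.01, lam0 = 0.2, nu0 = 100
print(additional_test_time(0.05, 0.01, lam0=0.2, nu0=100.0))  # ~804.7 h
```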

  47. Implementing SRE /4 • Post delivery and maintenance: • Project post-release staff needs. • Monitor field reliability vs. objectives. • Track customer satisfaction with reliability. • Time new feature introduction by monitoring reliability. • Guide product and process improvement with reliability measures.

  48. Feasibility Phase • Activity 1: Define and classify failures • Define failures from the customer’s perspective • Group identified failures into severity classes from the customer’s perspective • Usually 3-4 classes are sufficient (an illustrative scheme follows this slide) • Activity 2: Identify customer reliability needs • What level of reliability does the customer need? • Who are the rival companies, what are the rival products, and what is their reliability? • Activity 3: Determine operational profile • Based on the tasks performed and the environmental factors
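
An illustrative severity scheme for Activity 1 follows; the class definitions and the toy classification rule are assumptions for demonstration, not part of the course material.

```python
from enum import IntEnum

class Severity(IntEnum):
    """Illustrative four-class scheme, defined from the customer's view."""
    CRITICAL = 1   # basic service unusable, or data lost/corrupted
    MAJOR = 2      # important feature unusable, no workaround
    MINOR = 3      # feature degraded, but a workaround exists
    COSMETIC = 4   # annoyance with no effect on results

def classify(basic_service_lost, data_damaged, workaround_exists):
    """Toy classification rule, for demonstration only."""
    if basic_service_lost or data_damaged:
        return Severity.CRITICAL
    return Severity.MINOR if workaround_exists else Severity.MAJOR

print(classify(False, False, workaround_exists=True).name)  # MINOR
```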

  49. Feasibility Phase: Benefits • Activities 1 and 2: Define and classify failures; identify customer reliability needs • Benefits: Release software at a time that meets customer reliability needs but is as early and inexpensive as possible • Activity 3: Determine operational profile • Benefits: Speed up time to market by saving test time, reduce test cost, and gain a quantitative measure of reliability

  50. Requirements Phase • Activity 4: Conduct trade-off studies • Reliability and functionality • Reliability, cost, delivery date, technology, team • Activity 5: Set reliability objectives based on: • Explicit requirement statements from a request for proposal or standards document • Customer satisfaction with a previous release or similar product • Capabilities of the competition • Trade-offs with performance, delivery date and cost • Warranty, technology capabilities
