
WidT Test Strategy and Objectives

This document outlines the test strategy and objectives for the WidT project, including the stakeholders involved, test items, and chosen approaches to achieve the objectives.


Presentation Transcript


  1. WidT

  2. WidT Test strategy v0.9, June 30th 2014

  3. Introduction: Project context
     • Project plan and schedule are available [concept version]
     • De-branding is finished
     • Migration to the "new" Network Operating Centre is finished
     • Optimisation of WidT
       • Fair Use Policy, Caching
     • Improving existing services
       • Authentication, Security, Portal Page [including CMS]
     • Adding new services
       • Registering, Local content
     • The Test Strategy needs to ensure that the implemented changes and additions are validated

  4. Stakeholders: Who is involved in this project?
     • Frank and Miryam: Owners of the service
     • Erik: Senior User
     • Kitty: Contract & Supplier Management
     • Laura: Legal
     • Eelco: Business Consultant
     • Hans: Program Architect
     • Pieter: Project Architect
     • Patrick: Technical Architect
     • Caroline: Information Analyst
     • Niels van Riel: Operations
     • Bill Gertz: ICT Security Officer
     • Patrick: Senior Supplier
     • Han: Network Operating Centre
     • Paul: Project Manager
     • Bart: Program Manager
     • Ruud: Test Manager

  5. Test items: Tangible and testable items

  6. Product Risk Analysis [PRISMA]
     • Quadrant I: Security and Performance
     • Quadrant II: no items!
     • Quadrant III: Internet access, Providing information, Authentication, Portal Page & Confirmation Page, ISP, Optimisation, Minimisation
     • Quadrant IV: User Friendliness, Additional services, CMS, Behaviour "Online & Offline"
     Remarkable: overall, a relatively low chance of failure (illustrated in the sketch below)
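
To make the quadrant placement concrete, here is a minimal sketch of how items could be scored and classified. The 1-5 scale, the example scores and the quadrant-numbering convention are assumptions for illustration only, not values from the actual PRISMA session:

    # Minimal PRISMA sketch: classify test items into quadrants by
    # likelihood-of-failure and impact scores (both on an assumed 1-5 scale).
    # Assumed convention: I = high likelihood & high impact,
    # II = high likelihood only, IV = high impact only, III = neither.
    def quadrant(likelihood: int, impact: int, threshold: int = 3) -> str:
        high_l, high_i = likelihood >= threshold, impact >= threshold
        if high_l and high_i:
            return "I"
        if high_l:
            return "II"
        if high_i:
            return "IV"
        return "III"

    # Invented example scores for a few WidT test items:
    items = {"Security": (4, 5), "Performance": (4, 4),
             "Internet access": (2, 2), "CMS": (2, 4)}
    for name, (likelihood, impact) in items.items():
        print(f"{name}: Quadrant {quadrant(likelihood, impact)}")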

  7. Test objectives: Show that…
     • security requirements are met [12] !! need to be defined/detailed
     • performance is acceptable [13] !! need to be defined/detailed
     • it is and remains possible to access the internet through "WidT" [1, 8, 11]
     • the provided information is current, accurate and unambiguous [1, 9, 10, 11]
     • the user can log on to the system and/or is authenticated in the correct manner [3, 11] !! need to be defined/detailed
     • the technical implementation meets the requirements set for an ISP [4] !! need to be defined/detailed
     • data throughput is optimised without impacting the "essence" (internet access, standard information) [5] !! need to be defined/detailed
     • data usage is minimised without impacting the "essence" (internet access, standard information) [6] !! need to be defined/detailed
     • changes can be performed through the CMS [2] !! separate RFC, unknown delivery date
     • added services "work as designed" and are "fit for purpose" !! need to be defined/detailed
     (A traceability sketch for the bracketed requirement IDs follows below.)
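
One lightweight way to keep the mapping between objectives and the bracketed requirement IDs checkable is a small traceability script. A sketch follows; the objective labels are paraphrases and the coverage check itself is hypothetical, only the requirement IDs come from the slide:

    # Hypothetical traceability check: each test objective lists the
    # requirement IDs it covers (the bracketed numbers on the slide).
    objectives = {
        "security requirements met": [12],
        "performance acceptable": [13],
        "internet access via WidT": [1, 8, 11],
        "information current, accurate, unambiguous": [1, 9, 10, 11],
        "log-on and authentication": [3, 11],
        "ISP requirements": [4],
        "throughput optimisation": [5],
        "data usage minimisation": [6],
        "changes via CMS": [2],
        "added services fit for purpose": [],  # no requirement referenced yet
    }

    all_requirements = set(range(1, 14))       # requirements [1]..[13]
    covered = {r for refs in objectives.values() for r in refs}
    print("uncovered requirements:", sorted(all_requirements - covered))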

  8. Test strategy (1/2): What approach is chosen to achieve the objectives?
     Security
     • Approach is chosen and tuned together with the ICT security officers
     • Directly depends on the requirements and the chosen implementation
     • Static [audits, reviews]
     • Dynamic [e.g. penetration testing]
     Performance
     • Approach is chosen and tuned together with the IT-OPS performance experts
     • Directly depends on the requirements
     • Implicit [e.g. monitoring in production]
     • Explicit [e.g. performance test in production] !! performance testing in the testlab has no added value
     Internet access and providing information
     • Regression test based on the existing test script, performed in the testlab (sketched below)
     • Simulate the end-to-end chain w.r.t. providing information in the testlab
       • CMS [Portal Page, Confirmation Page]
       • Standard information
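
The regression check for internet access and the portal pages could start from something as simple as the following sketch. All URLs and expected page strings are invented placeholders; the existing test script referenced on the slide remains leading:

    # Minimal regression sketch: the portal and confirmation pages respond
    # and internet access through WidT works. URLs/strings are placeholders.
    import requests

    CHECKS = [
        ("portal page",       "http://portal.widt.example/",        "Welcome"),
        ("confirmation page", "http://portal.widt.example/confirm", "Confirmed"),
        ("internet access",   "http://www.example.com/",            ""),
    ]

    def run_checks() -> bool:
        ok = True
        for name, url, expected in CHECKS:
            try:
                resp = requests.get(url, timeout=10)
                passed = resp.status_code == 200 and expected in resp.text
            except requests.RequestException:
                passed = False
            print(f"{'PASS' if passed else 'FAIL'}: {name}")
            ok = ok and passed
        return ok

    if __name__ == "__main__":
        raise SystemExit(0 if run_checks() else 1)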

  9. Test strategy (2/2): What approach is chosen to achieve the objectives?
     ISP functionality
     • Static [audits, reviews]
     • Dynamic [e.g. whitelisting, blacklisting] (sketched below)
     Optimisation & minimisation of data throughput
     • Directly depends on the requirements and the chosen implementation
     • Static [review the implemented measures]
     • Dynamic [measurements in the testlab and monitoring in production]
     CMS functionality
     • Regression test based on the existing test script, performed in the testlab
     • Simulate the end-to-end chain w.r.t. providing information in the testlab
     • Additional test script focused on the Portal Page and Confirmation Page
     Additional services
     • Directly depends on the requirements and the chosen implementation
     • To be defined
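
For the dynamic whitelisting/blacklisting check, a minimal sketch under stated assumptions: the host lists are placeholders, and the filter is assumed to answer blocked requests with HTTP 403 or to drop the connection:

    # Hypothetical dynamic check of the ISP-style URL filter: whitelisted
    # hosts must be reachable through the WidT gateway, blacklisted hosts
    # must be blocked. Hosts and the 403 convention are assumptions.
    import requests

    WHITELIST = ["www.example.com"]
    BLACKLIST = ["blocked.example.net"]

    def is_blocked(host: str) -> bool:
        try:
            resp = requests.get(f"http://{host}/", timeout=10)
            return resp.status_code == 403   # assumed filter behaviour
        except requests.RequestException:
            return True                      # connection dropped/filtered

    failures = ([h for h in WHITELIST if is_blocked(h)]
                + [h for h in BLACKLIST if not is_blocked(h)])
    print("filtering OK" if not failures else f"filtering failures: {failures}")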

  10. Test levels: Which tests will be performed?
      • Supplier(s)
        • Focus on "works as designed"; depending on size & impact: UT, UIT, ST, SIT
          !! suppliers need to be contacted to arrange this
      • Business
        • Focus on "fit for purpose"; depending on size & impact: FAT, UAT, PAT, E2E test
        • Focus on "meeting the requirements": audit & review

  11. Test environment
      • Supplier
        • "Own" environment
        • "Own" testlab [e.g. Testlab Haarlem]
        • NOC testlab [Utrecht or Amersfoort]
      • Business
        • NOC testlab [Utrecht or Amersfoort]
        • "Pilot"
        • Production

  12. Go/No-Go

  13. Test schedule & activities: When to do what (or what to do when)?
      Planning test activities
      • Generic: process focused on the ability to respond quickly
      • Specific: when the project decides what to do next
      • Optional: plan and agree on cooperation with the supplier, and review
      Preparation & specification
      • Generic: claim test environment(s), prepare regression test(s)
      • Specific: when requirements and deliverables are defined
      • Optional: monitor the supplier
      Execution
      • Generic: check the delivery, execute a "smoke test" (sketched below)
      • Specific: execute what has been prepared and specified
      Sign-off
      • Reporting: Go/No-Go advice, test report, testware
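
The "smoke test" at delivery check can stay deliberately shallow: verify that the delivered endpoints accept connections at all before the prepared tests are run. A hypothetical pytest sketch, with hostnames as placeholders for the testlab environment:

    # Hypothetical delivery smoke test: each service endpoint accepts TCP
    # connections; deeper checks belong to the prepared regression suite.
    import socket

    import pytest

    SERVICES = [("portal.widt.example", 80), ("cms.widt.example", 80)]

    @pytest.mark.parametrize("host,port", SERVICES)
    def test_service_reachable(host, port):
        with socket.create_connection((host, port), timeout=5):
            pass  # connection succeeded, endpoint is alive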

  14. Test basis: Which information is relevant for testing?
      • "Ist" (as-is): not documented; the regression test is leading
      • "Soll" (to-be): changed and added requirements are described in the RFC, presentations and use cases
      !! Requirements should be documented in a standard way

  15. Risk log: What might have a negative impact on testing?

  16. Issues
      • Logged in JIRA
      • Monitored by the test manager (see the query sketch below)
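
Because the issues live in JIRA, the test manager's monitoring can be scripted against JIRA's REST search endpoint. A sketch; the base URL, project key "WIDT", JQL filter and credentials are all placeholders:

    # Hypothetical monitoring helper: list the project's open issues via
    # JIRA's REST API (v2 search endpoint). All identifiers are placeholders.
    import requests

    JIRA = "https://jira.example.com"
    JQL = "project = WIDT AND status != Closed ORDER BY priority DESC"

    resp = requests.get(
        f"{JIRA}/rest/api/2/search",
        params={"jql": JQL, "fields": "summary,status,priority"},
        auth=("testmanager", "secret"),  # placeholder credentials
        timeout=10,
    )
    resp.raise_for_status()
    for issue in resp.json()["issues"]:
        fields = issue["fields"]
        print(issue["key"], fields["status"]["name"], "-", fields["summary"])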

  17. Reporting
      Progress
      • Generic: weekly, during the team meeting
      • Specific: from initiation until deployment
      Report per stage
      • Supplier: delivery includes a test report and a log of remaining issues
      • Test manager: Go/No-Go advice
      Final report
      • Test manager: final report, including testware

  18. Test resources
      • Ruud: Test Manager; tests on behalf of the business
      • Niels: tests on behalf of Operations
      • Jan: test analyst at the testlab
      • Caroline: support w.r.t. designs
      • Supplier: support w.r.t. development
      • Operations: support for performance testing
      • Security Officer: support for security testing
