
How to test Level-of-Service Requirements



  1. How to test Level-of-Service Requirements CS 577b Software Engineering II Supannika Koolmanojwong

  2. Outline • LOS Requirements • LOS Testing • Definition of Done • RDCR ARB • TechTalk

  3. Level of Service Requirements • Also known as: • Non-functional requirements • Performance requirements • Quality attributes ("-ilities") • Non-behavioral requirements • Quality goals

  4. Non-functional requirements • Can be categorized into two main categories: • Execution qualities – observable at run time (e.g., security, usability) • Evolution qualities – embodied in the static structure of the software system (e.g., maintainability, scalability) http://en.wikipedia.org/wiki/Non-functional_requirement

  5. Examples of LOS requirements • Security • Usability • Testability • Maintainability • Scalability • Backup • Compliance • Reliability • Resource constraints (speed, memory, bandwidth) • Extensibility • Fault tolerance • Interoperability • Portability • Stability • Response time • Disaster recovery • Escrow (ensures software maintenance) • Privacy

  6. How to test your LOS requirements • Start with how to write your LOS requirements • Make them SMART • Specific – clear, consistent, and concise • Measurable – can be verified • Attainable – physically possible for a system to exhibit • Realizable – achievable given what is known about the project's constraints • Traceable – we understand why the requirement is needed http://www.win.tue.nl/~wstomv/edu/2ip30/references/smart-requirements.pdf

  7. Specific • "The mission planning system shall support several planning environments for generating the mission plan" • Is this specific? • What does "several" mean? • Have you defined "planning environment" and "mission plan"?

  8. Specific • Avoid "obviously", "clearly", "certainly" • Avoid "some", "several", "many" • Avoid "etc.", "and so on", "such as" • When numbers are specified, identify the units • Ensure pronouns are clearly referenced • Bad example: "When module A calls B, its message history file is updated" – whose file, A's or B's? • Use a glossary

  9. Measurable • "The system shall produce a plan optimized for time." • The only way to measure this is to compare against the absolute optimum, which may not be available. • Instead, fix the required performance against a predefined set of test cases for which the absolute optimum is known (see the sketch below). • Think about: • Can this requirement be verified? • Can the test be conducted at one site? • Can this requirement be tested in isolation?
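To make the idea concrete, here is a minimal Python sketch of testing against predefined cases with known optima; the planner stub, case data, and 10% tolerance are illustrative assumptions, not part of the original slides.

```python
# Minimal sketch: making "optimized for time" measurable by testing
# against predefined cases whose optimal plan durations are known.
# `generate_plan` is a hypothetical stand-in for the system under test.

TOLERANCE = 1.10  # accept plans within 10% of the known optimum (assumed)

def generate_plan(waypoints: int) -> float:
    """Stub planner; returns the plan's duration in seconds."""
    return waypoints * 41.0  # placeholder logic for illustration

KNOWN_CASES = [
    # (case id, waypoints, known optimal duration in seconds)
    ("simple_route", 3, 120.0),
    ("dense_route", 12, 480.0),
]

def test_plan_near_optimum():
    for case_id, waypoints, optimum in KNOWN_CASES:
        duration = generate_plan(waypoints)
        assert duration <= optimum * TOLERANCE, (
            f"{case_id}: {duration}s exceeds 110% of the {optimum}s optimum"
        )

if __name__ == "__main__":
    test_plan_near_optimum()
    print("all cases within tolerance")
```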

  10. Attainable • It must be physically possible for the system to exhibit the requirement • "The system shall be 100% reliable and 100% available" • "The system shall have a maximum response time to a query of 1 second irrespective of system load" • Such requirements are either extremely expensive to meet, or the system will never be accepted • Think: • Is there a theoretical solution to the problem? • Has it been done before? Is there a feasibility study? • Are there physical constraints (e.g., memory size)?

  11. Realizable • Achievable given what is known about the constraints • e.g., 99% reliability may be attainable, but not with a limited budget and a short schedule • Think: • Are there sufficient resources, time, and budget? • Can we afford to manage them? • Mark the requirement's status as "desirable" at first; once prototyping provides evidence, update the status

  12. Traceable • Traceable to/from concepts, specification, design, implementation, and tests • Need to know why the requirement exists: the business justification • Makes it possible to verify that the requirement is implemented

  13. Testing LOS requirements http://scaledagileframework.com/nonfunctional-requirements/

  14. Performance Testing • Speed – does it respond quickly enough for the intended users? • Scalability – can it handle the expected user load and beyond? (a.k.a. capacity) • Stability – is it stable under expected and unexpected user loads? (a.k.a. robustness) • Confidence – are you sure that users will have a positive experience on go-live day? (A speed-check sketch follows.) Ref: www.PerfTestPlus.com
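As a concrete illustration of a speed check, here is a minimal standard-library Python sketch; the endpoint URL and the 2-second threshold are assumptions for illustration.

```python
# Minimal speed-check sketch using only the Python standard library.
import time
import urllib.request

URL = "http://localhost:8000/orders"   # hypothetical endpoint
THRESHOLD_S = 2.0                      # assumed acceptable response time

def measure_response_time(url: str) -> float:
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()                    # include the full body transfer
    return time.perf_counter() - start

if __name__ == "__main__":
    elapsed = measure_response_time(URL)
    print(f"response time: {elapsed:.3f}s")
    assert elapsed <= THRESHOLD_S, "speed requirement not met"
```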

  15. Check list for LOS requirements • Speed • User expectations: experience, psychology, usage • System constraints: hardware, network, software • Costs: speed can be expensive • Scalability • How many users can it serve before it gets slow? Before it stops working? Will it sustain the load? • How many users do I expect today? How many before the next upgrade? • DB capacity, file server capacity, back-up server capacity, data growth rates Ref: www.PerfTestPlus.com

  16. Check list for LOS requirements • Stability • What happens if… • there are more users than we expect? • all the users do the same thing? • a user gets disconnected? • the web server goes down? • there are too many orders for the same thing? Ref: www.PerfTestPlus.com

  17. Check list for LOS requirements • Security • Login requirements – access levels, CRUD levels • Password requirements – length, special characters, expiry, recycling policies • Inactivity timeouts – durations, actions • Audit • Audited elements – what business elements will be audited? • Audited fields – which data fields will be audited? • Audit file characteristics – before image, after image, user and time stamp, etc. • Performance • Response times – application loading, screen open and refresh times, etc. • Processing times – functions, calculations, imports, exports • Query and reporting times – initial loads and subsequent loads http://leadinganswers.typepad.com/leading_answers/2009/03/nonfunctional-requirements-minimal-checklist.html

  18. Check list for LOS requirements • Capacity • Throughput – how many transactions per hour does the system need to handle? • Storage – how much data does the system need to store? • Year-on-year growth requirements • Availability • Hours of operation – when is it available? Consider weekends, holidays, maintenance times, etc. • Locations of operation – where should it be available from? What are the connection requirements? • Reliability • Mean Time Between Failures – what is the acceptable failure rate? e.g., once a year, or an MTBF of 4,000 hours • Mean Time To Recovery – if it breaks, how much time is available to get the system back up again? (A worked availability example follows.) http://leadinganswers.typepad.com/leading_answers/2009/03/nonfunctional-requirements-minimal-checklist.html
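These reliability figures combine via the standard steady-state availability formula, availability = MTBF / (MTBF + MTTR). A short worked example, assuming an MTTR of 4 hours:

```python
# Worked example: steady-state availability from MTBF and MTTR.
mtbf_hours = 4000.0   # mean time between failures (from the checklist above)
mttr_hours = 4.0      # assumed mean time to recovery

availability = mtbf_hours / (mtbf_hours + mttr_hours)
print(f"availability: {availability:.4%}")   # -> availability: 99.9001%
```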

  19. Integrity  • Fault trapping (I/O) – how to handle electronic interface failures, etc  • Bad data trapping - data imports, flag-and-continue or stop the import policies, etc  • Data integrity – referential integrity in database tables and interfaces  • Image compression and decompression standards • Recovery  • Recovery process – how do recoveries work, what is the process?  • Recovery time scales – how quickly should a recovery take to perform?  • Backup frequencies – how often is the transaction data, set-up data, and system (code) backed-up?  • Backup generations - what are the requirements for restoring to previous instance(s)? • Compatibility  • Compatibility with shared applications – What other systems does it need to talk to?  • Compatibility with 3rd party applications – What other systems does it have to live with amicably?  • Compatibility on different operating systems – What does it have to be able to run on?  • Compatibility on different platforms – What are the hardware platforms it needs to work on? http://leadinganswers.typepad.com/leading_answers/2009/03/nonfunctional-requirements-minimal-checklist.html

  20. Check list for LOS requirements • Maintainability  • Conformance to architecture standards – What are the standards it needs to conform to or have exclusions from?  • Conformance to design standards – What design standards must be adhered to or exclusions created?  • Conformance to coding standards – What coding standards must be adhered to or exclusions created? • Usability  • Look and feel standards - screen element density, layout and flow, colours, UI metaphors, keyboard shortcuts  • Internationalization / localization requirements – languages, spellings, keyboards, paper sizes, etc • Documentation  • Required documentation items and audiences for each item

  21. Examples of Load Tests • Transaction response testing • Order entry must complete within 8 seconds • AP query must return results within 5 seconds • PDF attachment must upload within 5 seconds • End user experience testing • Run full-scale web load test while a subset of users logs into system to conduct normal work activity • Have a subset of end users log into the system in the middle of load test to gauge performance • Stress test • Multiple users logging into the system at one time (100 users log in at one time) • Users logging into the system very rapidly (e.g. 1 user every second) • Extended concurrency times (25 users remain logged into system running heavy transactions for an extended period) Ref: Noelle A. Stimely, Collaborate 12
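A minimal sketch of the "100 users log in at one time" stress test above, using only the Python standard library; the login endpoint is a hypothetical stand-in.

```python
# Minimal stress-test sketch: N simulated users hit a (hypothetical)
# login endpoint simultaneously, recording per-request latency.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

LOGIN_URL = "http://localhost:8000/login"  # assumed endpoint
N_USERS = 100

def one_login(_: int) -> float:
    start = time.perf_counter()
    urllib.request.urlopen(LOGIN_URL, timeout=30).read()
    return time.perf_counter() - start

if __name__ == "__main__":
    # All workers start at once to approximate simultaneous logins.
    with ThreadPoolExecutor(max_workers=N_USERS) as pool:
        latencies = sorted(pool.map(one_login, range(N_USERS)))
    print(f"median: {latencies[len(latencies) // 2]:.3f}s, "
          f"worst: {latencies[-1]:.3f}s")
```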

  22. Load Test Criteria Identification • Test environment • As close to production as possible (ideally identical) • How to adjust for inconsistencies between test and production • Building the environment • Testing the test environment Ref: Noelle A. Stimely, Collaborate 12

  23. Load Test Criteria Identification • Performance acceptance criteria • Service Level Agreements (SLAs) • End user expectations • Business requirements • Workload – key scenarios representing reality • Performance optimization objectives • Scalability • Range of accepted workload conditions • Resource utilization objectives Ref: Noelle A. Stimely, Collaborate 12

  24. Load Test Profile Performance Characteristics • How many users have been concurrently logged into the system at the busiest time? • How many concurrent users will be used for the test? • How will users be divided among the script tasks to be tested? e.g., test 4 actions and divide users among scripts – 100 for script 1, 200 for script 2, 50 for script 3, and 150 for script 4 • Number of users: the total number of concurrent and simultaneous users who access the application in a given time frame • Rate of requests: the requests received from the concurrent load of users per unit time • Patterns of requests: a given load of concurrent users may be performing different tasks; patterns of requests identify the average load of users and the rate of requests for a given functionality of the application • Will users pause before logging in again to perform an action (e.g., wait 30 seconds)? • How long will users spend on each page of the application during the test (think time)? • How long will the test run? Ref: Noelle A. Stimely, Collaborate 12

  25. Load Test Profile Considerations • User activity • Some transactions occur more often than others and therefore should be a larger part of the test data and scenarios. • Ramp-up time • A major component of stress on a site/application is how many users log into the system, and how quickly. • Ramp-down time • Load test ramp-down implies gradually stopping the threads during a load run in order to detect memory leaks and check system recovery after the application has reached a threshold. • Think times • The time it takes for a user to respond to an application page has a significant impact on the number of users the application can support without performance issues. • Run length • The length of the actual load test can vary based on many factors. Aim to run the test long enough to get a representative sample of true system performance. • Workload/users • Calculating the number of users for the load test is critical to having a valid test that accurately forecasts performance. Ref: Noelle A. Stimely, Collaborate 12
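One way to express think time, task mix, ramp-up, and run length is with an open-source load-testing tool such as Locust (https://locust.io); the endpoints, task weights, and numbers below are illustrative assumptions.

```python
# Sketch of a load profile expressed in Locust.
from locust import HttpUser, task, between

class ShopUser(HttpUser):
    # "Think time": each simulated user pauses 5-15 seconds between tasks.
    wait_time = between(5, 15)

    @task(3)                      # weight: browsing happens 3x as often
    def browse(self):
        self.client.get("/catalog")          # hypothetical endpoint

    @task(1)
    def order(self):
        self.client.post("/orders", json={"item": 42})  # hypothetical

# Ramp-up (spawn rate), workload, and run length are set when launching,
# e.g.:
#   locust -f this_file.py --headless --users 400 --spawn-rate 10 --run-time 30m
```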

  26. http://www.ibm.com/developerworks/webservices/library/ws-soa-nonfunctional/index.html

  27. Compatibility Testing • Hardware platform (IBM 360, HP 9000) • Peripherals (printer, DVD) • OS (Linux, Windows) • DB (Oracle, SQL server, MySQL) • Browser (Chrome, Firefox) • Carrier (Verizon, Sprint) • Hardware (different phones)
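Browser compatibility checks like these are often automated by running the same test across drivers. A sketch using pytest and Selenium, where the app URL and the expected page title are assumptions:

```python
# Sketch: the same smoke test run across browsers with pytest + Selenium.
# Requires the matching WebDriver/browser binaries to be installed.
import pytest
from selenium import webdriver

BROWSERS = {
    "chrome": webdriver.Chrome,
    "firefox": webdriver.Firefox,
}

@pytest.mark.parametrize("name", BROWSERS)
def test_homepage_loads(name):
    driver = BROWSERS[name]()
    try:
        driver.get("http://localhost:8000/")   # hypothetical app URL
        assert "Home" in driver.title          # assumed page title
    finally:
        driver.quit()
```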

  28. Compliance / Conformance Testing • Conforms to standards • Often performed by an external organization • Local / global standards

  29. Soak / Endurance / Load / Stress Testing • Test beyond the maximum ratings for a longer period of time • Test for robustness, availability, error handling, memory leaks, denial of service, response time, throughput • Spike test – suddenly increase the generated load

  30. Recovery Testing • How well an application is able to recover from crashes and hardware failures • Forced failure of the software • e.g., suddenly restart the computer, or unplug the network cable
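A sketch of a forced-failure recovery test: kill the service process and verify that the system (assumed to be restarted by a supervisor such as systemd) answers health checks again within a time budget. The process name, health URL, and 60-second budget are all assumptions.

```python
# Recovery-test sketch: force-kill a service, then measure how long the
# system takes to answer health checks again.
import subprocess
import time
import urllib.request

HEALTH_URL = "http://localhost:8000/health"  # hypothetical endpoint
RECOVERY_BUDGET_S = 60                       # assumed recovery budget

def test_recovery_after_crash():
    # Forced failure: kill the (hypothetical) service process.
    subprocess.run(["pkill", "-9", "-f", "myservice"], check=False)
    deadline = time.monotonic() + RECOVERY_BUDGET_S
    while time.monotonic() < deadline:
        try:
            urllib.request.urlopen(HEALTH_URL, timeout=2)
            return  # recovered within budget
        except OSError:
            time.sleep(1)  # not back yet; poll again
    raise AssertionError("service did not recover within budget")
```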

  31. Security Testing • 6 basic security concepts that need to be covered • Confidentiality • Protecting information from disclosure • Integrity • Correctness, as intended • Authentication • Confirming the identity; the correct label • Availability • Ready when expected • Authorization • Access control; the right service to the right person • Non-repudiation • Ensuring that the transferred message has been sent and received by the parties claiming to have sent and received it; the sender cannot deny having sent it http://en.wikipedia.org/wiki/Security_testing
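As one small example, the authorization concept can be probed by confirming that an anonymous request to a protected resource is rejected; the URL and the expected status codes below are assumptions.

```python
# Authorization-test sketch: an unauthenticated request to a protected
# resource should be rejected, not served.
import urllib.error
import urllib.request

PROTECTED_URL = "http://localhost:8000/admin/users"  # hypothetical

def test_rejects_anonymous_access():
    try:
        urllib.request.urlopen(PROTECTED_URL, timeout=5)
    except urllib.error.HTTPError as err:
        # 401 Unauthorized / 403 Forbidden are the acceptable outcomes.
        assert err.code in (401, 403), f"unexpected status {err.code}"
    else:
        raise AssertionError("protected resource served without auth")
```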

  32. Monkey Test • Testing with no specific test cases in mind • Supply random strings / actions as input
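A minimal monkey-test sketch: hammer a target function with random printable strings and assert that it never crashes. `parse_query` is a hypothetical stand-in for real code under test.

```python
# Monkey-test sketch: feed random strings to a function and check that
# it never raises, regardless of input.
import random
import string

def parse_query(text: str) -> dict:
    """Stub target for illustration; real code would be imported."""
    return dict(part.split("=", 1) for part in text.split("&") if "=" in part)

def test_monkey():
    rng = random.Random(577)              # fixed seed for reproducibility
    alphabet = string.printable
    for _ in range(10_000):
        junk = "".join(rng.choice(alphabet)
                       for _ in range(rng.randint(0, 50)))
        parse_query(junk)                  # must not crash on any input

if __name__ == "__main__":
    test_monkey()
    print("survived 10,000 random inputs")
```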

  33. Usability Testing • Measures ease of use • Goals • Learnability – how easily can users accomplish a task the first time? • Efficiency – how much time, how many steps? • Memorability – do users remember afterwards? • Errors – how many mistakes? • Satisfaction – do users feel confident or stressed? Would they recommend it to a friend? • Task first, design second

  34. How to improve usability • Use representative users • Ask users to perform representative tasks • Observe and listen • Testing 5 users is typically enough

  35. Usability testing for the website • What your visitors think the first time they see your site (first impression) • Whether your visitors "get" what your site is about. • What visitors think about your site's look and feel. • Whether your visitors can read the content easily and understand what they see. • Whether your visitors can perform your set "tasks" easily, and if not, why not. • Where visitors get "stuck"... and click away. • What visitors think about the product, pricing and offer. http://www.usabilitytesting.tv/articles/9-what-is-usability-testing

  36. More usability checklists for website • http://www.usereffect.com/topic/25-point-website-usability-checklist • http://www.pantos.org/atw/35317.html

  37. Other Methods • Cognitive Walkthrough • Evaluating the user interaction of a prototype / final product • Core-Capability Drivethrough • Benchmarking • Create / use standardized test materials for a specific type of design • e.g., time to do a core task, time to fix errors, time to learn the application http://en.wikipedia.org/wiki/Usability_requirements

  38. Outline • LOS Requirements • LOS Testing • Definition of Done • RDCR ARB • TechTalk

  39. Definition of Done • A checklist of conditions that must be true for work to count as done • Must be in place before a backlog item is pulled into a sprint during sprint planning

  40. Sample of DoD • Upgrade verified while keeping all user data intact • Potentially releasable build available for download • Summary of changes updated to include newly implemented features • Inactive/unimplemented features hidden or greyed out (not executable) • Unit tests written and green • Source code committed on server • Jenkins built the version and all tests are green • Code review completed (or pair-programmed) • How to Demo verified before presentation to Product Owner • OK from Product Owner http://www.scrum-breakfast.com/2012/11/sample-definition-of-done.html

  41. Sample of DoD • Code produced (all 'to do' items in code completed) • Code commented, checked in, and run against the current version in source control • Peer reviewed (or produced with pair programming) and meeting development standards • Builds without errors • Unit tests written and passing • Deployed to the system test environment and passed system tests • Passed UAT (User Acceptance Testing) and signed off as meeting requirements • Any build/deployment/configuration changes implemented/documented/communicated • Relevant documentation/diagrams produced and/or updated • Remaining hours for the task set to zero and the task closed http://www.allaboutagile.com/definition-of-done-10-point-checklist/

  42. How to build a definition of done? • Brainstorm • What is essential? • Identify non-iteration/sprint artifacts • Which ones cannot be done every iteration? • Capture impediments • Identify obstacles that would keep a particular artifact from being available at the end of the iteration • Commitment • Get consensus http://www.gettingagile.com/2007/10/05/building-a-definition-of-done/

  43. Things to look at • Installation build • Pass all automated tests in staging environment • Sign Off • Pass Audit • Installation Documentation Accepted by Operations • Release Notes Updated • Training Manuals Updated http://www.gettingagile.com/2007/10/05/building-a-definition-of-done/

  44. http://www.scrumalliance.org/articles/106-definition-of-done-a-reference

  45. Outline • LOS Requirements • LOS Testing • Definition of Done • RDCR ARB • TechTalk

  46. RDCR ARB

  47. RDCR ARB topics • Topics to cover in your presentation & recommended time allocations • Summary of Change Sources & Resulting Changes • Progress of Prototype • Construction Plans; Transition Plan Draft • General Discussions • Risk analysis to determine course of actions

  48. RDCR ARB – Architected Agile Teams (x, y) = (presentation time, total time) • (8, 10) Acceptance Test Plan and Cases; team's strong and weak points + shaping & overall project evaluation; full test plan and cases • (2, 3) OCD. System purpose; changes in current system and deficiencies; proposed new system, system boundary, and desired capabilities and goals; top-level scenarios • (10, 15) Prototype Update. Changes in significant capabilities (especially those with high risk if gotten wrong); storyboard – optional • (8, 10) Architecture. Overall and detailed architecture; design if critical; COTS/reuse selections (NOT JUST CHOICES) • (6, 8) Life Cycle Plan. At least until CCD or as appropriate; include plans for CTS; team members' roles & responsibilities in 577b; full iteration plan • (5, 10) Feasibility Evidence. Focus on risk analysis; traceability matrix; definition of done; metrics • Plan on 2 minutes per briefing chart, except the title • Focus on changes (particularly new things) since DCR • You may vary from the above: please notify ARB board members IN ADVANCE • QFP & QMP are not presented/discussed due to time constraints

  49. RDCR ARB – NDI-intensive Teams (x, y) = (presentation time, total time) • (8, 10) Acceptance Test Plan and Cases; team's strong and weak points + shaping & overall project evaluation; full test plan and cases • (2, 3) OCD. System purpose; changes in current system and deficiencies; proposed new system, system boundary, and desired capabilities and goals; top-level scenarios • (10, 15) Prototype Update. Changes in significant capabilities (especially those with high risk if gotten wrong); storyboard – optional • (5, 7) Architecture. Overall and detailed architecture; design if critical; COTS/reuse selections (NOT JUST CHOICES) • (6, 8) Life Cycle Plan. At least until CCD or as appropriate; include plans for CTS; team members' roles & responsibilities in 577b; full iteration plan • (5, 10) Feasibility Evidence. Focus on risk analysis; traceability matrix; definition of done; metrics • Plan on 2 minutes per briefing chart, except the title • Focus on changes (particularly new things) since DCR • You may vary from the above: please notify ARB board members IN ADVANCE • QFP & QMP are not presented/discussed due to time constraints

  50. Grading Guidelines • Total = 50 points • (20) Progress of your work • (10) Presentation • (5) Risk management • (15) Quality © USC-CSSE
