
Debriefing, Recommendations



  1. Debriefing, Recommendations CSSE 376, Software Quality Assurance Rose-Hulman Institute of Technology May 3, 2007

  2. Outline • Post-test • Questionnaire • Debriefing • Final Report • Findings • Analysis • Recommendations

  3. Post-test Questionnaire • Purpose: collect preference information from the participant • May also be used to collect background information

  4. Likert Scales Overall, I found the widget easy to use • strongly agree • agree • neither agree nor disagree • disagree • strongly disagree
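Likert responses are usually coded onto a numeric scale before computing summary statistics. A minimal sketch in Python; the 1–5 coding and function name are illustrative assumptions, not part of the slides:

```python
# Illustrative 5-point coding for the Likert options on the slide.
LIKERT_SCORES = {
    "strongly agree": 5,
    "agree": 4,
    "neither agree nor disagree": 3,
    "disagree": 2,
    "strongly disagree": 1,
}

def score_responses(responses):
    """Convert a list of Likert answers to numeric scores."""
    return [LIKERT_SCORES[r] for r in responses]

responses = ["agree", "strongly agree", "neither agree nor disagree"]
print(score_responses(responses))  # [4, 5, 3]
```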

  5. Semantic Differentials Circle the number closest to your feelings about the product: Simple 3..2..1..0..1..2..3 Complex Easy to use 3..2..1..0..1..2..3 Hard to use Familiar 3..2..1..0..1..2..3 Unfamiliar Reliable 3..2..1..0..1..2..3 Unreliable
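For analysis, a circled position on a semantic-differential scale is typically coded as a signed value around the neutral midpoint. A sketch, assuming a -3..+3 coding where the right-hand adjective (Complex, Hard to use, etc.) is negative; the coding convention and names are assumptions:

```python
def code_differential(position, toward_right):
    """Code a circled semantic-differential position as -3..+3.

    position: distance from the neutral midpoint (0-3).
    toward_right: True if the circle lies on the right-hand adjective's
    side (e.g. Complex, Hard to use), coded negative by convention here.
    """
    return -position if toward_right else position

# A participant circles 2 toward "Simple" and 1 toward "Hard to use".
print(code_differential(2, toward_right=False))  # 2
print(code_differential(1, toward_right=True))   # -1
```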

  6. Free-form Questions I found the following aspects of the product particularly easy to use ________________________________ ________________________________ ________________________________

  7. First Cartoon of the Day

  8. Debriefing • Purpose: find out why the participant behaved the way they did. • Method: interview • May focus on specific behaviors observed during the test.

  9. Debriefing Guidelines • Review participant's behaviors and post-test answers. • Let participant say whatever is on their mind. • Start with high-level issues and move on to specific issues. • Focus on understanding problems, not on problem-solving.

  10. Debriefing Techniques • What did you remember? "When you finished inserting an appointment, did you notice any change in the information display?" • Devil's advocate "Gee, other people we've brought in have responded in quite the opposite way."

  11. Findings • Summarize what you have learned • Performance • Preferences

  12. Performance Findings • Mean time to complete • Median time to complete • Mean number of errors • Median number of errors • Percentage of participants performing successfully
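The performance measures listed above are straightforward to compute once the raw data is collected. A sketch using Python's standard library; the data and the `None`-means-did-not-complete coding are illustrative assumptions:

```python
from statistics import mean, median

# Task completion times (seconds) and error counts per participant;
# None marks a participant who did not complete the task (assumed coding).
times = [142, 98, 175, 120, None, 133]
errors = [1, 0, 3, 2, 4, 1]

completed = [t for t in times if t is not None]
print("Mean time to complete:", mean(completed))
print("Median time to complete:", median(completed))
print("Mean number of errors:", mean(errors))
print("Median number of errors:", median(errors))
print("Success rate: %.0f%%" % (100 * len(completed) / len(times)))
```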

  13. Preference Findings • Limited-choice questions • sum each answer • compute averages to compare questions • Free-form questions • group similar answers
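The tallying described above (sum each answer for limited-choice questions, then average with a numeric coding to compare questions) can be sketched directly; the sample answers and scores are illustrative, not real data:

```python
from collections import Counter

# Limited-choice questions: sum each answer option.
answers_q1 = ["agree", "agree", "disagree", "strongly agree"]
tally = Counter(answers_q1)
print(tally)

# With a numeric coding, per-question averages let questions be compared.
scores = {"q1": [4, 4, 2, 5], "q2": [3, 2, 3, 4]}
averages = {q: sum(v) / len(v) for q, v in scores.items()}
print(averages)  # q1: 3.75, q2: 3.0
```

Free-form answers resist this kind of tallying, which is why the slide suggests grouping similar answers by hand instead.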

  14. Second Cartoon of the Day

  15. Analysis • Focus on problematic tasks • e.g., 70% of participants failed to complete a task successfully • Conduct a source-of-error analysis • look for multiple causes • look at multiple participants • Prioritize problems by criticality: Criticality = Severity + Probability
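The slide's criticality rule (Criticality = Severity + Probability) gives a simple way to rank problems. A sketch with made-up problems and ratings; the 1–4 rating scales are an assumption not stated on the slide:

```python
# Each problem: (description, severity, probability), e.g. on 1-4 scales.
problems = [
    ("Save button hidden below fold", 3, 4),
    ("Crash on empty appointment",    4, 2),
    ("Confusing icon label",          2, 3),
]

# Criticality = Severity + Probability (from the slide); rank descending.
ranked = sorted(problems, key=lambda p: p[1] + p[2], reverse=True)
for desc, sev, prob in ranked:
    print(f"criticality {sev + prob}: {desc}")
```

Ranking by a sum rather than a product keeps a rare-but-severe problem (high severity, low probability) from dropping to the bottom of the list.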

  16. Recommendations (1/2) • Get some perspective • wait until a couple days after testing • collect thoughts from group of testers • get buy-in from developers • Focus on highest impact areas

  17. Recommendations (2/2) • Ignore "political considerations" for first draft • Provide short-term and long-term solutions • short-term: will not significantly affect development schedule • long-term: needed for ultimate success of product
