Debriefing, Recommendations CSSE 376, Software Quality Assurance Rose-Hulman Institute of Technology May 3, 2007
Outline • Post-test • Questionnaire • Debriefing • Final Report • Findings • Analysis • Recommendations
Post-test Questionnaire • Purpose: collect preference information from the participant • May also be used to collect background information
Likert Scales Overall, I found the widget easy to use • strongly agree • agree • neither agree nor disagree • disagree • strongly disagree
Semantic Differentials Circle the number closest to your feelings about the product: Simple ..3..2..1..0..1..2..3.. Complex Easy to use ..3..2..1..0..1..2..3.. Hard to use Familiar ..3..2..1..0..1..2..3.. Unfamiliar Reliable ..3..2..1..0..1..2..3.. Unreliable
Free-form Questions I found the following aspects of the product particularly easy to use ________________________________ ________________________________ ________________________________
Debriefing • Purpose: find out why the participant behaved the way they did. • Method: interview • May focus on specific behaviors observed during the test.
Debriefing Guidelines • Review participant's behaviors and post-test answers. • Let participant say whatever is on their mind. • Start with high-level issues and move on to specific issues. • Focus on understanding problems, not on problem-solving.
Debriefing Techniques • What did you remember? When you finished inserting an appointment did you notice any change in the information display? • Devil's advocate Gee, other people we've brought in have responded in quite the opposite way.
Findings • Summarize what you have learned • Performance • Preferences
Performance Findings • Mean time to complete • Median time to complete • Mean number of errors • Median number of errors • Percentage of participants performing successfully
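The performance findings above can be computed directly from per-participant task data. A minimal sketch, assuming hypothetical completion times (in seconds, with None marking a participant who did not finish) and error counts — the data values here are illustrative, not from an actual test:

```python
from statistics import mean, median

# Hypothetical results for one task: completion time in seconds
# (None = did not complete successfully) and error count per participant.
times = [74, 91, 68, 120, None, 83]
errors = [1, 0, 2, 4, 5, 1]

# Only successful completions contribute to the time statistics.
completed = [t for t in times if t is not None]

print("Mean time to complete:", mean(completed))
print("Median time to complete:", median(completed))
print("Mean number of errors:", mean(errors))
print("Median number of errors:", median(errors))
print("Success rate: %d%%" % round(100 * len(completed) / len(times)))
```

Reporting the median alongside the mean guards against one very slow participant skewing the summary.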
Preference Findings • Limited-choice questions • sum each answer • compute averages to compare questions • Free-form questions • group similar answers
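Summing each answer and averaging per question, as described above, can be sketched as follows — the question wording, 1–5 Likert coding, and response values are all hypothetical:

```python
from statistics import mean

# Hypothetical limited-choice responses, coded 1 (strongly disagree)
# through 5 (strongly agree), one answer per participant.
responses = {
    "easy to use": [5, 4, 4, 2, 5],
    "liked the layout": [3, 2, 4, 3, 2],
}

for question, answers in responses.items():
    # Sum (count) each answer value, then average to compare questions.
    tally = {value: answers.count(value) for value in range(1, 6)}
    print(question, "| tally:", tally, "| average:", mean(answers))
```

Free-form answers still have to be grouped by hand (or coded into categories) before they can be tallied the same way.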
Analysis • Focus on problematic tasks • e.g., 70% of participants failed to complete the task successfully • Conduct a source-of-error analysis • look for multiple causes • look at multiple participants • Prioritize problems by criticality Criticality = Severity + Probability
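Prioritizing by criticality amounts to scoring each problem and sorting. A small sketch, assuming hypothetical problems each rated for severity and probability on a 1–4 scale (the problem descriptions and ratings are invented for illustration):

```python
# Hypothetical source-of-error analysis results:
# (description, severity 1-4, probability 1-4).
problems = [
    ("label is ambiguous", 2, 4),
    ("save silently fails", 4, 2),
    ("icon hard to find", 1, 3),
]

# Criticality = Severity + Probability; address the highest scores first.
ranked = sorted(problems, key=lambda p: p[1] + p[2], reverse=True)
for description, severity, probability in ranked:
    print(description, "-> criticality", severity + probability)
```

Ties (here, two problems score 6) keep their original order, so a secondary tiebreaker such as severity alone could be added to the sort key if needed.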
Recommendations (1/2) • Get some perspective • wait until a couple days after testing • collect thoughts from group of testers • get buy-in from developers • Focus on highest impact areas
Recommendations (2/2) • Ignore "political considerations" for first draft • Provide short-term and long-term solutions • short-term: will not significantly affect development schedule • long-term: needed for ultimate success of product