Fujifilm FinePix Viewer Usability Study
Illinois Institute of Technology, Com 525, Spring 2007
Apryl Cox, Ellie Felts, Joe Norton, Anna Wilkins
Fujifilm FinePix Viewer 5.1 Usability Study Topics discussed • Study background • Product overview • Participant information • Study environment and procedures • Metrics • Results and quotes • Findings and recommendations • Conclusion
Fujifilm FinePix Viewer Study Background • Develop procedures to study the software • Collect participants’ demographics and experience • Identify positive aspects of the software • Identify and categorize issues encountered • Create and prioritize recommendations
Participant Information • Occupations: • Director of strategic sourcing • Professor and department chair • Healthcare consultant • Retired • Senior communications analyst • Travel agent
Study Environment UTEC (Usability Testing and Evaluation Center) at IIT • 2 rooms separated by a one-way mirror • Equipment • Computer • Recording equipment and software • Video mixer • Digital camera with software Business Conference Room • Single conference room in a downtown professional office setting • Equipment • Computer • Recording equipment • Digital camera with software
Study Procedure Facilitator Script • Explain study and facility • Present consent form and pre-study questionnaire • Discuss "think aloud" concept • Present scenarios and tasks • Present post-study questionnaire • Discuss participant’s responses Scenarios and Tasks • 6 Tasks • Most common photo editing functions
Usability Metrics • Efficiency • Time taken to complete each task • Effectiveness • Successful/unsuccessful completion of each task • Satisfaction • Body language, users’ comments, and debriefing answers
Quantitative Results Time Spent on Tasks by Each User (minutes)
Quantitative Results (continued) Users’ Success on Each Task • †Success rating • 2 = easily completed • 1 = completed with difficulty • 0 = not completed • ‡Success percentage was calculated as follows: each task was scored 1 = easily completed, 0.5 = completed with difficulty, 0 = not completed; a user’s task scores were summed and divided by 6, the total number of tasks.
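To make the scoring arithmetic concrete, the short Python sketch below computes a success percentage from one user's six task ratings; the example ratings are hypothetical and are not data from the study.

    # Minimal sketch of the success-percentage calculation described above.
    # The example ratings are hypothetical, not the study's recorded data.
    TASK_SCORE = {2: 1.0,   # easily completed
                  1: 0.5,   # completed with difficulty
                  0: 0.0}   # not completed

    def success_percentage(ratings):
        """Return a user's success percentage from their task ratings (0/1/2)."""
        scores = [TASK_SCORE[r] for r in ratings]
        return 100 * sum(scores) / len(scores)

    # A user who easily completed four tasks, struggled with one, and failed one:
    # (4 * 1 + 0.5 + 0) / 6 = 0.75, i.e. 75%.
    print(success_percentage([2, 2, 2, 2, 1, 0]))  # 75.0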
Qualitative Results Debriefing Questionnaire • Satisfaction rating • 4 = strongly agree • 3 = agree • 2 = disagree • 1 = strongly disagree
Participant Quotes • “I would not recommend this software to anyone. I only recommend something if I had a good experience with it.” • “I hope this is a beta [version of the software] because it needs a few tweaks.” • “I think the software probably works just fine, but it doesn’t come intuitively to me.” • “If I went into a store looking for [a camera], I’d keep looking because I didn’t like that [using the software].”
Findings for Improvement and Recommendations Four Categories of Findings • Convention • Contrary to known and accepted software convention • Functionality • Difficulty with the overall tool being used • Terminology • Command labels used are unclear • Help and Documentation • Callouts or instructions are unclear or wordy
Recommendation • Terminology • Severity Ratings • Scope
Conclusion Pros • Easy to download photos • Only minor difficulty locating photos • Easy to change the brightness of a photo • Contains most of the basic editing functions Cons • Overall, the software is not easy to use • Terminology of some commands is confusing • Help text is too wordy • None of our users would recommend the software