Evaluating Information Systems November 16, 1999
Credits • Material in these slides is adapted from materials developed by Professors Roxanne Hiltz and Murray Turoff.
Functionality • Functionality refers to the features of a system, i.e., what it can do. • The task of specifying functional requirements is generally straightforward. • However, comparing a list of functions requested by the users to a list of features in a system is not necessarily a measure of how “good” the system functions (a checklist sketch follows these Functionality slides). • People’s reasons for using a system differ (it may be the only way to get a job done!)
Functionality - 2 • The system may help a person do a job quicker or better (more efficiently). • Whether or not a user feels a computer is necessary for a task depends on how well the computer meets the user’s needs. • For example, if you have to send someone a large detailed table of information (spreadsheet) and the e-mail system does not handle this well, perhaps the system does not meet the user’s needs.
Functionality - 3 • There is often a perception that the more functions, flexibility, and complexity a system provides, the better. • FALSE • For both discretionary and non-discretionary users, how the functions are implemented will have a significant impact on system usability.
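A minimal sketch of the naive checklist comparison mentioned on the first Functionality slide, assuming invented feature names and Python as the illustration language; a perfect match on such a list still says nothing about how well the system supports the task.

```python
# Hypothetical sketch: a naive "checklist" comparison of requested
# functions against a system's feature list. Coverage here is not a
# measure of how "good" the system functions for the user.
requested = {"compose message", "attach spreadsheet", "search archive"}
provided  = {"compose message", "attach spreadsheet", "address book"}

covered = requested & provided   # requested functions the system offers
missing = requested - provided   # requested functions it lacks
extra   = provided - requested   # features nobody asked for

print(f"Covered:     {sorted(covered)}")
print(f"Missing:     {sorted(missing)}")
print(f"Unrequested: {sorted(extra)}")
```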
Usability • A more difficult construct to define. • An abstract concept, it refers to the ease with which a system’s functionality can be accessed. • Another way to understand usability is the ease with which a user communicates with a system. • However, if the functionality provided is easy to use, yet it does not address the task at hand, then the system is not usable. • Summary: Usability depends on characteristics of the user and characteristics of the task.
User Types & Modes • NOVICE, CASUAL, INTERMEDIARIES • EXPERIENCED • ROUTINE • FREQUENT • OPERATORS • PROBLEM SOLVERS • POWER • RESULTS: • DIFFERENT ROLES IN ONE SYSTEM • MULTIPLE INTERFACE METHODS
Why Systems Are Not Used 1. Limited functionality: if the functions provided do not match task requirements, a system will not be usable (e.g., an e-mail system with no editor!) 2. Poor interface design, e.g., inadequate flexibility (won’t take abbreviations, provide defaults, or offer commands as well as menus); poor consistency and integration within the system (different parts look and work in different ways, which is very confusing!)
Why Systems Are Not Used - 2 3. ACCESSIBILITY: Access or availability problems But: this interacts with motivation. 4. Start-stop hassles (losing some of the work you have done if you have to stop in the middle and do something else) 5. Poor response time 6. Poor system dynamics 7. Inadequate training and user aid
Why Systems Are Not Used - 3 8. Poor or non-existent documentation. 9. Poor integration with other systems (non-transferability of data; negative interference with learned interface conventions) 10. COSTS (including learning time) exceed expected benefits
Nancy Goodwin • THESIS: There is no functionality without usability. • Corollary: It is not true that the more functionality, the better! “Richer” but less usable systems provide less “effective” (actual) functionality. • “… to be truly usable a system must be compatible not only with the characteristics of human perception and action, but, and most critically, with users’ cognitive skills in communication, understanding, memory, and problem solving...”
Usability Does Matter • Providing extensive functionality is not enough. People must understand what the functions do and how to use them. • Designing for different types of users compounds the problem (e.g., novice vs. expert). • Failure to consider usability can lead to system failure. • Summary Point: Usability contributes to overall system functionality by making it accessible to users and facilitating effective use of functional capabilities.
Ways to Help in Designing Usable Systems • The Prototype Methodology • Protocol Analysis
Prototyping • As a systems development methodology, it is not in and of itself an evaluation methodology. • Rapid prototyping facilitates extensive use of protocol analysis or other methods for systematically obtaining user feedback. • Prototyping is a development methodology based on building, testing, and iteratively improving a model of a system. • Eventually the model becomes the system.
Benefits of Prototype • 1. A prototype can provide a user with tangible means of comprehending and evaluating the proposed system, and eliciting meaningful feedback. • 2. It can provide a common baseline and frame of reference so designers and users can communicate better. • 3. It provides a way for users to participate and commit to a project (reference Joint Application Design). It can generate user enthusiasm for the project.
Benefits - 2 • 4. The prototype can “get things right”, i.e., help ensure that the system performs adequately before widespread use. • 5. Prototyping can reduce both the development life cycle time to installation and the overall cost. Some estimates have been as high as a 70% reduction in cost.
Possible Disadvantages • 1. Some formal prototyping software can be expensive. However, newer PC tools (e.g., Visual Basic, and even PowerPoint) provide inexpensive means to do screen mock-ups (see the mock-up sketch after this list). • 2. If formal prototyping software is used, the actual result may have to be re-written for a final production version. • 3. Early versions of a prototype can disappoint the user and possibly lead to cancellation of the project. • By definition, an early version has only essential features. However, some users may expect to see a “finished” system.
Possible Disadvantages - 2 • 4. The prototyping process may be difficult to manage and control. The process is iterative and may cycle many times. • It does not have the established phases of the traditional systems development life cycle (the “waterfall model”). Establishing phases, milestones, and deliverables is more difficult since user requirements are always evolving. • 5. It is difficult to prototype large information systems that must efficiently handle large amounts of data and many users.
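A minimal sketch of the kind of inexpensive, non-functional screen mock-up referred to under disadvantage 1, assuming Python/Tkinter in place of Visual Basic or PowerPoint; the screen title, labels, and layout are invented for illustration.

```python
# Hypothetical screen mock-up: non-functional widgets laid out only to
# elicit user feedback on terminology and layout before any real coding.
import tkinter as tk

root = tk.Tk()
root.title("Message Composer (mock-up)")

tk.Label(root, text="To:").grid(row=0, column=0, sticky="e")
tk.Entry(root, width=40).grid(row=0, column=1)
tk.Label(root, text="Subject:").grid(row=1, column=0, sticky="e")
tk.Entry(root, width=40).grid(row=1, column=1)
tk.Text(root, width=50, height=10).grid(row=2, column=0, columnspan=2)
tk.Button(root, text="Send (does nothing yet)").grid(row=3, column=1, sticky="e")

root.mainloop()
```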
Protocol Analysis • The “Thinking Out Loud” Method • A “protocol” is a record of a step by step procedure. In this method, one records the step by step procedures of a user “thinking out loud” while trying to use an information system. • A qualitative, direct observation method for determining usability.
WHY DO IT? • Objective: To discover the process a person goes through in solving a problem. • Uses: learning about cognitive processes, developing expert system material, evaluating interfaces.
KEY ASSUMPTION • Cognitive processes that generate verbalization are a subset of those that generate behavior
CONCERNS ABOUT THE METHOD • Subjects may have incomplete knowledge of their thinking processes; therefore the record may be incomplete. • Subjects may not have an accurate understanding of the processes of which they are aware. • The thinking process may be distorted by the thinking out loud process. • Ambiguity in language may lead to miscommunications.
NECESSARY ATMOSPHERE • Honesty • No evaluation of subject • No pressure for performance • No introduction of bias • No contamination of mental process (e.g. help) • Reciprocity and Respect
STIMULUS RESPONSE METHOD • Different stimuli may produce different mental behaviors • Do you know the capital of Delaware? • Which of the three: Newark, Wilmington, or Dover is the capital? • Name the capital of Delaware.
MODES OF PROBING • Talk Aloud, Think Aloud: while information is being attended to. • Concurrent Probing: while it is in short-term memory. • Retrospective Probing: after completion of the task. • Note: you want to concentrate on the first type, but not interrupt the subject’s thinking.
PROTOCOL ANALYSIS CONDITIONS • Subject is asked to verbalize what they are thinking • Subject is not being evaluated • Observer must not participate in the process • Observer must not aid the subject • Subject provides knowledge of how they solve a problem (or learn a system)
PROTOCOL ANALYSIS PROCESS I • Present the subject with a written explanation of what is taking place. • Explain that this is to evaluate the system and not them. • Explain that you are there only to observe and cannot help them. • Present, in writing, a task expressed in the user’s terminology and in the way the user would think about it.
PROTOCOL ANALYSIS PROCESS II • The total time should be between thirty minutes and an hour if no major problems are encountered. • Observe and record (video, voice tape, PC interactions, notes and coding; see the logging sketch after the process slides) • Only interrupt the user for further verbalization if it is unclear what they are doing.
PROTOCOL ANALYSIS PROCESS III • Ask the user to describe what he or she is doing out loud. • Ask the user to go through all the terms on the screen and explain what they think they mean • Ask the user to forecast what they think an action will do • Carry out the task on the system
PROTOCOL ANALYSIS PROCESS IV • Give help only if the user is at a dead end • Questions you can ask during the session if necessary • Why do you do/say that? • What is troubling you? • How do you know that ...? • Why do you do it this way? • Save retrospective questions for the end of the session.
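A minimal sketch of one way to keep the timestamped record mentioned under “Observe and record” in Process II, assuming a small Python script and a hypothetical session_log.txt file; paper coding sheets or video serve the same purpose.

```python
# Hypothetical sketch of a timestamped observation log for one session.
# Each note the observer types is stamped and appended to a plain-text
# protocol file; an empty line ends the session.
from datetime import datetime

with open("session_log.txt", "a", encoding="utf-8") as log:
    print("Type observations; press Enter on an empty line to finish.")
    while True:
        note = input("> ").strip()
        if not note:
            break
        log.write(f"{datetime.now().isoformat(timespec='seconds')}\t{note}\n")
```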
MAJOR LIMITATION • Cannot be used on a task that the user is not familiar with. • Requires training on the task first.
TYPICAL WRITTEN INSTRUCTIONS • PLEASE EXPLAIN: • What you think something means. • What you are trying to do. • What confusion or concerns you have. • What you expect to happen next. • What you don’t know the meaning of.
CODING SCHEMA FOR VERBALIZATION I • EXAMPLE I - Thinking Out Loud includes relating: • Intentions: goals, shall, will, must, have to • Cognitions: current attention situation, define, understand • Planning: If x then y • Evaluation: Yes, No, Damn it, Fine • Changing conditions / view of the problem
CODING SCHEMA FOR VERBALIZATION II • EXAMPLE II: • Surveying given information • Generating new information • Developing a hypothesis • Unsuccessful solutions • Self reference or criticism • Silence
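A minimal sketch of how a coding schema like the two examples above might be applied to transcribed utterances by keyword matching, assuming illustrative keyword lists loosely based on Example I; in practice coding is done by trained human raters.

```python
# Hypothetical sketch: tag transcribed utterances with categories from
# Example I by naive substring matching. Only illustrates the idea of
# categorizing verbalizations; not a substitute for human coding.
from typing import List

CODES = {
    "Intention":  ["shall", "will", "must", "have to"],
    "Cognition":  ["define", "understand", "this means", "i see"],
    "Planning":   ["if", "then"],
    "Evaluation": ["yes", "no", "fine", "damn"],
}

def code_utterance(utterance: str) -> List[str]:
    """Return every category whose keywords appear in the utterance."""
    text = utterance.lower()
    return [code for code, words in CODES.items()
            if any(w in text for w in words)] or ["Uncoded"]

transcript = [
    "I have to attach the spreadsheet first",
    "If I click Send then it should go out",
    "No, that is not what I expected",
]
for line in transcript:
    print(code_utterance(line), "-", line)
```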
PROTOCOL ANALYSIS OBSERVATIONS I • Verbalization occurs only 30% to 50% of the time. • Subjects cannot verbalize when: • Reading text • Doing intense cognitive activity • Making choices • Subjects have to slow down to verbalize • Subjects will tend to be more careful as a result of verbalization and observation
PROTOCOL ANALYSIS OBSERVATIONS II • Experts on a problem verbalize a lot more than non-experts (about double) • Experts have more difficulty verbalizing at a very detailed level with respect to the problem with which they are dealing. • Experts usually spend more time in planning and in restructuring the problem.
INCREASING VERBALIZATION • 1. Hold back stimulus or encourage slowness. • 2. Segment stimulus (subtasks) • 3. Interrupt with pre-arranged signal or set point to pause • e.g. when you are ready to indicate an action, first explain what you think everything means on the screen.
OBJECTIVES FOR INTERACTIVE SYSTEMS I • Determine the user’s understanding of terms in the interface • Understand the causes of errors or misinterpretations • Determine missing functionality or user requirements
OBJECTIVES FOR INTERACTIVE SYSTEMS II • Determine reactions to, and the utility of, features • Determine the utility of a metaphor for learning • Determine the utility of help and guidance
PROTOCOL ANALYSIS ADVANTAGES • A lot less effort than other approaches. • Can be done with prototype or mockup before any coding. • Learning how user approaches task • Finding major mistakes in design • Can learn attitude • Rapid feedback from small samples • Also useful for understanding user task
PROTOCOL ANALYSIS REQUIREMENTS • Subjects must be representative • Three subjects for each distinctive type of user and for each different set of tasks • Instructions simple • Observe only
PROTOCOL ANALYSIS QUESTIONS • Can ask/say: • Please explain your choice. • What are you thinking? • What does that term mean? • Should not ask: • Why did you do that? • What does “append” do? • Have physical signal for interrupt
PROTOCOL ANALYSIS: HOW TO I • A one page explanation to the subject • Set of written tasks in user terms • Subject should only spend 40-60 minutes. • Categorization scheme for recording • Tape record their verbalizations
PROTOCOL ANALYSIS HOW TO II • Retrospective questionnaire for the end • Retention of major concepts • Perceived utility of features/functionality • Do not try to test everything • At least three subjects on the same task • Be specific about the user explaining the choice they are about to make
UTILITY OF MOCK UP • Ease of understanding (current knowledge) • Ambiguity in terms • Confusion generation • Loss of continuity • Developing on line help • Marketing and acceptance
UTILITY OF WORKING SYSTEM • Ease of learning (new knowledge) • Error impact & Utility of feedback • Ease of exploration • Realistic task execution • Developing final user documentation • Utility of new features (beyond current system)
BASIC QUESTIONS I • Do the terms used on the screen mean to the subject what the designer thought they would mean? • Do the alternatives presented at that point in the interaction include what the subject wishes to do? • Are the help material and the system messages understandable and relevant to the needs of the user?
BASIC QUESTIONS II • Does the subject have difficulty locating or perceiving things on the particular screen? • Does the subject use the sequences of operations that the designer expected to be used in accomplishing a given task? • Can the user utilize the interface metaphor for learning the system? • What types of errors is the user making, and why?