A case study in UI design and evaluation for computer security Rob Reeder Dependable Systems Lab Carnegie Mellon University February 16, 2006
Overview • Task domain: Windows XP file permissions • Design of two user interfaces: native XP interface, Salmon • Evaluation: Which interface was better? • Analysis: Why was one better?
Part 1: File permissions in Windows XP • File permissions task: Allow authorized users access to objects, deny unauthorized users access to objects • Objects: Files and folders • Users: People with accounts on the system • Access: 13 types, such as Read Data, Write Data, Execute, Delete
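To make the domain concrete, here is a minimal sketch of objects, users, and access types as access-control entries. The class and field names are invented for illustration only; this is not the real Win32/NTFS data model.

```python
# Illustrative-only model of the file-permissions domain; names and
# structure are assumptions, not the actual Windows/NTFS formats.
from dataclasses import dataclass

# Four of the 13 access types named on the slide.
ACCESS_TYPES = ["Read Data", "Write Data", "Execute", "Delete"]

@dataclass(frozen=True)
class Ace:
    """One access-control entry on a file or folder."""
    principal: str   # a user or group, e.g. "Wesley" or "Group A"
    access: str      # one of ACCESS_TYPES
    allow: bool      # True = Allow, False = Deny

# A folder's permissions are a list of such entries:
projectf_acl = [
    Ace("Group A", "Read Data", True),
    Ace("Group A", "Write Data", True),
]
```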
Challenges for file permissions UI design • Maybe thousands of users – impossible to set permissions individually for each • Thirteen access types – hard for a person to remember them all
Grouping to handle users • Administrators • Power Users • Everyone • Admin-defined
A problematic user grouping [Diagram: seven users – Xu, Ari, Miguel, Bill, Yasir, Cindy, Zack – divided between Group A and Group B]
Precedence rules • No setting = Deny by default • Allow > No setting • Deny > Allow (> means “takes precedence over”)
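As a sketch only (real Windows ACL evaluation is more involved), the three precedence rules can be written as a small decision function. The tuple-based ACL format here is invented for illustration.

```python
# Sketch of the precedence rules: no setting = deny, Allow beats no setting,
# Deny beats Allow. The (principal, access, allow) tuple format is an
# assumption; it is not how Windows stores or evaluates ACLs.

def effective_access(user, groups, access, acl):
    """True if `user` effectively has `access`, given group memberships."""
    principals = {user, *groups}
    settings = [allow for (who, what, allow) in acl
                if who in principals and what == access]
    if not settings:
        return False   # no setting: denied by default
    if False in settings:
        return False   # any Deny takes precedence over Allow
    return True        # only Allow settings present

# Wesley is in Group A; the group allows WRITE, but Wesley is denied it:
acl = [("Group A", "Write Data", True), ("Wesley", "Write Data", False)]
print(effective_access("Wesley", ["Group A"], "Write Data", acl))  # False (Deny wins)
print(effective_access("Wesley", ["Group A"], "Read Data", acl))   # False (no setting)
```

Note how a single Deny picked up through any group overrides every Allow – one reason group-based permissions can produce surprises.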
Moral • Setting file permissions is quite complicated • But a good interface design can help!
The Salmon interface [Screenshot: Salmon showing permissions for the ProjectF folder]
Example task: Wesley • Initial state: Wesley allowed READ & WRITE (from a group) • Final state: Wesley allowed READ, denied WRITE • What needs to be done: Deny Wesley WRITE
What’s so hard? • Conceptually: Nothing! • Pragmatically: • User doesn’t know initial group membership • Not clear what changes need to be made • Checking work is hard
Learning Wesley's initial permissions
1. Click "Advanced"
2. Click "Effective Permissions"
3. Select Wesley
4. View Wesley's Effective Permissions
Learning Wesley's group membership
5. Bring up the Computer Management interface
6. Click on "Users"
7. Double-click Wesley
8. Click "Member Of"
9. Read Wesley's group membership
Changing Wesley's permissions
10. Click "Add…"
11. Deny Write
12. Click "Apply"
Checking work
13. Click "Advanced"
14. Click "Effective Permissions"
15. Select Wesley
16. View Wesley's Effective Permissions
Part 2: Common security UI design problems • Poor feedback • Ambiguous labels • Violation of conventions • Hidden options • Omission errors
Problem #1: Poor feedback
The XP interface gives no immediate indication of a change's effect; to see it, the user must repeat the four-step effective-permissions check:
1. Click "Advanced"
2. Click "Effective Permissions"
3. Select Wesley
4. View Wesley's Effective Permissions
Salmon: immediate feedback [Screenshot of Salmon showing the ProjectF folder's permissions]
Problem #2: Labels (1/3)
[XP screenshot: basic permission checkboxes labeled Full Control, Modify, Read & Execute, Read, Write, Special Permissions]
Problem #2: Labels (2/3)
[XP screenshot: special permission checkboxes labeled Full Control, Traverse Folder/Execute File, List Folder/Read Data, Read Attributes, Read Extended Attributes, Create Files/Write Data, Create Folders/Append Data, Write Attributes, Write Extended Attributes, Delete, Read Permissions, Change Permissions, Take Ownership]
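Part of the labeling confusion is that each basic checkbox on the previous slide bundles several of these special permissions. The mapping below is an approximation drawn from common NTFS documentation (it omits Synchronize and a few corner cases) and is included only to illustrate the bundling; it is not taken from the slides.

```python
# Approximate mapping from XP's basic permission labels to the special
# permissions they bundle (simplified; Synchronize and some minor bits
# omitted). Included only to show why the basic labels are ambiguous.
READ = ["List Folder/Read Data", "Read Attributes",
        "Read Extended Attributes", "Read Permissions"]
WRITE = ["Create Files/Write Data", "Create Folders/Append Data",
         "Write Attributes", "Write Extended Attributes"]
READ_AND_EXECUTE = READ + ["Traverse Folder/Execute File"]
MODIFY = READ_AND_EXECUTE + WRITE + ["Delete"]
# "Full Control" additionally grants Change Permissions and Take Ownership.
BASIC_TO_SPECIAL = {
    "Read": READ,
    "Write": WRITE,
    "Read & Execute": READ_AND_EXECUTE,
    "Modify": MODIFY,
}
```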
Salmon: clearer labels [Screenshot: Salmon's permission labels for the ProjectF folder]
Problem #3: Violating interface conventions
Normally, clicking a checkbox changes only that checkbox – but click a checkbox in one of these sets, and other checkboxes may change too – confusing!
Salmon: better checkboxes [Screenshot: Salmon's checkbox layout for the ProjectF folder]
Problem #4: Hidden options
1. Click "Advanced"
2. Double-click the permission entry
3. Click the "Delete" checkbox
Salmon: all options visible [Screenshot: Salmon showing every access type for the ProjectF folder at once]
Problem #5: Omission errors (1/2) • Omission error: Failure to complete a necessary step in a procedure • Classic examples: • Forgetting to take your card out of the ATM after receiving cash • Forgetting to take your original after making photocopies • Omission errors are quite common in security-based tasks • Forgetting to change a default password • Forgetting to restart a service after making changes
Problem #5: Omission errors (2/2) • The XP interface showed great potential for omission errors: • Users failed to make necessary changes to permissions that were hidden from view (e.g., Change Permissions and Take Ownership) • Users failed to check group membership, because the group-membership information was so hard to find
Salmon: Feedback helps prevent omission errors [Screenshot: Salmon displaying the ProjectF folder's permissions]
FLOCK: Summary of design problems • Feedback poor • Labels ambiguous • Omission error potential • Convention violation • Keeping options visible
Part 3: Evaluation of XP and Salmon • Conducted laboratory-based user studies • Formative and summative studies for Salmon • I’ll focus on summative evaluation
Advice for user studies • Know what you’re measuring! • Maintain internal validity • Maintain external validity
Common usable security metrics • Accuracy – with what probability do users correctly complete tasks? • Speed – how quickly can users complete tasks? • Security – how difficult is it for an attacker to break into the system? • Etc. – satisfaction, learnability, memorability
Measure the right thing! A cautionary (fictional) example of measuring the wrong thing: "Keystroke dynamics analysis poses a real threat to any computer user. Hackers can easily record the sounds of users' keystrokes and obtain sensitive passwords from them. We address this issue by introducing a new typing method we call 'Babel Type', in which users hit random keys when asked to type in their passwords. We have built a prototype and tested it on 100 monkeys with typewriters. We discovered that our method reduces the keystroke attack by 100%. This approach could potentially eliminate all risks associated with keystroke dynamics and increase user confidence. It remains an open question, however, how to let these random passwords authenticate the users."
Measurement instruments • Speed – Easy; use a stopwatch, time users • Accuracy – Harder; need unambiguous definitions of “success” and “failure” • Security – Very hard; may require serious math, or lots of hackers
Internal validity • Internal validity: Making sure your results are due to the effect you are testing • Manipulate one variable (in our case, the interface, XP or Salmon) • Control or randomize other variables • Use same experimenter • Experimenter reads directions from a script • Tasks presented in same text to all users • Assign tasks in different order for each user • Assign users randomly to one condition or other
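As an illustration only (not the authors' actual study scripts), random assignment to conditions and per-participant task ordering might look like the sketch below. The participant and task names beyond Wesley and Jack are placeholders.

```python
# Hypothetical sketch of controlling/randomizing variables in a
# between-participants study; only the Wesley and Jack tasks are named
# on these slides, so the remaining task names are placeholders.
import random

CONDITIONS = ["XP", "Salmon"]
TASKS = ["Wesley", "Jack"] + [f"task{i}" for i in range(3, 8)]  # 7 tasks total

def plan_study(participants, seed=42):
    rng = random.Random(seed)          # fixed seed so the plan is reproducible
    order = participants[:]
    rng.shuffle(order)                 # random assignment of people to conditions
    plan = {}
    for i, person in enumerate(order):
        tasks = TASKS[:]
        rng.shuffle(tasks)             # different task order for each user
        plan[person] = {
            "condition": CONDITIONS[i % 2],   # balanced split across interfaces
            "task_order": tasks,
        }
    return plan

plan = plan_study([f"P{n:02d}" for n in range(1, 25)])  # 24 participants
print(plan["P01"])
```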
External validity • External validity: Making sure your experiment can be generalized to the real world • Choose real tasks • Sources of real tasks: • Web forums • Surveys • Your own experience • Choose real participants • We were testing novice or occasional file-permissions users with technical backgrounds (so CMU students & staff fit the bill)
User study compared Salmon to XP • Seven permissions-setting tasks; I'll discuss two: • Wesley • Jack • Metrics for comparison: • Accuracy (measured as deviations of users' final permission bits from the correct permission bits) • Speed (time to task completion) • Not security – left that to Microsoft
Study design • Between-participants comparison of interfaces • 12 participants per interface, 24 total • Participants were technical staff and students at Carnegie Mellon University • Participants were novice or occasional file permissions users
Wesley and Jack tasks
Wesley task • Initial state: Wesley allowed READ & WRITE • Final state: Wesley allowed READ, denied WRITE • What needs to be done: Deny Wesley WRITE
Jack task • Initial state: Jack allowed READ, WRITE, & ADMINISTRATE • Final state: Jack allowed READ, denied WRITE & ADMINISTRATE • What needs to be done: Deny Jack WRITE & ADMINISTRATE
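A sketch of how the accuracy metric above (deviations of final permission bits from the correct bits) might be computed. The dictionary encoding is an assumption made for illustration; the study's exact scoring may differ.

```python
# Count permission settings that deviate from the correct configuration;
# 0 deviations means a perfectly accurate result.

def deviations(final_bits, correct_bits):
    """Number of (principal, access type) settings that differ."""
    keys = final_bits.keys() | correct_bits.keys()
    return sum(1 for k in keys if final_bits.get(k) != correct_bits.get(k))

# Wesley task: the correct final state allows READ and denies WRITE.
correct   = {("Wesley", "READ"): "allow", ("Wesley", "WRITE"): "deny"}
submitted = {("Wesley", "READ"): "allow", ("Wesley", "WRITE"): "allow"}  # deny forgotten
print(deviations(submitted, correct))  # 1 deviation
```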
Salmon outperformed XP in accuracy
[Bar chart comparing Salmon and XP accuracy on the two tasks: a 300% improvement on one task and a 43% improvement on the other]
Salmon outperformed XP in accuracy
[Same chart annotated with significance values: p < 0.0001 and p = 0.09]
Salmon did not sacrifice speed
[Bar chart comparing XP and Salmon task-completion times on the two tasks]
Salmon did not sacrifice speed
[Same chart annotated with p = 0.35 and p = 0.20 – neither speed difference is significant]
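The p-values above come from comparing the two participant groups. Purely as an illustration – the slides do not state which statistical test was used, and the timing data below is synthetic – a between-participants speed comparison could be run like this:

```python
# Illustration only: a between-participants comparison of task times.
# The numbers are synthetic stand-ins, NOT the study's data, and the choice
# of Welch's t-test is an assumption.
import random
from scipy import stats

random.seed(0)
xp_times     = [random.gauss(400, 60) for _ in range(12)]  # 12 XP participants
salmon_times = [random.gauss(380, 60) for _ in range(12)]  # 12 Salmon participants

t_stat, p_value = stats.ttest_ind(xp_times, salmon_times, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```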
Part 4: Analysis • What led Salmon users to better performance?