A case study in UI design and evaluation for computer security Rob Reeder January 30, 2008
Memogate: A user interface scandal
Overview • Task domain: Windows XP file permissions • Design of two user interfaces: native XP interface, Salmon • Evaluation: Which interface was better? • Analysis: Why was one better?
Part 1: File permissions in Windows XP • File permissions task: Allow authorized users access to resources, deny unauthorized users access to resources • Resources: Files and folders • Users: People with accounts on the system • Access: 13 types, such as Read Data, Write Data, Execute, Delete
Challenges for file permissions UI design • Potentially thousands of users – impossible to set permissions for each one individually • Thirteen access types – hard for a person to remember them all
Grouping to handle users • Administrators • Power Users • Everyone • Admin-defined
A problematic user grouping • [Diagram: users Xu, Ari, Miguel, Bill, Yasir, Cindy, and Zack divided between two overlapping groups, Group A and Group B]
Precedence rules • No setting = Deny by default • Allow > No setting • Deny > Allow • (> means “takes precedence over”)
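To make the precedence rules concrete, here is a minimal Python sketch of XP-style permission checking. The group memberships and ACL entries are hypothetical (they reuse the names from the grouping slide, with Bill placed in both groups for illustration), and the model is deliberately simplified; it is not the Windows API.

```python
ALLOW, DENY = "allow", "deny"

# Hypothetical group memberships (a user can belong to several groups).
GROUPS = {
    "Group A": {"Xu", "Ari", "Miguel", "Bill"},
    "Group B": {"Bill", "Yasir", "Cindy", "Zack"},
}

# Hypothetical ACL: (principal, access type, setting).
ACL = [
    ("Group A", "Write Data", ALLOW),
    ("Group B", "Write Data", DENY),
]

def effective(user, access):
    """Apply the precedence rules: Deny > Allow > no setting."""
    # Collect every entry that applies to this user, directly or via a group.
    applicable = [
        setting
        for principal, acc, setting in ACL
        if acc == access
        and (principal == user or user in GROUPS.get(principal, set()))
    ]
    if DENY in applicable:
        return False          # Deny > Allow
    if ALLOW in applicable:
        return True           # Allow > No setting
    return False              # No setting = Deny by default

print(effective("Ari", "Write Data"))    # True: only Group A's Allow applies
print(effective("Bill", "Write Data"))   # False: Group B's Deny wins
print(effective("Zack", "Read Data"))    # False: no setting at all
```

Note how a user sitting in both groups ends up denied even though one of his groups is allowed; this is exactly why overlapping groupings are problematic.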
Grouping to handle access types • [Diagram: individual access types, such as Execute, grouped into composite permissions]
Moral • Setting file permissions is quite complicated • But a good interface design can help!
The Salmon interface • [Screenshot of the Salmon interface]
Example task: Wesley • Initial state: Wesley allowed READ & WRITE from a group • Final state: Wesley allowed READ, denied WRITE • What needs to be done: Deny Wesley WRITE
What’s so hard? • Conceptually: nothing! • Pragmatically: the user doesn’t know Wesley’s initial group membership, it isn’t clear what changes need to be made, and checking the work is hard
Learning Wesley’s initial permissions • 1. Click “Advanced” • 2. Click “Effective Permissions” • 3. Select Wesley • 4. View Wesley’s Effective Permissions
Learning Wesley’s group membership • 5. Bring up the Computer Management interface • 6. Click on “Users” • 7. Double-click Wesley • 8. Click “Member Of” • 9. Read Wesley’s group membership
Changing Wesley’s permissions • 10. Click “Add…” • 11. Deny Write • 12. Click “Apply”
Checking work • 13. Click “Advanced” • 14. Click “Effective Permissions” • 15. Select Wesley • 16. View Wesley’s Effective Permissions
Part 2: Common security UI design problems • Poor feedback • Ambiguous labels • Violation of conventions • Hidden options • Omission errors
Problem #1: Poor feedback • Checking the effect of any change means repeating the four-step sequence: • 1. Click “Advanced” • 2. Click “Effective Permissions” • 3. Select Wesley • 4. View Wesley’s Effective Permissions
Salmon: immediate feedback • [Screenshot of Salmon’s immediate feedback display]
Problem #2: Labels (1/3) • XP’s composite permission labels: Full Control, Modify, Read & Execute, Read, Write, Special Permissions
Problem #2: Labels (2/3) • XP’s individual access types: Full Control, Traverse Folder/Execute File, List Folder/Read Data, Read Attributes, Read Extended Attributes, Create Files/Write Data, Create Folders/Append Data, Write Attributes, Write Extended Attributes, Delete, Read Permissions, Change Permissions, Take Ownership
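Part of the ambiguity is that the composite labels silently expand into bundles of the individual access types. The sketch below reconstructs that expansion from the standard Windows documentation as best I recall it; treat the exact mapping as approximate (Synchronize is omitted for brevity).

```python
# Composite labels expanded into sets of individual access types.
READ = {
    "List Folder/Read Data", "Read Attributes",
    "Read Extended Attributes", "Read Permissions",
}
WRITE = {
    "Create Files/Write Data", "Create Folders/Append Data",
    "Write Attributes", "Write Extended Attributes",
}
READ_AND_EXECUTE = READ | {"Traverse Folder/Execute File"}
MODIFY = READ_AND_EXECUTE | WRITE | {"Delete"}
FULL_CONTROL = MODIFY | {
    "Delete Subfolders and Files", "Change Permissions", "Take Ownership",
}

# The labels hide the details: "Read" quietly grants Read Permissions,
# and "Modify" quietly grants Delete.
print(sorted(MODIFY - (READ | WRITE)))
# ['Delete', 'Traverse Folder/Execute File']
```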
Salmon: clearer labels • [Screenshot of Salmon’s permission labels]
Salmon: better checkboxes • [Screenshot of Salmon’s checkboxes]
Problem #4: Hidden options • 1. Double-click entry • 2. Click “Advanced” • 3. Click “Delete” checkbox
Salmon: All options visible • [Screenshot of Salmon showing all options at once]
Salmon: Feedback helps prevent omission errors • [Screenshot of Salmon’s feedback display]
FLOCK: Summary of design problems • Feedback poor • Labels ambiguous • Omission error potential • Convention violation • Keeping options visible
Part 3: Evaluation of XP and Salmon • Conducted laboratory-based user studies • Formative and summative studies for Salmon • I’ll focus on summative evaluation
Advice for user studies • Know what you’re measuring! • Maintain internal validity • Maintain external validity
Common usable security metrics • Accuracy – with what probability do users correctly complete tasks? • Speed – how quickly can users complete tasks? • Security – how difficult is it for an attacker to break into the system? • Etc. – satisfaction, learnability, memorability
Measure the right things! • Speed is often useless without accuracy (e.g., setting file permissions) • Accuracy may be useless without security (e.g., easy-to-remember passwords)
Measurement instruments • Speed – Easy; use a stopwatch, time users • Accuracy – Harder; need unambiguous definitions of “success” and “failure” • Security – Very hard; may require serious math, or lots of hackers
Internal validity • Internal validity: making sure your results are due to the effect you are testing • Manipulate one variable (in our case the interface: XP or Salmon) • Control or randomize other variables: use the same experimenter, have the experimenter read directions from a script, present tasks in the same text to all users, assign tasks in a different order for each user, and assign users randomly to one condition or the other
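As a concrete illustration of this scheme, here is a minimal sketch of the assignment logic: participants are randomized between the two interfaces, and each participant gets the tasks in a freshly shuffled order. The participant IDs and the task names beyond Wesley and Jack are hypothetical placeholders.

```python
import random

participants = [f"P{i:02d}" for i in range(1, 25)]   # 24 participants
tasks = ["Wesley", "Jack", "TaskC", "TaskD", "TaskE", "TaskF", "TaskG"]

random.shuffle(participants)                         # randomize assignment
assignments = {}
for i, person in enumerate(participants):
    order = tasks[:]
    random.shuffle(order)                            # new task order per user
    assignments[person] = {
        "interface": "XP" if i < 12 else "Salmon",   # 12 per condition
        "tasks": order,
    }
```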
External validity • External validity: Making sure your experiment can be generalized to the real world • Choose real tasks • Sources of real tasks: • Web forums • Surveys • Your own experience • Choose real participants • We were testing novice or occasional file-permissions users with technical backgrounds (so CMU students & staff fit the bill)
User study compared Salmon to XP • Seven permissions-setting tasks; I’ll discuss two: Wesley and Jack • Metrics for comparison: • Accuracy (measured as deviations in users’ final permission bits from the correct permission bits) • Speed (time to task completion) • Not security – left that to Microsoft
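Since accuracy here means deviations in permission bits, the scoring might look like the sketch below; the field names are illustrative, and the study’s exact rubric may have differed.

```python
def deviations(final_bits, correct_bits):
    """Count the (user, access) cells where the final setting is wrong."""
    cells = set(final_bits) | set(correct_bits)
    return sum(
        1 for cell in cells
        if final_bits.get(cell, "no setting") != correct_bits.get(cell, "no setting")
    )

# Wesley task: the user denied WRITE but accidentally denied READ as well.
correct = {("Wesley", "READ"): "allow", ("Wesley", "WRITE"): "deny"}
final   = {("Wesley", "READ"): "deny",  ("Wesley", "WRITE"): "deny"}
print(deviations(final, correct))   # 1 deviation
```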
Study design • Between-participants comparison of interfaces • 12 participants per interface, 24 total • Participants were technical staff and students at Carnegie Mellon University • Participants were novice or occasional file permissions users
Wesley and Jack tasks • Wesley task: Initial state: Wesley allowed READ & WRITE. Final state: Wesley allowed READ, denied WRITE. What needs to be done: Deny Wesley WRITE • Jack task: Initial state: Jack allowed READ, WRITE, & ADMINISTRATE. Final state: Jack allowed READ, denied WRITE & ADMINISTRATE. What needs to be done: Deny Jack WRITE & ADMINISTRATE
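Written out as data, the two tasks make the minimal required change explicit. This is a sketch with illustrative field names, using the READ/WRITE/ADMINISTRATE groupings from the task descriptions.

```python
TASKS = {
    "Wesley": {
        "initial": {"READ": "allow", "WRITE": "allow"},
        "final":   {"READ": "allow", "WRITE": "deny"},
    },
    "Jack": {
        "initial": {"READ": "allow", "WRITE": "allow", "ADMINISTRATE": "allow"},
        "final":   {"READ": "allow", "WRITE": "deny",  "ADMINISTRATE": "deny"},
    },
}

def changes_needed(task):
    """Settings whose final value differs from the initial value."""
    return {
        access: setting
        for access, setting in task["final"].items()
        if task["initial"].get(access) != setting
    }

print(changes_needed(TASKS["Wesley"]))  # {'WRITE': 'deny'}
print(changes_needed(TASKS["Jack"]))    # {'WRITE': 'deny', 'ADMINISTRATE': 'deny'}
```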