Access Control. Michelle Mazurek. Usable Privacy and Security, October 29, 2009.
Outline • How do people think about access control? • What and when do they want to share? • How do people use current standard access control mechanisms? • Discussion: System design guidelines
Thinking about sharing • Olson, Grudin, and Horvitz: A study of preferences for sharing and privacy • Ahern et al.: Privacy patterns and considerations in online photo sharing • Recent study by home security team: Attitudes, needs and practices for home data sharing
Mapping detailed preferences Olson, Grudin, Horvitz, 2005 • Overview survey: “A situation in which you or another person did not wish to share information.” • Selected 19 types of people, 40 types of files • Detail survey: 30 participants filled out the resulting grid • How comfortable sharing? 1 to 5 (or n/a) • Instantiate each type of person
Types of people • Best friend • Spouse • Parent • Sibling • Adult child • Young child • Extended family • Personal website • Salesperson • Trusted colleague • Manager • Subordinate • Corporate lawyer • Competitor • Company newsletter • People you want to impress • Other team members
Types of information • Current location • Current IM status • Buddy list • Calendar entries • Web history • Phone, e-mail • Age • Health status • Credit card number • Work in progress • Finished work • Successes, failures • Performance review • Salary • E-mail content • Social Security # • Politics/religion • New job application
Results – Sharing preferences People: High to Low • Spouse, best friend, parent • Extended family, subordinate, team members • Competitor, personal website, salesperson Info: High to Low • Work e-mail, desk phone, cell phone, age, marital status • Health status, politics/religion, work in progress • Transgression, e-mail content, credit card, SSN
Results – Variation in variance • No variance: • Always share: work e-mail address, phone # with spouse, co-workers • Always share: home phone # with spouse, children • Never share: credit card number with the public • Lots of variance: • Personal items with co-workers (age, health, etc.) • Credit card with parents, grandparents • Pregnancy status with siblings • Work documents with family members
Variation among people • Unconcerned, pragmatists, fundamentalists • Overall, restrict more than share
Clustering people and information • People clusters: manager, trusted colleague • Best friend, family • Other co-workers • Spouse • General public • Information clusters: e-mail content, credit card #, transgression • Failures, salary, SSN • Home, cell number; age; marital status; successes • Health, religion, politics • Work-related stuff • Work e-mail, phone #
Discussion and guidelines • One size doesn’t fit all, but strong clustering implies manageable categories • Recommend policy configuration at multiple precision levels • Recommend statistical prediction (dynamic cluster analysis; see the sketch below)
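As a rough illustration of the recommended cluster analysis (not the authors' actual method), the comfort grid can be treated as a people × information matrix of 1-to-5 ratings and clustered hierarchically. The recipients, ratings, and cluster count below are made up for the sketch.

```python
# Sketch: clustering recipients by sharing-comfort profiles (illustrative,
# not the study's analysis). Rows are people, columns are info types
# (e.g., work e-mail, salary, health, SSN); values are made-up 1-5 ratings.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

people = ["spouse", "best friend", "manager", "competitor", "salesperson"]
comfort = np.array([
    [5.0, 4.8, 4.5, 4.2],   # spouse: comfortable sharing almost everything
    [4.9, 4.0, 4.3, 2.0],   # best friend
    [4.8, 2.5, 1.8, 1.1],   # manager: work info yes, personal info no
    [1.2, 1.0, 1.0, 1.0],   # competitor: share almost nothing
    [1.5, 1.0, 1.1, 1.0],   # salesperson
])

# Hierarchical clustering on the comfort profiles; cut the tree into 3 groups.
tree = linkage(comfort, method="average", metric="euclidean")
labels = fcluster(tree, t=3, criterion="maxclust")
for person, label in zip(people, labels):
    print(f"{person}: cluster {label}")
```

A system could use clusters like these to suggest categories ("close family", "co-workers", "public") instead of asking users to set every cell of the grid by hand.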
Privacy in online photo sharing Ahern et al., 2007 • Collected and analyzed data from 5 months of real use of Flickr with ZoneTag • 81 users, uploaded at least 40 photos each • 36,915 total photos • Followed up by interviews with users about their access control decision-making • Recruited social groups of non-technical people • 15 participants, not previous Flickr users • Provided phone, data plan, Flickr Pro; used their own SIM cards
Flickr and ZoneTag Flickr • Public vs. non-public (private, family, friends) • Public photos are searchable (text labels) ZoneTag • Cameraphone app for upload to Flickr • By default, repeat most recent privacy setting • User can modify as desired • Add tags (suggestions, quick entry) • Location tags added automatically
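ZoneTag's "repeat the most recent privacy setting" default can be summarized as a tiny piece of per-session state. This is a hypothetical sketch of that behavior, not ZoneTag's implementation; the initial default and names are assumptions.

```python
# Hypothetical sketch of ZoneTag's sticky privacy default: each upload
# defaults to the previous photo's setting unless the user overrides it.
from typing import Optional

class UploadSession:
    def __init__(self, initial_setting: str = "public"):
        self.last_setting = initial_setting  # assumed initial default

    def upload(self, photo: str, override: Optional[str] = None) -> str:
        setting = override if override is not None else self.last_setting
        self.last_setting = setting          # the next photo inherits this choice
        print(f"uploaded {photo} as {setting}")
        return setting

session = UploadSession()
session.upload("img_001.jpg")                      # public (default)
session.upload("img_002.jpg", override="private")  # user overrides
session.upload("img_003.jpg")                      # private (sticky default)
```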
Does location predict privacy? [Figure: per-location ratios classified on a five-level scale: Very Private, Private, Typical, Public, Very Public] • H1: Some locations are more public than average; others are more private • Calculate the ratio of public photos to total photos, for all of a user’s photos and then individually per location • 50% of users: fewer than half of their photos in “typical” locations • 19 users: more than half of their photos in the “very” categories
Does location predict privacy? • H2: More frequent locations are more private • Calculate the public ratio as before; sort locations by frequency • Found a significant correlation
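The per-location analysis amounts to a ratio computation plus a correlation test. Here is a sketch under assumed data (each photo recorded as a location plus a public flag); the use of Spearman correlation is my choice for illustration and may differ from the paper's test.

```python
# Sketch of the per-location public-ratio analysis (made-up data, not the
# authors' code). Each photo is (location, is_public).
from collections import defaultdict
from scipy.stats import spearmanr

photos = [
    ("home", False), ("home", False), ("home", True),
    ("office", True), ("office", True),
    ("park", True), ("park", True), ("park", True), ("park", False),
]

counts = defaultdict(lambda: [0, 0])     # location -> [public, total]
for location, is_public in photos:
    counts[location][0] += int(is_public)
    counts[location][1] += 1

ratios = {loc: pub / total for loc, (pub, total) in counts.items()}
freqs  = {loc: total for loc, (_, total) in counts.items()}
print("public ratio per location:", ratios)

# H2: are more frequent locations more private, i.e. is frequency
# negatively correlated with the public ratio?
locs = list(ratios)
rho, p = spearmanr([freqs[l] for l in locs], [ratios[l] for l in locs])
print(f"Spearman rho={rho:.2f}, p={p:.2f}")
```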
Does content predict privacy? • Hand-classified 1,400 frequently-used tags • Person, location, place, object, event, activity • Calculate the public:non-public ratio per category • Person significantly more private than the rest • Activity significantly less private than average
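The slide does not say which statistical test was used. One plausible way to check whether a tag category such as "person" skews more private than the rest is a chi-square test on the public/non-public counts; the counts below are invented for illustration.

```python
# Illustrative significance check for one tag category (made-up counts, and
# not necessarily the test used in the paper): compare the public/non-public
# split of "person"-tagged photos against all other photos.
from scipy.stats import chi2_contingency

#               public  non-public
person_tags  = [  120,     380 ]
other_tags   = [  900,     600 ]

chi2, p, dof, expected = chi2_contingency([person_tags, other_tags])
print(f"chi2={chi2:.1f}, p={p:.3g}")  # small p: person-tagged photos skew more private
```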
Additional quantitative analysis • Access setting changed after upload: 7% • Decisions are not often strongly regretted • May be related to misuse of default • Location information suppressed: 2% • Perhaps no concern about zip-level location sharing • Perhaps because users don’t know how to suppress it
Interview results • Conducted after 2 weeks of system use • Four major themes emerged:
Results: Security • Location – near real-time whereabouts • “If I did something to upset somebody somehow … and they knew exactly where I lived by looking at my Flickr photos, that would bother me.” • Especially with regard to children • Both content and location • Fits with quantitative analysis • Person is more private (children) • Some locations (home?) are more private
Results: Identity, Social Disclosure Identity • Image content may be unflattering to user, others • May expose private interests • Conservative family, pride parade photos Social Disclosure • If someone wasn’t invited • When friends are “doing their, uh, ‘musician things’ … don’t need any incriminating evidence.”
Results: Convenience • Using access controls requires friends, family to sign up for Flickr accounts • Sometimes it’s easier just to make it public • Sometimes users want to make certain photos available to friends of friends
Results: Location privacy • Showed participants their aggregate location data • Generally unconcerned at zip granularity • Worries about advertisers • Changes from pre-study interview • 17% said would never share • 50% would share under special circumstances • In reality, every person shared and no one suppressed
Results: Other factors • Making decisions at capture time • Unsure how photos will look on the web • Unsure about subject’s preferences • Limiting complexity • Often choose the default for minimal effort • Dissatisfaction • Unhappy with all available options • Choose the best available but remain frustrated
Discussion and guidelines • Desired privacy settings often correlate with location and with content of tags • Suggestions: • Use these patterns for prediction, recommendation, or warning of possible errors • Make aggregate disclosure visible to users • Provide social comparison (similar decisions made by friends) to reveal relevant norms • Show users how others will view the photos
Further guidelines • Decouple visibility from discoverability • Public settings so no need to register • Not searchable so disclosure may be limited • Decouple photo and location visibility • Independent location disclosure settings • Encourage a range of access control settings • Self-censoring limits usage • Maximizing public photos adds value to system owner
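A hypothetical data model that follows these guidelines treats visibility, discoverability, and location disclosure as independent switches rather than one public/private flag. The class and field names are illustrative, not from Flickr or the paper.

```python
# Hypothetical photo-settings model: decouple visibility from discoverability
# and from location disclosure, per the guidelines above.
from dataclasses import dataclass

@dataclass
class PhotoSettings:
    visible_via_link: bool = True    # anyone with the URL can view (no account needed)
    searchable: bool = False         # excluded from public search/browse by default
    show_location: bool = False      # location disclosed separately from the photo

    def describe(self) -> str:
        return (f"link-visible={self.visible_via_link}, "
                f"searchable={self.searchable}, location={self.show_location}")

# A photo friends can see without registering, but that strangers won't find:
print(PhotoSettings(visible_via_link=True, searchable=False).describe())
```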
Exploring access control at home Mazurek et al., 2009 http://www.pdl.cmu.edu/ • Goal: Understand how people think about access control • Wanted to examine several different facets • Current practices: digital, paper • Different policy dimensions: person, location, device, presence, time of day • Additional features: Logs, reactive policy creation
Designing a user study • In-situ interviews • Recruitment via Craigslist, flyers • Limited to non-programmer households • Interview family members at home: together, then individually • Semi-structured interviews • Elicit information about people, files • Ask specific questions as jumping-off points • Continue with free-form responses
Question structure • For each dimension, start with a specific scenario • Imagine that [a friend] is in your house when you are not. What kinds of files would you (not) want them to be able to [view, change]? • Would it be different if you were also in the [house, room]? • Extend to discuss that dimension in general • Rate concern over specific policy violations: • 1 = don’t care to 5 = devastating
Data analysis • Initial rough analysis identified areas of interest; fed back into later interviews • Iterative topic and analytic hand coding, loosely based on Grounded Theory (Razavi and Iverson, 2006) • Searchable database format • Combine codes to develop broader theories; revisit data as needed • Results are qualitative
Study demographics • Ages 8 to 59 • Wide range of computer skills
Household devices • Common devices: Laptop, desktop, DVR, iPod, cell phone, digital camera • Minimum: 1 desktop, 1 mobile phone, 3 iPods for a family of four • Maximum: 22 devices for 3 roommates • 3 laptops, 2 desktops • 3 cell phones • Still and video cameras • Video game systems • USB sticks, memory cards, external hard drives
Four key findings • F1: People have important data to protect, and the methods they currently use don’t provide enough assurance • F2: Policies are complicated • F3: Permission and control are important • F4: Current systems and mental models are misaligned
F1: Current methods aren’t working • People worry about sensitive data • Many potential breaches rated as “devastating” • Almost all worry about file security sometimes • Several have suffered actual breaches • Mechanisms vary (often ad-hoc) • Encryption, user accounts (some people) • Hide sensitive files in the file system • Delete sensitive data so no one can see it • “If I didn’t want everyone to see them, I just had them for a little while and then I just deleted them.”
F2: Policy needs are complex • Fine-grained divisions of people and files • Public, private aren’t enough • More than friends, family, colleagues, strangers • One example policy divided files into shared, mixed, and restricted groups
F2: Dimensions beyond person • Read vs. write remains important • Read-only is needed but not sufficient • Presence resonated for most • “If you have your mother in the room, you are not going to do anything bad. But if your mom is outside the room you can sneak.” • Also can provide a chance to explain • Location • People in my home are trusted • Higher level of “lockdown” when elsewhere • Device, time of day not as popular
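One way to picture rules that span these dimensions is a small record per rule. This is an illustrative sketch only, not a system proposed in the paper; all names and fields are assumptions.

```python
# Hypothetical shape of an access-control rule covering the dimensions
# participants cared about: person, read/write, owner presence, and location.
from dataclasses import dataclass

@dataclass
class Rule:
    person: str                     # who the rule applies to, e.g. "mom"
    files: str                      # file or category it covers, e.g. "photos"
    can_read: bool = False
    can_write: bool = False
    only_when_owner_present: bool = False   # "presence" dimension
    only_at_home: bool = False              # "location" dimension

rules = [
    Rule("mom", "vacation photos", can_read=True),
    Rule("roommate", "music", can_read=True, only_at_home=True),
    Rule("guest", "photos", can_read=True, only_when_owner_present=True),
]
for r in rules:
    print(r)
```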
F2: Variation across participants • Finding reasonable defaults is difficult • What is most/least private? • Sharing-oriented vs. restriction-oriented • “Basically, it’s my stuff; if I want you to have it, I’ll give it to you.” • “I don’t really have private files.... There’s nothing that I am hiding from anybody.” • Most have one “most trusted” person • Definition of “most trusted” varies widely
F3: Permission and control • People like to be asked permission • Positive response to reactive policy creation • “I’m very willing to be open with people, I think I’d just like the courtesy of someone asking me.” • Setting policy ahead of time doesn’t convey control the way being present and/or explicitly granting permission does • If I’m present, “I can say, ‘These are the things that you could see’.” • “I can’t be giving you permission while I sleep because I am sleeping.”
F3: A-priori policy isn’t enough • Last-minute decisions • Review logs and fine-tune: • “If someone has been looking at something a lot, I am going to be a little suspicious. In general, I would [then] restrict access to that specific file.” • People want to know why as well as who • “I might be worried about who else was watching.” • “From my devices they would be able to view it but not save it.”
F4: Mental models ≠ systems • Desktop search finds “hidden” files • Being present isn’t enough • Violations can happen too fast to prevent • Can’t necessarily monitor across the room • “If anything were to happen, ... I’m right there to say, ‘OK, what just happened?’ So I’m not as worried.” • Files can be shared across devices; files within a device can be restricted • “In my house” may not be a good trust proxy • Seems natural but fails after more thought
Resulting design guidelines • Allow fine-grained control • Specification at multiple levels of granularity to support varying needs • Plan for lending devices • Limited-access, discreet guest profiles (Karlson et al., 2009) • Include reactive policy creation (sketched below) • “Sounds like the best possible scenario.” • “It would be easy access for them while still allowing me to control what they see.”
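A minimal sketch of reactive policy creation, under the assumption that a request matching no existing rule triggers a prompt to the owner and that the answer can be remembered as a new rule. All names here are hypothetical; the paper describes the idea, not this code.

```python
# Minimal sketch of reactive policy creation: when no rule covers a request,
# ask the owner at access time and optionally remember the decision.
def ask_owner(person: str, file: str) -> bool:
    """Stand-in for a real prompt to the owner (notification, dialog, ...)."""
    answer = input(f"Allow {person} to view {file}? [y/N] ")
    return answer.strip().lower() == "y"

policy: dict[tuple[str, str], bool] = {}   # (person, file) -> allowed?

def check_access(person: str, file: str, remember: bool = True) -> bool:
    if (person, file) in policy:           # a proactive rule already exists
        return policy[(person, file)]
    allowed = ask_owner(person, file)      # reactive step: ask at access time
    if remember:
        policy[(person, file)] = allowed   # becomes a rule for next time
    return allowed

# Example: the first access triggers a prompt; later accesses reuse the answer.
# check_access("guest", "tax_return.pdf")
```

Remembering the answer is what keeps the prompts from becoming a nuisance while still giving the owner the "being asked" interaction participants valued.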
More design guidelines • Reduce or eliminate up-front complexity • “If I had to sit down and sort everything into what people can view and cannot view, I think that would annoy me. I wouldn’t do that.” • Reactive policy creation can help with this • Support iterative policy specification • Interfaces designed to help users view/change effective policy, not just rules • Include human-readable logs
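To make the "human-readable logs" guideline concrete, here is a sketch that renders raw access events as plain sentences a household member could act on. The event fields and wording are assumptions for illustration, not from the paper.

```python
# Sketch of human-readable access logs: turn raw events into sentences.
from datetime import datetime

events = [
    {"who": "roommate", "what": "budget.xlsx", "action": "viewed",
     "when": datetime(2009, 10, 20, 22, 15), "device": "living-room desktop"},
    {"who": "guest", "what": "vacation photos", "action": "viewed",
     "when": datetime(2009, 10, 21, 14, 5), "device": "your laptop"},
]

def render(event: dict) -> str:
    when = event["when"].strftime("%b %d at %I:%M %p")
    return f'{event["who"]} {event["action"]} "{event["what"]}" on {event["device"]} ({when})'

for e in events:
    print(render(e))
```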
Even more guidelines • Acknowledge social conventions • Requesting permission (reactive creation again) • Plausible deniability: “I don’t want people to feel that I am hiding things from them.” • Account for users’ mental models • A lot of mismatches come from incorrect analogies to physical systems • Either fit into existing models or explicitly guide users to new mental models
Access control in practice • Smetters and Good, 2009: How Users Use Access Control • Collect usage history data from a 200-employee corporation • Examine Windows and Unix user groups • Examine e-mail lists • Majority of document sharing is by e-mail • Examine DocuShare usage • File permissions • Snapshots a year apart
Analyzing groups • Group name • List of members (users and groups) • Owner, create time, modify time • Who can update membership?
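The group metadata listed above maps naturally onto a small record. A hypothetical representation follows; the field names are mine, not taken from the systems studied.

```python
# Hypothetical representation of the group metadata examined in the study:
# name, members (users or nested groups), owner, timestamps, and who may
# update the membership list.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Group:
    name: str
    members: list[str] = field(default_factory=list)   # user names or group names
    owner: str = ""
    created: datetime = field(default_factory=datetime.now)
    modified: datetime = field(default_factory=datetime.now)
    can_update_membership: list[str] = field(default_factory=list)

g = Group(name="project-falcon", members=["alice", "bob", "qa-team"],
          owner="alice", can_update_membership=["alice"])
print(g.name, g.members)
```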
Results – group membership • Most groups have fewer than 20 members • Users participate in more groups when groups are user-defined rather than administrator-defined • Membership changes on the order of months or years
Results – Group construction • Users more often add members; administrators more often “clean up” • User-defined groups are more often disorganized or duplicated • Windows groups with clear structure, naming convention • DocuShare groups with overlapping and misleading names • Unix “emergency” groups with misspelled usernames
DocuShare settings • List of ACLs to users or groups • Positive rules only (no deny rules) • Files have single owners with full rights • Policy inheritance: folder policy automatically applies to documents added to that folder • Inheritance prompt when folder settings change
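A sketch of the DocuShare semantics described above (positive-only grants, a single owner with full rights, and folder-to-document inheritance), written as hypothetical code rather than DocuShare's actual API; class and method names are assumptions.

```python
# Sketch of DocuShare-style permission semantics: grant-only ACL entries,
# a single owner with full rights, and documents inheriting the folder's
# policy when they are added to it.
from dataclasses import dataclass, field

@dataclass
class Item:
    name: str
    owner: str
    acl: dict = field(default_factory=dict)   # principal -> set of rights (grants only)

    def can(self, principal: str, right: str) -> bool:
        if principal == self.owner:            # the owner always has full rights
            return True
        return right in self.acl.get(principal, set())

@dataclass
class Folder(Item):
    children: list = field(default_factory=list)

    def add(self, doc: Item) -> None:
        doc.acl = {p: set(rights) for p, rights in self.acl.items()}  # inherit policy
        self.children.append(doc)

shared = Folder("team-docs", owner="alice", acl={"team": {"read", "write"}})
report = Item("q3-report.doc", owner="alice")
shared.add(report)                              # report now carries the folder ACL
print(report.can("team", "read"))               # True
print(report.can("stranger", "read"))           # False: absence of a grant denies
```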
DocuShare data • Collected files and folders visible to 8 users • 49,672 unique objects (documents or folders) • Consider how many users can see each item
Results – Setting permissions • 5.2% of objects had permissions different from their parent • 3.5% of all documents; 15% of all folders • Among changed items: • 52% different list of principals • 30% different permissions for a given principal • 17% both • Claim: In general, users prefer to add files to a pre-set folder rather than set permissions directly
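A figure like "5.2% differ from their parent" comes from comparing each object's permissions with its parent folder's. This is a sketch of that comparison over made-up objects, assuming each object records a parent pointer and a principal-to-rights mapping; it is not the authors' analysis code.

```python
# Sketch of the parent-divergence count behind figures like "5.2% of objects
# had permissions different from their parent" (illustrative, made-up data).
objects = {
    "/team":            {"parent": None,    "perms": {"team": "rw"}},
    "/team/plan.doc":   {"parent": "/team", "perms": {"team": "rw"}},   # inherited
    "/team/salary.xls": {"parent": "/team", "perms": {"hr": "rw"}},     # changed
    "/team/notes.txt":  {"parent": "/team", "perms": {"team": "r"}},    # changed
}

changed = sum(
    1 for obj in objects.values()
    if obj["parent"] is not None and obj["perms"] != objects[obj["parent"]]["perms"]
)
with_parent = sum(1 for obj in objects.values() if obj["parent"] is not None)
print(f"{changed}/{with_parent} objects differ from their parent's permissions")
```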