User-Controllable Security and Privacy
Norman M. Sadeh
ISR - School of Computer Science, Carnegie Mellon University
Privacy in Mobile & Pervasive Computing
• MyCampus project over the past 7 years
  • Piloted a number of context-aware applications on campus
  • Privacy as a major impediment to adoption
• Wikipedia’s definition of privacy: “… the ability of an individual or group to keep their lives and personal affairs out of public view, or to control the flow of information about themselves. Privacy is the ability of an individual or organization to reveal oneself selectively…”
Computational Thinking Challenge
• …But lay users (and even “experts”) are not very good at defining privacy policies…
  • Complexity of people’s policies
  • “One size fits all” often doesn’t apply
  • Policies change over time
  • Poor understanding of the consequences of how one’s information will be used
• Trust engine technologies are ahead of usability research
Question
• Can we develop technologies that empower users to more accurately specify their policies?
• And some related questions, such as:
  • User burden vs. accuracy (incl. the expressiveness issue)
  • How does this change from one application to another, and from one user to another?
Three Application Domains
• MyCampus – current focus: People Finder
• Grey – defining policies to control access to rooms in a building
• IMBuddy – contextual instant messaging
People Finder Architecture
[Architecture diagram: Jim and Mary each have a knowledge base (KB) and a Policy Enforcing Agent (PEA); queries between them are routed through the MyCampus Server]
• Combines GPS, GSM and WiFi positioning
• Available on cell phones and laptops
• PEA = Policy Enforcing Agent
• Policies represented in a rule extension of the OWL language
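A minimal sketch of this flow in Python, under stated assumptions: the names (LocationQuery, PolicyEnforcingAgent, current_context) are illustrative, and the real system expresses policies in a rule extension of OWL rather than as Python objects.

```python
# Sketch of the query path: the server never answers a location query
# itself; it defers to the target user's Policy Enforcing Agent (PEA),
# which checks the owner's rules against the owner's knowledge base.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LocationQuery:
    requester: str    # who is asking (e.g., "Jim")
    target: str       # whose location is requested (e.g., "Mary")
    time: datetime    # when the request was made

class PolicyEnforcingAgent:
    def __init__(self, owner, rules, knowledge_base):
        self.owner = owner
        self.rules = rules        # predicates: rule(query, context) -> bool
        self.kb = knowledge_base  # context source: location, calendar, ...

    def handle(self, query):
        context = self.kb.current_context(self.owner)  # hypothetical KB call
        if any(rule(query, context) for rule in self.rules):
            return context["location"]  # disclose
        return None                     # deny (and log for later auditing)
```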
People’s Policies Are Often Varied & Complex
• Users’ willingness to share their location depends on:
  • Who is asking
  • When
  • Where they are
  • What they are doing
  • Who they are with
  • And more…
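To make these dimensions concrete, here is an illustrative rule structure; the field names are our assumptions, not the project’s actual OWL-based rule syntax, and an empty set means “no constraint on that dimension”.

```python
from dataclasses import dataclass, field
from datetime import time

@dataclass
class DisclosureRule:
    requesters: set                               # who may ask
    start: time = time(0, 0)                      # when: start of daily window
    end: time = time(23, 59)                      # when: end of daily window
    locations: set = field(default_factory=set)   # where the owner must be
    activities: set = field(default_factory=set)  # what the owner may be doing
    companions: set = field(default_factory=set)  # who the owner may be with

    def permits(self, query, context):
        return bool(query.requester in self.requesters
                    and self.start <= query.time.time() <= self.end
                    and (not self.locations or context["location"] in self.locations)
                    and (not self.activities or context["activity"] in self.activities)
                    and (not self.companions or context["companions"] & self.companions))

# e.g. "my colleagues can see me on campus during business hours":
# DisclosureRule(requesters={"colleagues"}, start=time(8), end=time(17),
#                locations={"campus"})
```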
People Finder – Defining Rules
Users Are Not Good at Defining Policies
• People Finder application:
  • Lab study with 19 users
  • 30 queries per user
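The accuracy numbers behind this claim (and the “post-hoc accuracy” axis on the next two plots) amount to checking, after the fact, how often a user’s rules decided each query the way the user actually wanted. A hedged sketch of that computation, with illustrative names:

```python
def post_hoc_accuracy(policy_decisions, intended_decisions):
    """Fraction of queries where the rules matched the user's intent.
    Both arguments are equal-length lists of booleans
    (disclose = True, deny = False)."""
    assert len(policy_decisions) == len(intended_decisions)
    matches = sum(p == i for p, i in zip(policy_decisions, intended_decisions))
    return matches / len(policy_decisions)

# A user whose rules got 24 of their 30 queries right scores 0.8.
```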
…and it’s not for lack of trying…
It’s Not Because of the Interface
Only Slight Correlation with # Rules
[Scatter plot: post-hoc accuracy vs. number of rules defined, over a total of 30 requests per user]
Only Slight Correlation with Time Spent
[Scatter plot: post-hoc accuracy vs. time spent defining rules, over a total of 30 requests per user]
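A correlation check of the kind these two plots suggest can be sketched as follows; the slides do not state which statistic was used, so Pearson’s r here is our assumption, and the variable names are placeholders.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# pearson(rules_per_user, accuracy_per_user)          # near 0: only a slight
# pearson(minutes_spent_per_user, accuracy_per_user)  # link either way
```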
Some Users Realize They Can’t Get It Right
• Adoption impediment
Approach
[Architecture diagram: a Policy Support Agent (learning, explanation, meta-control dialog) sits between the user interface and the policy engine(s), which hold the user’s policies and credentials and mediate requests involving other users, pervasive computing environments, organizations (incl. their policies) and resources (incl. their policies); the project focus is the Policy Support Agent and its user interface]
Importance of Feedback – Notifications
• PeopleFinder application
Feedback – Summaries
• IMBuddy application (courtesy: Jason Hong)
IMBuddy Evaluation
• Usefulness of bubble notification: mean rating 1.6 (σ = 0.6)
  • Scale of 1 to 5, where 1 = strongly agree that it was useful, 3 = neutral, 5 = strongly disagree
Explanation Feedback Through Audit Logs
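The slide does not show the log schema, so the fields below are assumptions; the point is that every disclosure decision is recorded along with the context it was made in, and the owner can later mark whether they agree with it.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class AuditEntry:
    requester: str                      # who asked
    time: datetime                      # when they asked
    context: dict                       # owner's location/activity at that time
    disclosed: bool                     # what the policy decided
    user_agrees: Optional[bool] = None  # set when the owner audits the entry
```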
Machine Learning
• Audited logs can be used to refine a user’s policies
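A minimal sketch of that refinement step, reusing the AuditEntry records from the previous sketch; the shallow decision tree is a stand-in learner chosen for illustration, since the slides do not name the algorithm actually used.

```python
from sklearn.tree import DecisionTreeClassifier

def refine_from_audits(entries, encode):
    """entries: AuditEntry records; encode: maps an entry to a numeric
    feature vector (requester group, hour of day, location, ...)."""
    audited = [e for e in entries if e.user_agrees is not None]
    X = [encode(e) for e in audited]
    # The training label is what the policy decided if the user agreed,
    # and the opposite decision if the user disagreed.
    y = [e.disclosed if e.user_agrees else not e.disclosed for e in audited]
    # A shallow tree keeps the learned policy close to human-readable rules.
    return DecisionTreeClassifier(max_depth=3).fit(X, y)
```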
Lab Study
More Recent Pilots – 12 Most Active Target Users
• 3 pilots – total of over 60 participants
• User-defined rules: 79% vs. ML: 91%
• Note: includes benefits of auditing
Ongoing Work
• Learning is a “black box” technology
  • Users are unlikely to understand the policies they end up with
• Can we develop technology that incrementally suggests policy changes to users? (see the sketch below)
  • Tradeoff between rapid convergence and maintaining policies that users can relate to
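One way to read “incrementally suggests policy changes” (our illustration, not the project’s published algorithm): keep the user’s own rule set and propose only the single candidate edit that most improves post-hoc accuracy on the audit log, leaving the user to accept or reject it.

```python
def suggest_one_edit(rules, candidate_edits, audit_log, evaluate):
    """candidate_edits: functions mapping a rule set to a slightly modified
    rule set (add a rule, tighten a time window, drop a requester, ...).
    evaluate: post-hoc accuracy of a rule set against the audit log."""
    baseline = evaluate(rules, audit_log)
    best = max(candidate_edits, key=lambda edit: evaluate(edit(rules), audit_log))
    gain = evaluate(best(rules), audit_log) - baseline
    # Only surface a real improvement; the user decides whether to accept,
    # so the policy stays something they can relate to.
    return (best, gain) if gain > 0 else None
```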
Policy Evolution
Other Promising Approaches
• Visualization techniques
• Explanations & dialogues
Overall Vision
[Scenario diagram: new technology built around policy enforcing engines, spanning policy creation, policy visualization, policy enforcement, policy auditing & refinement, explanation dialogs and learning from the past, unfolding over time. Sample interactions: “My colleagues can see my location on weekdays between 8am and 5pm” (policy creation); Bob’s phone: “Jane and Eric are late for our meeting. Show me where they are!” → “Jane is in Oakland but I can’t access Eric’s location” (enforcement); Eric: “Why couldn’t Bob see where I was?” – “Bob is a colleague. So far only your friends can see where you are” – “What if my colleagues could see my location too?” – “In the past you denied access to your colleague Steve” – “OK, make it just my superiors” (explanation, auditing & refinement)]
Some of the Things We’ve Learned So Far
• Adoption will depend on whether users feel they have adequate control over the disclosure of their contextual information
• People often have rather complex privacy preferences
• People are not good at specifying their policies
• Not easy to identify good default policies beyond just denying all requests
• Policies tend to become more complex as users grow more sophisticated
  • Allowing more requests, but in an increasingly selective way
• Auditing is critical
• Learning, explanation & dialogs appear promising
• Applies to both privacy and security policies
Q&A
• Come & check out our poster this evening
Some References
• User-Controllable Security and Privacy Project: http://www.cs.cmu.edu/%7Esadeh/user_controllable_security_and_privacy.htm
• Norman Sadeh, Fabien Gandon and Oh Byung Kwon, “Ambient Intelligence: The MyCampus Experience”, chapter in “Ambient Intelligence and Pervasive Computing”, Eds. T. Vasilakos and W. Pedrycz, Artech House, 2006. (Also available as Tech. Report CMU-ISRI-05-123, School of Computer Science, Carnegie Mellon University.) http://www.cs.cmu.edu/%7Esadeh/Publications/More%20Complete%20List/Ambient%20Intelligence%20Tech%20Report%20final.pdf
• Jason Cornwell, Ian Fette, Gary Hsieh, Madhu Prabaker, Jinghai Rao, Karen Tang, Kami Vaniea, Lujo Bauer, Lorrie Cranor, Jason Hong, Bruce McLaren, Mike Reiter and Norman Sadeh, “User-Controllable Security and Privacy for Pervasive Computing”, Proceedings of the 8th IEEE Workshop on Mobile Computing Systems and Applications (HotMobile 2007), February 2007. http://www.cs.cmu.edu/%7Esadeh/Publications/More%20Complete%20List/HotMobile2007-user-controllable-security-privacy%20submitted%20FINAL.pdf
• M. Prabaker, J. Rao, I. Fette, P. Kelley, L. Cranor, J. Hong and N. Sadeh, “Understanding and Capturing People’s Privacy Policies in a People Finder Application”, 2007 Ubicomp Workshop on Privacy, Austria, Sept. 2007.
Acknowledgements
• Collaborators:
  • Faculty: L. Bauer, L. Cranor, J. Hong, B. McLaren, M. Reiter, P. Steenkiste
  • Post-docs & students: P. Drielsma, M. Prabaker, J. Rao, I. Fette, P. Kelley, K. Vaniea, R. Reeder, A. Sardinha, J. Albertson, D. Hacker, J. Pincar, M. Weber
• The work presented in these slides is supported in part by NSF Cyber Trust grant CNS-0627513 and ARO research grant DAAD19-02-1-0389 (“Perpetually Available and Secure Information Systems”) to Carnegie Mellon University’s CyLab.