
Privacy Interfaces for Location Sharing: when is too much transparency a bad thing?


Presentation Transcript


  1. Privacy Interfaces for Location Sharing: when is too much transparency a bad thing? Blaine Price

  2. The PRiMMA Team... Bashar Nuseibeh Yvonne Rogers Arosha Bandara Clara Mancini Lukasz Jedrzejczyk Keerthi Thomas Morris Sloman Alessandra Russo Emil Lupu Naranker Dulay Domenico Corapi Ryan Wishart Adam Joinson

  3. Outline • Methods for understanding mobile privacy • Some experiments on Methodology • A Location Tracking User Study with Families • A Privacy Feedback Interface User Study • Some Lessons Learned, Future Directions • Questions/Discussion/Watch Videos...

  4. The Problem With Asking People About Privacy is... • People are bad at judging the future value of their privacy • People are bad at judging how they will react to using new systems they have never used before • But it is (largely) not their fault

  5. New Methods for Studying Mobile Privacy Behaviour • Privacy Practices of Mobile Facebook Users • Experience Sampling with Memory Phrase • Found Mobile Contexts were multi-faceted, including individual perceptions, and different physical/virtual world interconnections • Places, defined by emerging social cultural knowledge, are a major determinant of privacy needs, so effort at defining rules for automation can be based on a location/context pairing

  6. Studying Privacy Behaviour with Future Technologies • People find it difficult to imagine privacy risks in a technology they have never used • Video (and written scenarios) can be a powerful tool that lets participants immerse themselves in unfamiliar tech and experience it vicariously • Corporate concept videos: • Apple Knowledge Navigator (1988) • Microsoft Future Healthcare (2008)

  7. ContraVision • Semi-professional video of a six-scene story, shot twice with subtle differences between versions • In one version, the protagonist has a positive attitude and everything works perfectly • In the alternative version, the protagonist has a negative attitude and the technology has problems • Participants watch one version without being told which one • A wider range of privacy concerns is elicited

  8. User Study 1: Tracking in Families • Why families? Tracking and relationships www.vanityfair.com/culture/features/2009/12/addams-family-200912

  9. Some implicit assumptions close knit tracking is safer news.bbc.co.uk/cbbcnews/hi/newsid_6540000/newsid_6549400/6549495.stm

  10. Assumptions... the tracked is the vulnerable one imgfave.com/search/someone%20deleted%20my%20tag

  11. Assumptions... Enough controls will set you free en.wikipedia.org/wiki/File:STSCPanel.jpg

  12. Study Design • F1: mother & father; daughter 1 & partner; daughter 2 & boyfriend; daughter 3 • F2: mother & father; son & friend; daughter • 3 weeks, 1092 unique tracking events

  13. Interface examples

  14. Conditions and Questions jordanhoffman.com/2007/03/08/das-experiment-2001-oliver-hirschbiegel-b complete visibility run/chase?

  15. Findings www.essex.police.uk/news_features/other_stories/victim_of_a_stalker.aspx “I don’t think it makes me a better person checking up on people…so I try to stay out of it as I possibly can as it’s none of my business, but this technology makes it a little bit too easy, you are only human”

  16. Conditions and Questions tracking tasks uneasy/keen?

  17. Findings delegating responsibility “It was easier to [check on others] when I was asked to do so…I didn’t feel like it was my responsibility…it wasn’t me doing it, I was just carrying out a task”

  18. Conditions and Questions real-time feedback reassured/deterred?

  19. Findings questioning others’ motives “I would prefer not to know, or I would have to start asking myself why they are checking on me…have I done something wrong…are they after something?”

  20. Unexpectedness and destabilisation F1-Boyfriend F1-Daughter2 “I looked him up…he was last tracked four hours ago…I thought oh did he turn his phone off…or did he turn the tracker off and why did he do that… why didn’t he tell me? Who is he with? And why is he there?”

  21. Closeness and control F2-Son F2-Mother “At times I’d rather she didn’t track me…but I wouldn’t use privacy preferences [as] I know that would hurt her feelings”

  22. Breach and predation “I feel because [they] have shown in the past that [they] are not trustworthy, that kind of [they] started it…it means I can check up on [them] and not feel too bad…[they] are unlikely to do what [they] say they do, so anything you see [on the tracker] is probably just going to confirm that”

  23. Conclusions • Close-knit tracking is not safer • Tracking affects both parties • The closer the relationship, the less in control

  24. Inspiration for Study 2 • Pervasive systems allow collection of data about individuals • Increasing proliferation of, and dependence on, these systems • Threats to privacy: users must stay informed and manually manage privacy settings • Impractical! Too many controls • Pervasive systems need automated mechanisms for governing privacy behaviours that: • Reflect user preferences • Adapt to changes in circumstances, context and user behaviours

  25. Goals • How can we learn policies from user behaviours? • Develop automated mechanisms for governing privacy behaviours that: • Reflect user preferences • Adapt to changes in circumstances, context and user behaviours • Use social translucence as the privacy interface mechanism • Example: Privacy Management for Location Based Applications • Share location information with appropriate precision, to the appropriate people, at the appropriate times • Provide awareness of information access
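The “appropriate precision” goal above can be sketched as a simple coordinate-coarsening function. The group names and precision levels below are illustrative assumptions, not the project’s actual policy:

```python
# Sketch: truncate GPS coordinates to a per-audience number of decimal
# places. Roughly, 0 decimals ~ 100 km, 2 ~ 1 km, 4 ~ 10 m at the equator.
# Group names and levels are invented for illustration.
PRECISION_BY_GROUP = {
    "family": 4,      # near-exact location
    "friends": 2,     # neighbourhood level
    "colleagues": 0,  # city/region level
}

def share_location(lat: float, lon: float, group: str) -> tuple[float, float]:
    """Return coordinates rounded to the precision allowed for this group."""
    digits = PRECISION_BY_GROUP.get(group, 0)  # unknown groups get coarsest view
    return round(lat, digits), round(lon, digits)

print(share_location(51.52421, -0.13456, "family"))   # (51.5242, -0.1346)
print(share_location(51.52421, -0.13456, "friends"))  # (51.52, -0.13)
```

The same idea extends beyond coordinates, e.g. replacing a GPS fix with a place name (“at work”) for less trusted audiences.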

  26. Challenge Build a system capable of learning mobile privacy policies enforceable on mobile devices, to reduce user intervention in privacy management. • Requirements: • User should be able to understand and modify the learned rules • Existing structured information and constraints should be used • Rules should be learned incrementally • Learned rules should be revisable (minimally) depending on changing conditions [Diagram: user ↔ agent, with context inputs: phone call, current location, …, current credit]

  27. Scenario: Real-time Feedback for LBS • Inputs: location (GPS, name), co-location, social network, context, constraints • We can learn sets of policy rules: • Concept invention • Recursion • Inter-related concepts • Tailor the search (customised search heuristics) • Use probabilities to handle noise in the data [Diagram: positive (E+) and negative (E−) examples — e.g. “in meeting, Alice requests” (E+), “driving, Bob requests” (E−), “web browsing, Alice requests” (E+) — feed new/revised rules of the form “show location lookup using {mode} if {context}”]

  28. User Study 2: Automatic Privacy Feedback Preference Detection • Surveys, interviews and focus groups to find the range of privacy feedback interfaces: dialog box, toast, notification bar, vibration, LED, sound, flashlight, natural language • New version of Buddy Track pre-seeded with context-sensitive rules on which methods to use • Feedback interface uses experience sampling to teach the engine which policies are correct
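A context-sensitive rule for choosing among the feedback channels listed above might be sketched as follows. The context attributes and the ordering of the rules are invented for illustration; they are not the study’s actual pre-seeded rules:

```python
# Sketch: pick a feedback channel from slide 28's list based on simple
# context attributes. Rule order encodes intrusiveness: quieter channels
# win in sensitive contexts. Context keys are illustrative assumptions.
def pick_feedback(ctx: dict) -> str:
    if ctx.get("in_meeting"):
        return "LED"             # least intrusive while in a meeting
    if ctx.get("silent_mode"):
        return "Vibration"       # audible channels ruled out
    if ctx.get("screen_on"):
        return "Toast"           # user is looking at the phone
    return "Notification Bar"    # default when nothing else applies

print(pick_feedback({"silent_mode": True}))  # Vibration
```

Experience sampling would then ask the user, just after a notification, whether the chosen channel was appropriate, giving the learning engine labelled examples.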

  29. How it works: Example [annotated screenshots, steps 1 and 2]

  30. Scenario: Real-time Feedback for LBS Learning new user behaviour rules… [Timeline diagram, Day 1: location lookups from home at 07:00 and 07:30]
     Context:
       in_group(alice, home).
       in_group(bob, home).
       happens(gps(57,10), 07:00).
       at_location(home, W, N) ← Conditions, ….
       phone_position(in_hand) ← Conditions, ….
     Examples:
       not do(rtf_toast(alice), 07:00).
       do(rtf_toast(bob), 07:30).
       do(rtf_toast(bob), 11:00).
     New policy:
       do(rtf_toast(Call_Id, From), T) ← phone_position(in_hand) ∧ T ≥ 07:30

  31. Scenario: Real-time Feedback for LBS …and revising existing rules incrementally [Timeline diagram, Day 2: location lookups from home and from Imperial, near desktop and near device]
     Context: ………..
     Existing policy:
       do(rtf_toast(Call_Id, From), T) ← phone_position(in_hand) ∧ T ≥ 07:30
     Revised policy:
       do(rtf_toast(Req_Id, From), T) ← phone_position(in_hand) ∧ T ≥ 07:30 ∧ in_group(From, college)
       do(rtf_toast(Req_Id, From), T) ← phone_position(in_hand) ∧ T ≥ 07:30 ∧ ¬holdsAt(location(Imperial), T)

  32. Using privacy policies
     Context: … in_group(charles, college). …
     Privacy policies:
       do(rtf_toast(Call_Id, From), T) ← phone_position(in_hand) ∧ T ≥ 07:30 ∧ in_group(From, college).
       do(rtf_toast(Call_Id, From), T) ← phone_position(in_hand) ∧ T ≥ 07:30 ∧ ¬holdsAt(status(bluetooth_near(desktop)), T).
     Querying rules:
       1) Location request made
       2) do(rtf_toast(charles), T)? YES
       3) Toast notification of location lookup
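The querying step can be sketched by encoding each learned policy as a predicate over the current context and firing the notification if any policy holds. The dictionary keys below are illustrative stand-ins for the slide’s logical predicates:

```python
# Sketch of policy querying: each learned rule becomes a boolean check
# over a context dictionary. Times are zero-padded "HH:MM" strings, so
# lexicographic comparison matches chronological order.
POLICIES = [
    # do(rtf_toast(From), T) <- phone in hand ∧ T >= 07:30 ∧ From in college
    lambda ctx: (ctx["phone_in_hand"]
                 and ctx["time"] >= "07:30"
                 and ctx["requester_group"] == "college"),
    # do(rtf_toast(From), T) <- phone in hand ∧ T >= 07:30 ∧ not near desktop
    lambda ctx: (ctx["phone_in_hand"]
                 and ctx["time"] >= "07:30"
                 and not ctx["bluetooth_near_desktop"]),
]

def should_toast(ctx: dict) -> bool:
    """True if any privacy policy says to show the lookup notification."""
    return any(policy(ctx) for policy in POLICIES)

# charles is in group 'college', so the first policy fires
ctx = {"phone_in_hand": True, "time": "09:15",
       "requester_group": "college", "bluetooth_near_desktop": True}
print(should_toast(ctx))  # True
```

In the actual system this query is answered by a logic engine over the learned rules; the lambda encoding here just makes the any-rule-fires semantics concrete.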

  33. User Study & Findings • 15 participants, loosely knit groups, 3 weeks • 2 phases (learning, evaluation); predefined rules used in phase 1 • Interviews • Increased accuracy (appropriateness of notifications) • Greater trust and comfort • Awareness of notifications contributes towards acceptance of the technology • “Invisibility” of the interface

  34. Summary • State-of-the-art learning system able to learn • Properties about the data, even if not directly observed • Recursive and inter-dependent policies • (Revised) policies from negative and positive examples • Incrementally • Minimal revisions of existing policies • Demonstrated utility of learning for adaptive awareness for privacy management.

  35. Future Directions • Domain-specific heuristics • Apply to other lifelogging domains • Application to (large) real data sets • Scalability and efficiency • Privacy Interface Controls too complicated • Apply learning to data from groups of users to derive ‘default’ privacy configurations • Learn privacy threats and requirements
