
Quantifying Location Privacy: The Case of Sporadic Location Exposure






Presentation Transcript


  1. Quantifying Location Privacy: The Case of Sporadic Location Exposure. Reza Shokri, George Theodorakopoulos, George Danezis, Jean-Pierre Hubaux, Jean-Yves Le Boudec. The 11th Privacy Enhancing Technologies Symposium (PETS), July 2011

  2. [Figure: Mobility generates the Actual Trajectory; the Application yields the Exposed Trajectory; the Protection mechanism produces the Distorted Trajectory, which is the adversary's Observation; the Attack produces a Reconstructed Trajectory; the Metric compares the reconstructed trajectory against the actual one] ● Assume time and location are discrete…

  3. Location-based Services • Sporadic vs. Continuous Location Exposure • Mobility Model: actual location of user ‘u’ at time ‘t’ • Application Model: is the location exposed? (0/1)
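As a concrete illustration of the 0/1 exposure decision, here is a minimal Python sketch of a once-in-a-while application, assuming each event is exposed independently with probability θ (written `theta`); the function name and the trajectory encoding are illustrative, not the paper's code.

```python
import random

def once_in_a_while_exposure(trajectory, theta, rng=random):
    # For each (time, region) event, expose the actual location independently
    # with probability theta; otherwise nothing is exposed (the 0/1 decision).
    return [(t, loc) if rng.random() < theta else (t, None)
            for (t, loc) in trajectory]

# Toy trajectory of user 'u' over 5 discrete time instants in regions 0..39
actual = [(0, 12), (1, 12), (2, 7), (3, 30), (4, 30)]
exposed = once_in_a_while_exposure(actual, theta=0.3)
print(exposed)
```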

  4. Protection Mechanisms [Figure: Actual Trajectory of user u_i → Application exposes the Actual Location → Protection Mechanism may hide or obfuscate the exposed location, inject fake locations, and anonymize the user identity → Observed Location] ● Consider a given user at a given time instant

  5. Protection Mechanisms • Model: observed location of pseudonymous user u’ at time t, together with the user-to-pseudonym assignment ● User pseudonyms stay unchanged over time…
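To make the pseudonym model concrete, a minimal sketch of a user-to-pseudonym assignment, assuming the assignment is a random one-to-one mapping drawn once and kept fixed for the whole observation period; user names and pseudonym labels are illustrative only.

```python
import random

def assign_pseudonyms(users, rng=random):
    # Draw a random one-to-one assignment of users to pseudonyms; it is
    # drawn once and stays fixed over time, matching the slide's assumption.
    pseudonyms = list(range(1, len(users) + 1))  # hypothetical labels 1..N
    rng.shuffle(pseudonyms)
    return dict(zip(users, pseudonyms))

sigma = assign_pseudonyms(["alice", "bob", "carol"])
# The adversary only ever sees events tagged with sigma[user], never the identity.
print(sigma)
```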

  6. Adversary • Background Knowledge • Stronger: Users’ transition probability between locations • Markov Chain transition probability matrix • Weaker: Users’ location distribution over space • Stationary distribution of the ‘transition probability matrix’ ● The adversary also knows the PDFs associated with the ‘application’ and the ‘protection mechanism’
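The ‘weaker’ knowledge is derived from the ‘stronger’ one; a minimal sketch of that step, assuming an ergodic per-user transition matrix so the stationary distribution can be obtained by power iteration (`numpy` is used for convenience, and the 3-region matrix is a toy example).

```python
import numpy as np

def stationary_distribution(P, iters=1000):
    # pi is the stationary distribution of the row-stochastic matrix P
    # (pi @ P == pi); power iteration converges for an ergodic chain.
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        pi = pi @ P
    return pi / pi.sum()

# Toy 3-region mobility profile (rows sum to 1)
P = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
print(stationary_distribution(P))
```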

  7. Adversary • Localization Attack • What is the probability that Alice is at a given location at a specific time instant? (given the observation and the adversary’s background knowledge) • Bayesian inference on a Hidden Markov Model • Forward-Backward algorithm, maximum weight assignment ● Find the details of the attack in the paper
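A minimal sketch of the Bayesian localization step, assuming the emission likelihoods have already been derived from the application and protection-mechanism PDFs; only the forward-backward posterior is shown here, not the maximum weight assignment used for de-anonymization (see the paper for the full attack).

```python
import numpy as np

def localization_posterior(P, pi0, emission):
    # Hidden states are regions; P is the user's mobility profile (transition
    # matrix), pi0 the prior over the initial region, and emission[t, r] the
    # likelihood Pr(observation at time t | user in region r), which folds
    # together the application and LPPM distributions.
    T, R = emission.shape
    alpha = np.zeros((T, R))          # forward pass
    alpha[0] = pi0 * emission[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ P) * emission[t]
    beta = np.ones((T, R))            # backward pass
    for t in range(T - 2, -1, -1):
        beta[t] = P @ (emission[t + 1] * beta[t + 1])
    posterior = alpha * beta          # Pr(region r at time t | all observations)
    return posterior / posterior.sum(axis=1, keepdims=True)
```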

  8. Location Privacy Metric • Anonymity? • How successfully can the adversary link the user pseudonyms to their identities? • Metric: The percentage of correct assignments • Location Privacy? • How correctly can the adversary localize the users? • Metric: Expected Estimation Error (Distortion) ● Justification: R. Shokri, G. Theodorakopoulos, J-Y. Le Boudec, J-P. Hubaux. ‘Quantifying Location Privacy’. IEEE S&P 2011
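A minimal sketch of the two metrics, assuming the adversary outputs a user-to-pseudonym assignment and, per time instant, a posterior over regions; the distance function stands for whatever distortion measure is chosen and is purely illustrative.

```python
def anonymity(true_assignment, adversary_assignment):
    # Fraction of users whose pseudonym the adversary maps back correctly.
    users = list(true_assignment)
    correct = sum(true_assignment[u] == adversary_assignment.get(u) for u in users)
    return correct / len(users)

def expected_estimation_error(posterior_t, actual_region, distance):
    # Location privacy at one time instant: the posterior-weighted distance
    # (distortion) between the adversary's estimate and the actual region.
    return sum(p * distance(r, actual_region) for r, p in enumerate(posterior_t))

# Example with a hypothetical 0/1 distance (probability of error as distortion)
print(expected_estimation_error([0.1, 0.7, 0.2], 1, lambda r, s: float(r != s)))
```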

  9. Evaluation • Location-Privacy Meter • Input: Actual Traces • Vehicular traces in San Francisco, 20 mobile users moving in 40 regions • Output: ‘Anonymity’ and ‘Location Privacy’ of users over time • Modules: associated PDFs of the ‘Location-based Application’ and the ‘Location-Privacy Preserving Mechanisms’ ● More information here: http://lca.epfl.ch/projects/quantifyingprivacy

  10. Evaluation • Location-based Applications • once-in-a-while APP(o, Θ) • local search APP(s, Θ) • Location-Privacy Preserving Mechanisms • fake-location injection (with rate φ) • (u) Uniform selection • (g) Selection according to the average mobility profile • location obfuscation (with parameter ρ) • ρ: The number of removed low-order bits from the location identifier LPPM(φ, ρ, {u,g})
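A minimal sketch of the two mechanisms, assuming φ is the per-event probability of injecting a fake location and ρ is the number of dropped low-order bits as on the slide; the function names and the integer region encoding are illustrative.

```python
import random

def obfuscate(region_id, rho):
    # Drop the rho low-order bits of the location identifier, so only the
    # coarser block of 2**rho regions is revealed.
    return (region_id >> rho) << rho

def fake_location(regions, phi, avg_profile=None, rng=random):
    # With probability phi, inject a fake location drawn either uniformly (u)
    # or from the users' average mobility profile (g); otherwise inject nothing.
    if rng.random() >= phi:
        return None
    if avg_profile is None:                                      # (u) uniform
        return rng.choice(regions)
    return rng.choices(regions, weights=avg_profile, k=1)[0]     # (g) profile-based

regions = list(range(40))
print(obfuscate(27, rho=2), fake_location(regions, phi=0.3))
```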

  11. Results - Anonymity

  12. Results – Location Privacy [Figure: location privacy as a function of φ, the fake-location injection rate]

  13. More Results – Location Privacy [Figure: location privacy under seven LPPM configurations (obfuscation ρ, fake injection φ with uniform selection, hiding rate): (0, 0.0, 0.0), (2, 0.0, 0.0), (4, 0.0, 0.0), (0, 0.3, 0.0), (0, 0.5, 0.0), (0, 0.0, 0.3), (0, 0.0, 0.5)]

  14. Conclusions & Future Work • The effectiveness of ‘Location-Privacy Preserving Mechanisms’ cannot be evaluated independently of the ‘Location-based Application’ used by the users • The fake-location injection technique is very effective for ‘sporadic location exposure’ applications • Advantage: no loss of quality of service • Drawback: more traffic exchange • The ‘Location-Privacy Meter’ tool has been enhanced to model applications as well as new protection mechanisms, notably fake-location injection • Changing pseudonyms over time: to be added to our probabilistic framework

  15. Location-Privacy Meter (LPM): A Tool to Quantify Location Privacy http://lca.epfl.ch/projects/quantifyingprivacy
