
Feeling-based Location Privacy Protection for Location-based Services


Presentation Transcript


  1. CS587x Lecture Department of Computer Science Iowa State University Ames, IA 50011 Feeling-based Location Privacy Protection for Location-based Services

  2. Location-based Services

  3. Dilemma • Users have to report their locations to LBS providers • LBS providers may abuse the collected location data

  4. Location Exposure Presents Significant Threats • Threat 1: Anonymity of service use • A user may not want to be identified as the subscriber • E.g., where is the nearest … • Threat 2: Location privacy • A user may not want to reveal where she is • E.g., a query is sent from …

  5. RESTRICTED SPACE IDENTIFICATION • A user’s location can be correlated to her identity • E.g., a location belonging to a private property indicates the user is most likely the property owner • A single location sample may not be linked to an individual, but a time-series sequence of samples can identify her • Once the user is identified • All her visits may be disclosed

  6. Location Depersonalization • Protect anonymous use of service • Cloak the service user with her neighbors • Location privacy may still leak • Protect location privacy • Cloak the service user with nearby footprints • Adversary cannot know who’s there when the service is requested

  7. Motivation • Privacy modeling • Users specify their desired privacy with a number K • Privacy is about personal feeling, and it is difficult for users to choose a K value • Robustness • Just ensuring each cloaking region has been visited by K people may NOT provide protection at level K • It also depends on the footprint distribution

  8. OUR SOLUTION • Feeling-based modeling • A user specifies a public region • A spatial region the user would feel comfortable having reported as her location, should she request a service inside it • The public region becomes her privacy requirement • All locations reported on her behalf will be at least as popular as the public region she identifies

  9. Challenge • How to measure the privacy level of a region? • The privacy level is determined by • Number of visitors • Footprint distribution • A good measure should involve both factors

  10. Entropy • We borrow the concept of entropy • Entropy of R is computed using the number of footprints in R belonging to different users • E(R) = -Σᵢ (nᵢ/N) log₂(nᵢ/N), where nᵢ is the number of footprints in R left by user i and N is the total number of footprints in R • Its value denotes the amount of information the adversary needs to identify the client

  11. Popularity • Popularity of R is P(R) = 2^E(R) • Its value denotes the number of users among whom the client is indistinguishable • Popularity is a good measure of privacy • More visitors – higher popularity • More even distribution – higher popularity
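
The entropy and popularity measures on the last two slides are easy to make concrete. A minimal Python sketch, assuming only that the footprints inside R are given as a flat list of their owners' user IDs (the function names are mine, not the paper's):

```python
from collections import Counter
from math import log2

def entropy(footprint_owners):
    """E(R): entropy of region R, given the owner ID of every footprint in R."""
    counts = Counter(footprint_owners)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

def popularity(footprint_owners):
    """P(R) = 2^E(R): the number of users among whom the client is hidden."""
    return 2 ** entropy(footprint_owners)

# Four visitors with an even footprint split give popularity 4.0 ...
print(popularity(["u1"] * 5 + ["u2"] * 5 + ["u3"] * 5 + ["u4"] * 5))  # 4.0
# ... while a skewed split lowers it, even with the same four visitors.
print(popularity(["u1"] * 17 + ["u2", "u3", "u4"]))                   # ~1.8
```

The second call illustrates the point on the slide: both the number of visitors and how evenly their footprints are distributed drive the popularity.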

  12. Location Cloaking with Our Privacy Model • Sporadic LBSs • Each location update is independent • Cloaking strategy: ensure each reported location is a region whose popularity is no less than P(R) • Continuous LBSs • A sequence of location updates forms a trajectory • The strategy for sporadic LBSs may not work • The adversary may identify the common set of visitors across the cloaking regions

  13. P-Populous Trajectory • We should compute the popularity of cloaking boxes with respect to a common user set, called the cloaking set • Only the footprints of users in the cloaking set are considered in the entropy computation • Entropy w.r.t. cloaking set U is E_U(R) = -Σ_{i∈U} (nᵢ/N_U) log₂(nᵢ/N_U), where N_U counts only the footprints in R left by users in U • Popularity w.r.t. U is P_U(R) = 2^E_U(R) • P-Populous Trajectory (PPT): the popularity of each cloaking box in the trajectory w.r.t. a common cloaking set is no less than P(R)
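
A direct way to check the PPT property is to recompute popularity restricted to one fixed cloaking set U for every box in the trajectory. A sketch under the assumption that each cloaking box is represented by the owner IDs of the footprints falling inside it (names and data layout are my own):

```python
from collections import Counter
from math import log2

def popularity_wrt(footprint_owners, cloaking_set):
    """P_U(R): popularity of a region counting only footprints of users in U."""
    counts = Counter(u for u in footprint_owners if u in cloaking_set)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    e_u = -sum((n / total) * log2(n / total) for n in counts.values())
    return 2 ** e_u

def is_p_populous(boxes_owner_lists, cloaking_set, required_popularity):
    """True if every cloaking box meets the requirement w.r.t. the same set U."""
    return all(popularity_wrt(owners, cloaking_set) >= required_popularity
               for owners in boxes_owner_lists)
```

Using one common set U is what blocks the intersection attack from the previous slide: the adversary cannot shrink the candidate set by intersecting the visitors of consecutive boxes.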

  14. System Structure

  15. Footprint Indexing • Grid-based pyramid structure • 4^(i-1) cells at level i • Cells at the bottom level keep the footprint index
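
The pyramid can be pictured as a stack of uniform grids: level i splits the space into a 2^(i-1) × 2^(i-1) grid, hence 4^(i-1) cells. A minimal sketch of locating a point's cell at a given level (the coordinate normalization and function name are my assumptions; only the bottom-level cells would hold the actual footprint lists):

```python
def cell_at_level(x, y, level, extent=1.0):
    """Return the (row, col) of the level-`level` cell containing (x, y).
    Level i has 2^(i-1) cells per side, i.e. 4^(i-1) cells in total.
    Coordinates are assumed normalized to [0, extent)."""
    cells_per_side = 2 ** (level - 1)
    col = min(int(x / extent * cells_per_side), cells_per_side - 1)
    row = min(int(y / extent * cells_per_side), cells_per_side - 1)
    return row, col

# Bottom-level cells would then map their (row, col) id to the footprints they contain.
```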

  16. Trajectory Cloaking • To receive an LBS, a client needs to submit • Public region R • Travel bound B • Location updates, repeatedly during her travel • In response, the server will • Generate a cloaking box for each location update • Ensure the sequence of cloaking boxes forms a PPT
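
A rough sketch of the per-client session state the anonymization server might keep, just to make this request flow concrete. The three callables passed in are placeholders standing in for the components described on the surrounding slides, not an interface from the paper:

```python
class CloakingSession:
    """Server-side state for one client: fixed popularity requirement P(R),
    one cloaking set U for the whole travel, one cloaking box per update."""

    def __init__(self, public_region_popularity, choose_cloaking_set,
                 cloak_point, travel_bound):
        self.required_popularity = public_region_popularity      # P(R)
        self.cloaking_set = choose_cloaking_set(travel_bound)    # common user set U
        self.cloak_point = cloak_point
        self.trajectory = []                                      # cloaking boxes so far

    def on_location_update(self, point):
        # Every box is cloaked against the same set U, so the sequence of
        # boxes stays a P-populous trajectory (PPT).
        box = self.cloak_point(point, self.cloaking_set, self.required_popularity)
        self.trajectory.append(box)
        return box   # this box, not the raw point, goes to the LBS provider
```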

  17. Challenge • How to find the cloaking set? • Basic solution: find the users who have footprints closest to the service user • The cloaking resolution becomes worse as the user travels • There may exist another cloaking set which leads to a finer average resolution

  18. SELECTING CLOAKING SET • Observation • Popular user: one whose footprints span the entire travel bound B • Cloaking with popular users tends to yield a fine cloaking resolution • It is easy to find footprints of theirs close to the service user no matter where she moves • Idea • Use the most popular users as the cloaking set

  19. FINDING MOST POPULAR USERS • l-popular: the user has visited all cells at level l overlapping with B • Larger l: more popular user • E.g. • u1, u2, u3: 2-popular • u2, u3: 3-popular • u3: 4-popular • Strategy: sort users by the level l, and choose the most popular ones as the cloaking set
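
One way to realize this ranking, assuming the footprint index can report, per pyramid level, the set of cell ids overlapping B and the set of those cells each user has visited (the data layout and function names below are my assumptions):

```python
def popularity_level(user_cells, bound_cells, max_level):
    """Largest l such that the user has visited every level-l cell overlapping
    the travel bound B, i.e. the user is l-popular. Both arguments map a
    pyramid level to a set of cell ids."""
    for l in range(max_level, 0, -1):
        if bound_cells[l] <= user_cells.get(l, set()):
            return l
    return 0

def choose_cloaking_set(users_cells, bound_cells, max_level, k):
    """Rank users by popularity level and keep the k most popular ones."""
    ranked = sorted(users_cells,
                    key=lambda u: popularity_level(users_cells[u], bound_cells, max_level),
                    reverse=True)
    return set(ranked[:k])
```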

  20. Cloaking Client’s Location • Let S be the cloaking set and p be the client’s location; we cloak p in three steps • 1. Find the footprint closest to p for each user in S • 2. Compute the minimal bounding box of these footprints, say b • 3. Calculate P_S(b) • If P_S(b) < P(R), for each user find her closest footprint to p among her footprints outside b, and go to step 2 • If P_S(b) ≥ P(R), b is reported as the client’s location
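
A minimal sketch of this loop, assuming footprints[u] is a list of (x, y) points for user u. Two details are my simplifications rather than the slide's exact procedure: the bounding box also includes the client's location p, and footprints already used are not reconsidered, which guarantees termination.

```python
from collections import Counter
from math import dist, log2

def popularity_in_box(box, cloaking_set, footprints):
    """P_S(b): popularity of box b counting only footprints of users in S."""
    x1, y1, x2, y2 = box
    owners = [u for u in cloaking_set for (x, y) in footprints[u]
              if x1 <= x <= x2 and y1 <= y <= y2]
    counts = Counter(owners)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    e = -sum((n / total) * log2(n / total) for n in counts.values())
    return 2 ** e

def cloak_location(p, cloaking_set, footprints, required_popularity):
    """Grow a bounding box around p until its popularity w.r.t. S reaches P(R)."""
    # Step 1: each user's footprints sorted by distance to p; take the closest.
    remaining = {u: sorted(footprints[u], key=lambda f: dist(f, p))
                 for u in cloaking_set}
    current = {u: remaining[u].pop(0) for u in cloaking_set}
    while True:
        # Step 2: minimal bounding box of the chosen footprints (and p itself).
        pts = list(current.values()) + [p]
        box = (min(x for x, _ in pts), min(y for _, y in pts),
               max(x for x, _ in pts), max(y for _, y in pts))
        # Step 3: report the box once it is popular enough.
        if popularity_in_box(box, cloaking_set, footprints) >= required_popularity:
            return box
        # Otherwise switch each user to her closest unused footprint outside b.
        x1, y1, x2, y2 = box
        grew = False
        for u in cloaking_set:
            for i, f in enumerate(remaining[u]):
                if not (x1 <= f[0] <= x2 and y1 <= f[1] <= y2):
                    current[u] = remaining[u].pop(i)
                    grew = True
                    break
        if not grew:
            return None   # no footprints left to add; requirement cannot be met
```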

  21. Simulation • We implement two other strategies for comparison • Naive cloaks each location independently • Plain selects the cloaking set by finding footprints closest to the service user’s start position • Performance metrics • Cloaking area • Protection level

  22. Experiment • Location privacy aware gateway (LPAG) • A prototype that integrates location privacy protection into a real LBS system • Two software components • LBS system: Spatial messaging

  23. Conclusion • Feeling-based privacy modeling for location privacy protection in LBSs • Public region instead of K value • Trajectory cloaking • Algorithm, simulation, experiment • Future work • Investigate attacks other than restricted space identification • Observation implication attack
