Anonymizing User Location and Profile Information for Privacy-aware Mobile Services Masanori Mano, Yoshiharu Ishikawa Nagoya University
Outline • Background & Motivation • Related Work • System Framework • Matching Degree • Algorithm • Experimental Evaluation • Conclusions and Future work
Location-Based Services (LBSs) Where is the nearest café?
Profile-Based LBSs • LBSs typically utilize user locations and map information • Finding nearby restaurants • Presenting a map around the user • Computing the best route to the destination • Use of user profiles (users' properties) can improve the quality of service • Property- and location-based services • Application areas • Mobile shopping • Mobile advertisements
Example: Mobile Advertisements • Provides local ads to mobile users • Example: announcements of time-limited sales at nearby shops • Use of user profiles • Properties: age, sex, address, marital status, etc. • Sends selected ads to the appropriate users • Example: {sex: F, age: 28, has_kids: yes} • Cosmetics for women: good • Computers: maybe • Cosmetics for men: bad • Toys for kids: good
Example: Mobile Advertisements • Alice comes to a shopping mall
Example: Mobile Advertisements • Alice requests ads
Example: Mobile Advertisements • The anonymizer constructs a cloaked region and sends the request with Alice's properties (sex: F, age: 28, …)
Example: Mobile Advertisements • The ads provider returns ads selected for Alice
Example: Mobile Advertisements • However, Alice is the only female within the cloaked region
Example: Mobile Advertisements • If an adversary obtains information about the region (e.g., security camera footage), he can identify the target user
Example • With this anonymization, the adversary can't identify the user
Related Work (1) • Techniques for location anonymity are classified into two extreme types [Ling Liu, 2009] • Anonymous location services: consider only user locations • Identity-driven location services: also consider user identities • Our method lies between the two extremes and additionally considers user properties • Another dimension of anonymity
Related Work (2) • k-anonymity is the most popular approach among proposals for location anonymity • A user's location is indistinguishable from the locations of at least k − 1 other users • Our approach is also based on the concept of k-anonymity • Extended by considering user properties
Related Work (3) • Various approaches to anonymous location services • Casper [Mokbel+06]: the anonymizer utilizes a grid-based pyramid data structure similar to a quadtree • PrivacyGrid [Bamba+08]: computes cloaked regions by dynamic cell expansion • XStar [Wang+09]: targets automobiles on road networks
System Architecture (1) • A service called the Matchmaker sits between users and ads providers • Roles of the Matchmaker • Maintains user & ad profiles • Matchmaking: recommends suitable ads for a given ad request • Anonymization of locations and user properties
System Architecture (2) • The Matchmaker is a trusted third-party server • Given an ad request, the Matchmaker sends an anonymized request to ads providers • Uses the user's profile/location and the ad profiles • Even if some providers are untrusted, the user's privacy is protected • Data flow: User → (raw data, trusted route) → Matchmaker → (anonymized data) → Ads provider
User Profile • Represents the user's properties • k : minimum population • A cloaked region should contain at least k users • l : minimum length • Minimum length of each side of a (square) cloaked region • s : distance threshold • The user wants ads within this distance • Additional attributes (e.g., age and sex) • Value ranges are specified • A data-structure sketch follows below
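One way to carry this profile in code is sketched below; this is a minimal illustration, and the class name, position fields, and interval encoding of the extra attributes are our assumptions, not the paper's data model:

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    x: float                  # current position (assumed fields)
    y: float
    k: int                    # minimum population of the cloaked region
    l: float                  # minimum side length of the (square) region
    s: float                  # distance threshold for wanted ads
    attrs: dict = field(default_factory=dict)  # attribute -> (lo, hi) range

# Alice from the running example; the concrete numbers are made up
alice = UserProfile(x=10.0, y=20.0, k=3, l=100.0, s=500.0,
                    attrs={"age": (25, 30)})
```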
Advertisement Profile • Represents the properties of each advertisement • An advertisement that satisfies both of the following conditions should be sent • The ad area overlaps with the user's requesting area • Other properties (age and sex) match (overlap) the user's properties • A code sketch of this check follows below
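These two conditions can be tested with a short sketch like the one below; it reuses the UserProfile above and assumes a circular ad area of a given radius, which is our simplification:

```python
def intervals_overlap(a, b):
    """True if closed intervals a = (lo, hi) and b = (lo, hi) intersect."""
    return a[0] <= b[1] and b[0] <= a[1]

def ad_should_be_sent(user, ad_center, ad_radius, ad_attrs):
    """Send the ad iff its area overlaps the user's requesting area
    (radius s) and every attribute range it specifies overlaps the
    user's corresponding range."""
    dx, dy = user.x - ad_center[0], user.y - ad_center[1]
    areas_overlap = dx * dx + dy * dy <= (user.s + ad_radius) ** 2
    attrs_match = all(intervals_overlap(user.attrs[a], ad_attrs[a])
                      for a in ad_attrs if a in user.attrs)
    return areas_overlap and attrs_match
```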
Motivation: Bad Anonymization • The cloaked region contains aged/young and male/female users • The properties of the region are vague • Anonymized request: age: young to aged, sex: * (all) • The ads provider has a cosmetics ad for women • The ads provider may ask: is it worthwhile to send the ad?
Motivating Example: Good Anonymization • A good anonymization is one where the users in the cloaked region have properties similar to the target user's • Matching degree is introduced as this similarity measure • Bad anonymization: different sex, different age • Good anonymization: similar sex and age
Matching Degree • A matching degree is computed from the overlapped area of attribute values (of the target user and another user) • Range: [0, 1] • Treated as if it were a probability value • Defined for both spatial attributes and interval attributes (see the formulas below)
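Reconstructing from the worked example on the next slide (the slide itself only names the overlapped area, so normalizing by the target user's range is our inference), the definitions can be written as:

```latex
% Interval attribute a: overlap length over the target user's range length
\mathrm{match}_a(t, u) = \frac{\lvert I_a(t) \cap I_a(u) \rvert}{\lvert I_a(t) \rvert}

% Spatial attributes: overlap area over the target user's region area
\mathrm{match}_{xy}(t, u) = \frac{\mathrm{area}\bigl(R(t) \cap R(u)\bigr)}{\mathrm{area}\bigl(R(t)\bigr)}
```

Here I_a(t) is the target user's value range for attribute a and I_a(u) the other user's; the asymmetric normalization explains why the degrees on the next slide differ depending on who is the target.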
Matching Degree • Example comparing attribute ranges • Target user Dave, compared user Alice: match = 0.0 • Target user Bob, compared user Alice: match = 1.0 • Target user Alice, compared user Bob: match = 0.5 • Note that the degree is asymmetric in target and compared user
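The three values can be reproduced with the interval formula; the concrete age ranges below are hypothetical, chosen only so that the arithmetic matches the slide's numbers:

```python
def interval_match(target_rng, other_rng):
    """Matching degree for one interval attribute: overlap length
    normalized by the target user's range length."""
    overlap = min(target_rng[1], other_rng[1]) - max(target_rng[0], other_rng[0])
    return max(0.0, overlap) / (target_rng[1] - target_rng[0])

alice_age = (20, 30)   # hypothetical ranges
bob_age   = (25, 30)   # contained in Alice's range, half its length
dave_age  = (45, 55)   # disjoint from Alice's range

print(interval_match(dave_age, alice_age))   # 0.0  (target: Dave)
print(interval_match(bob_age, alice_age))    # 1.0  (target: Bob)
print(interval_match(alice_age, bob_age))    # 0.5  (target: Alice)
```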
Anonymity Conditions • The cloaked region contains the target user • The region contains at least k − 1 other users • The length of each side of the region is longer than l • The matching degrees between the target user and the k − 1 users are above a certain threshold • A predicate sketch follows below
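As a predicate, the four conditions might look like the sketch below; it reuses interval_match and UserProfile from earlier, and combining per-attribute degrees by product (motivated by the slide's probability interpretation) is our assumption:

```python
def matching_degree(target, other):
    """Combine per-attribute interval degrees by product, treating
    them as independent probabilities (an assumption)."""
    deg = 1.0
    for a, rng in target.attrs.items():
        deg *= interval_match(rng, other.attrs[a])
    return deg

def is_valid_cloak(region, target, users, threshold):
    """region = (x_lo, y_lo, x_hi, y_hi); users includes the target."""
    x_lo, y_lo, x_hi, y_hi = region
    def inside(u):
        return x_lo <= u.x <= x_hi and y_lo <= u.y <= y_hi
    if not inside(target):                                # condition 1
        return False
    if min(x_hi - x_lo, y_hi - y_lo) < target.l:          # condition 3
        return False
    similar = sum(1 for u in users if u is not target and inside(u)
                  and matching_degree(target, u) >= threshold)
    return similar >= target.k - 1                        # conditions 2 & 4
```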
Anonymization Process • Consider a rectangular region centered at the target user • Randomly select one user within the region as a seed • Compute a rectangle around the seed • If the rectangle contains at least k users with good matching degrees, the anonymization is complete • A code sketch of this loop follows below
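One plausible rendering of this loop in code is shown below; the slide does not specify how the rectangle around the seed is constructed, so the square that just covers both the seed and the target (and respects the minimum side length l) is our guess:

```python
import random

def anonymize(target, users, threshold, max_tries=20):
    """Returns a cloaked region satisfying the anonymity conditions,
    or None if none is found within max_tries seed choices."""
    half = target.l / 2
    # users inside the initial rectangle centered at the target
    candidates = [u for u in users if u is not target
                  and abs(u.x - target.x) <= half
                  and abs(u.y - target.y) <= half]
    for _ in range(max_tries):
        if not candidates:
            return None
        seed = random.choice(candidates)
        # square around the seed, large enough to cover the target
        r = max(target.l / 2,
                abs(seed.x - target.x), abs(seed.y - target.y))
        region = (seed.x - r, seed.y - r, seed.x + r, seed.y + r)
        if is_valid_cloak(region, target, users, threshold):
            return region
    return None
```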
Anonymization Example • Alice requests ads • k = 3 • Threshold for matching degree = 0.5 • Nearby users: Joe, Mike, Kent, Dave, and Mary
Anonymization Example • Alice is a young woman: match = 1.0 • Mary is also a young woman: match = 1.0 • Kent is a young man: match = 0.5 • Joe is an aged man: match = 0.0 • Dave and Mike are middle-aged men: match = 0.2
Anonymization Example • A region centered at Alice contains Kent and Mike • We assume that Kent is selected as the seed user
Anonymization Example • Compute a region around Kent • Check whether the anonymization is appropriate
Anonymization Example • The cloaked region contains three users (Alice, Kent, and Mary) with good matching degrees • All three are young people, so the adversary learns only that the target user is a young person • The adversary can't identify the target user • This is a good anonymization
Experimental Evaluation • Experimental settings: 2.8 GHz CPU, 512 MB RAM, Linux • Evaluation on synthetic data
Threshold Values and Success Rates • The Matchmaker specifies a threshold value for the matching degree • Goal: find an appropriate threshold • An anonymization is successful if the region contains at least k users with good matching degrees (i.e., ≥ the threshold) • The success rate is sensitive to the population • The threshold needs to be adjusted flexibly
Computation Time • We compare the computation times of two approaches • One computes matching degrees • The other does not compute matching degrees and only considers the number of users • Computing matching degrees takes more than twice as long • We plan to improve the matching-degree computation algorithm
Conclusions and Future Work Conclusions • Proposed an anonymization approach for LBSs • Utilizes user profiles to specify users' properties and anonymization preferences • Property-aware anonymization using matching degrees Future work • More experimental evaluation • Improving the algorithm