Protecting the Privacy of Observable Behavior in Distributed Recommender Systems
Douglas W. Oard, University of Maryland, oard@umd.edu
Anton Leuski, USC-ISI, leuski@isi.edu
Stuart Stubblebine, Stubblebine Research Labs, stuart@stubblebine.com
SIGIR Implicit Workshop
Takeaway Points
• Protecting privacy could unlock behavior-based evidence about the utility of information objects
• A framework for thinking about the behaviors we might hope to observe
• Some initial thoughts on how to architect privacy protection into recommender systems
(Diagram: information objects are matched against a user profile to produce recommendations; ratings and observations feed back into the user profile.)
Motivations to Provide Ratings
• Self-interest: use the ratings to improve the system's user model
• Economic benefit: if a market for ratings is created
• Altruism
The Problem with Self-Interest
(Graph: marginal value to the community, marginal cost, and marginal value to the rater, plotted against the number of ratings, from few to lots.)
Solution: Reduce the Marginal Cost
(Graph: the same curves as the previous slide, with the marginal cost curve lowered.)
(Table: observable behaviors, organized by behavior category and minimum scope.)
Recommending with Implicit Feedback
(Diagrams: two parallel architectures. In the first, the user estimates ratings from personal observations via a user model and shares ratings with a ratings server, which returns predicted ratings computed from community ratings. In the second, the user shares observations with an observations server; predicted observations computed from community observations are passed through the user model to estimate predicted ratings.)
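The "estimate rating" step in both data flows can be sketched as a mapping from observed behaviors to a rating estimate. The specific heuristics below (dwell time, saving an item) are hypothetical, chosen only to illustrate the idea; the paper does not specify a mapping.

```python
# Hypothetical sketch: mapping observed behaviors for one item to an
# estimated rating on a 1-5 scale. The thresholds are illustrative.

def estimate_rating(dwell_seconds: float, saved: bool) -> float:
    """Estimate a 1-5 rating from dwell time and retention behavior."""
    rating = 1.0 + min(dwell_seconds / 30.0, 3.0)  # longer reading -> higher
    if saved:                                      # retention is strong evidence
        rating += 1.0
    return min(rating, 5.0)

print(estimate_rating(45.0, True))  # 3.5
```

A real user model would be learned per user, so that the same observed behavior can map to different rating estimates for different people.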
Gaining Access to Observations
• Observe public behavior: hypertext linking, publication, citing, …
• Policy protection: EU privacy laws; US privacy policies + FTC enforcement
• Architectural assurance of privacy: distributed architecture; model and mitigate privacy risks
A More Secure Data Flow
(Diagram: behaviors (I×B) are mapped to personal features (I×F) on the user's side; personal features are aggregated into community features (I×F), which drive recommendations (I×R). I = item, B = behavior, F = feature, R = recommendation.)
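This flow can be sketched as follows: raw behaviors never leave the user's machine; only derived features are shared and summed into the community matrix. The matrix names follow the slide, but the behavior-to-feature mapping and the dict-based layout are illustrative assumptions.

```python
# Sketch of the more secure data flow: I×B stays local, only I×F is shared.

def behaviors_to_features(ixb):
    """Map each item's raw behavior counts (I×B) to a personal feature (I×F).
    Here the feature is just the total behavior count, as a placeholder."""
    return {item: [sum(counts)] for item, counts in ixb.items()}

def aggregate(feature_matrices):
    """Community I×F: element-wise sum of the users' personal I×F matrices."""
    community = {}
    for ixf in feature_matrices:
        for item, feats in ixf.items():
            acc = community.setdefault(item, [0] * len(feats))
            for i, f in enumerate(feats):
                acc[i] += f
    return community

alice = behaviors_to_features({"doc1": [2, 1], "doc2": [0, 3]})
bob = behaviors_to_features({"doc1": [1, 0]})
print(aggregate([alice, bob]))  # {'doc1': [4], 'doc2': [3]}
```

The aggregation step is exactly where the attacks on the following slides apply: the community matrix is public, so it must not leak any single user's contribution.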
Low Entropy Attack
(Diagram: an adversary combines the community features (I×F) with side information to infer I×B for user U.)
Solution space:
• Read access to I×F requires a minimum number of unique contributors
• Cryptographic data structure support
• Controlled mixing
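The first mitigation can be sketched as a release gate: the community matrix is readable only once enough distinct users have contributed, so no single user's behavior can be read off directly. The threshold value and data layout below are assumptions for illustration.

```python
# Sketch of a minimum-contributor gate on the community I×F matrix.
# MIN_CONTRIBUTORS is an illustrative threshold, not from the paper.

MIN_CONTRIBUTORS = 5

class CommunityFeatures:
    def __init__(self):
        self.totals = {}          # item -> summed feature value
        self.contributors = set() # distinct user ids seen so far

    def contribute(self, user_id, ixf):
        """Fold one user's personal I×F into the community totals."""
        self.contributors.add(user_id)
        for item, value in ixf.items():
            self.totals[item] = self.totals.get(item, 0) + value

    def read(self):
        """Release the community I×F only above the contributor threshold."""
        if len(self.contributors) < MIN_CONTRIBUTORS:
            raise PermissionError("too few contributors to release I×F")
        return dict(self.totals)
```

A count threshold alone does not defeat the side-information attack (an adversary who knows the other contributors' features can still subtract them), which is why the slide also lists cryptographic support and controlled mixing.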
Matrix Difference Attack
(Diagram: an adversary compares two snapshots of the community features, (I×F) and (I×F)′; the matrix difference (I×F) − (I×F)′ reveals I×B for user U.)
Solution space:
• Users can't control the "next hop"
• Routing can hide the real source and destination
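The attack itself is simple to demonstrate: if an adversary can snapshot the community matrix immediately before and after one user's contribution, the difference is exactly that user's personal features. Toy data below, for illustration only.

```python
# Demonstration of the matrix-difference attack on a community I×F matrix.

def diff(after, before):
    """Element-wise difference of two snapshots of the community matrix."""
    return {item: after.get(item, 0) - before.get(item, 0)
            for item in set(after) | set(before)}

before = {"doc1": 10, "doc2": 4}         # community I×F before U contributes
user_u = {"doc1": 2, "doc3": 1}          # user U's personal I×F (secret)
after = dict(before)                     # community I×F after U contributes
for item, value in user_u.items():
    after[item] = after.get(item, 0) + value

recovered = {k: v for k, v in diff(after, before).items() if v != 0}
print(recovered)  # {'doc1': 2, 'doc3': 1} -- exactly user U's contribution
```

This is why the solution space targets timing and routing: if users cannot control (or observe) when their contribution is mixed in, an adversary cannot line up snapshots around a single known user.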
Identity Integrity Attack
(Diagram: multiple adversary-controlled identities contribute to the community features (I×F) and (I×F)′; the matrix difference again reveals I×B for user U.)
Solution space:
• Registrar service
• Blinded credentials
• Attribute membership credentials
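The registrar idea can be sketched as follows: each real-world identity obtains at most one pseudonymous credential, so an adversary cannot flood the contributor pool with fake identities. The blinding step (which would prevent the registrar itself from linking pseudonym to identity) is omitted here; this is a minimal sketch, not the paper's protocol.

```python
# Minimal sketch of a registrar service limiting Sybil identities:
# one opaque credential per verified real-world identity.
import secrets

class Registrar:
    def __init__(self):
        self._issued = set()  # real identities that already hold a credential

    def register(self, real_identity: str) -> str:
        """Issue one pseudonymous credential per real identity."""
        if real_identity in self._issued:
            raise ValueError("credential already issued for this identity")
        self._issued.add(real_identity)
        return secrets.token_hex(16)  # opaque 32-hex-character credential
```

With contributions accepted only from credentialed pseudonyms, the minimum-contributor threshold on the previous slides again means something: the adversary cannot manufacture enough identities to isolate one honest user.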
Next Steps
• Collaborative filtering design
• Behavior and feature inventories
• Behavior → feature mapping
• Recommendation algorithm
• Security/system architecture
• Protection requirements
• Minimize trust required of system entities
• Cryptographic mechanisms