
Protecting the Privacy of Observable Behavior in Distributed Recommender Systems


Presentation Transcript


  1. Protecting the Privacy of Observable Behavior in Distributed Recommender Systems Douglas W. Oard, University of Maryland, oard@umd.edu • Anton Leuski, USC-ISI, leuski@isi.edu • Stuart Stubblebine, Stubblebine Research Labs, stuart@stubblebine.com • SIGIR Implicit Workshop

  2. Takeaway Points • Protecting privacy could yield behavior-based evidence about the utility of information objects • A framework for thinking about behaviors that we might hope to observe • Some initial thoughts on how to architect privacy protection into recommender systems

  3. [Architecture diagram: ratings and observations feed a user profile; information objects are matched against the profile to produce recommendations]

  4. Motivations to Provide Ratings • Self-interest • Use the ratings to improve system’s user model • Economic benefit • If a market for ratings is created • Altruism

  5. The Problem with Self-Interest [Chart: marginal value to community, marginal cost, and marginal value to rater, plotted against the number of ratings (few → lots)]

  6. Solution: Reduce the Marginal Cost [Chart: the same curves, with the marginal cost curve lowered]
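The argument on slides 5 and 6 can be sketched numerically. The curve shapes below are illustrative assumptions, not from the slides: a self-interested rater contributes only while the marginal value to the rater exceeds the marginal cost, so lowering the cost (for example, by observing behavior instead of asking for explicit ratings) raises the number of contributions.

```python
# Sketch of the self-interest argument on slides 5-6; the specific
# curve shapes here are illustrative assumptions.
def marginal_value_to_rater(n):
    # Diminishing returns: the rater's own benefit falls quickly as n grows.
    return 10.0 / (n + 1)

def marginal_value_to_community(n):
    # The community keeps benefiting from additional ratings much longer.
    return 10.0 / (n / 10 + 1)

def ratings_contributed(marginal_cost):
    # A self-interested rater stops once marginal value drops below cost.
    n = 0
    while marginal_value_to_rater(n) >= marginal_cost:
        n += 1
    return n

high_cost = ratings_contributed(2.0)   # explicit ratings: costly
low_cost = ratings_contributed(0.1)    # observed behavior: near-free
assert low_cost > high_cost
```

The gap between `marginal_value_to_community` and `marginal_value_to_rater` at the stopping point is the community value lost to self-interest; lowering the cost curve shrinks that loss.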

  7. Some Observable Behaviors

  8. Behavior Category [table: observable behaviors grouped by behavior category]

  9. Minimum Scope / Behavior Category [table: behaviors organized by behavior category and minimum scope]

  10. Minimum Scope / Behavior Category [table, continued]
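The table contents on slides 8 through 10 did not survive as text. As a hedged sketch, the mapping below follows the published Oard & Kim behavior framework (categories Examine, Retain, Reference, Annotate; minimum scopes Segment, Object, Class); the specific rows shown on the slides may differ.

```python
# Hedged reconstruction of the behavior framework table: example
# behaviors mapped to (behavior category, minimum scope). The exact
# rows on the slides are not recoverable from this transcript.
BEHAVIOR_FRAMEWORK = {
    "view":     ("Examine",   "Segment"),
    "listen":   ("Examine",   "Segment"),
    "print":    ("Retain",    "Object"),
    "bookmark": ("Retain",    "Object"),
    "forward":  ("Reference", "Object"),
    "link":     ("Reference", "Object"),
    "mark up":  ("Annotate",  "Segment"),
    "rate":     ("Annotate",  "Object"),
}

def behaviors_in_category(category):
    # All behaviors in one category, sorted for stable output.
    return sorted(b for b, (c, _) in BEHAVIOR_FRAMEWORK.items() if c == category)
```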

  11. Recommending w/Implicit Feedback [Diagram: two parallel architectures. Ratings server: user observations → estimate rating → user ratings → ratings server → community ratings → user model → predicted ratings. Observations server: user observations → observations server → community observations → user model → predicted observations → estimate ratings → predicted ratings.]
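One way to read the observations-server path on slide 11: observed behaviors are converted into estimated ratings, and community observations drive the prediction. The dwell-time heuristic and peer-averaging below are illustrative assumptions; the slide does not specify the algorithms.

```python
# Sketch of the observations-server path, with assumed heuristics.
def estimate_rating(seconds_viewed, saved):
    # Map observed behavior to a pseudo-rating in [1, 5]:
    # up to 3 points for dwell time, 1 point for retention.
    score = 1 + min(seconds_viewed / 30.0, 3)
    if saved:
        score += 1
    return min(score, 5.0)

def predict(user_obs, community_obs, item):
    # Predict an observation-derived rating for an unseen item by
    # averaging over community members who share observed items.
    peers = [u for u, obs in community_obs.items()
             if item in obs and set(obs) & set(user_obs)]
    if not peers:
        return None
    return sum(estimate_rating(*community_obs[u][item]) for u in peers) / len(peers)

community_obs = {
    "u1": {"a": (60, True), "b": (90, True)},
    "u2": {"a": (10, False), "b": (120, False)},
}
user_obs = {"a": (45, True)}
print(predict(user_obs, community_obs, "b"))  # averages u1 (5.0) and u2 (4.0)
```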

  12. Gaining Access to Observations • Observe public behavior • Hypertext linking, publication, citing, … • Policy protection • EU: Privacy laws • US: Privacy policies + FTC enforcement • Architectural assurance of privacy • Distributed architecture • Model and mitigate privacy risks

  13. A More Secure Data Flow [Diagram: behaviors are captured as an item × behavior matrix (IxB), mapped to personal features (item × feature, IxF), aggregated into community features (IxF), and used to produce recommendations (item × recommendation, IxR)]
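The matrices on slide 13 can be sketched as follows. The behavior weights and dictionary shapes are assumptions; the slide leaves the actual behavior-to-feature mapping open (it reappears under "Next Steps").

```python
# Sketch of the IxB -> IxF -> community IxF pipeline; the behaviors
# and their weights are hypothetical.
WEIGHTS = {"view": 0.2, "save": 0.5, "link": 0.8}

def to_features(ixb):
    # IxB: {item: {behavior: count}} -> personal IxF: {item: feature value}
    return {item: sum(WEIGHTS.get(b, 0) * n for b, n in bs.items())
            for item, bs in ixb.items()}

def aggregate(personal_ixfs):
    # Community IxF: sum personal feature matrices item by item.
    community = {}
    for ixf in personal_ixfs:
        for item, f in ixf.items():
            community[item] = community.get(item, 0) + f
    return community
```

The attacks on slides 14 through 16 all target this aggregation step: the community IxF must be readable for recommendation without exposing any one personal IxF.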

  14. Low Entropy Attack [Diagram: an adversary with side information reads the community features (IxF) to infer user U's IxB] • Solution space • Read access to IxF requires a minimum number of unique contributors • Cryptographic data structure support • Controlled mixing
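The first solution bullet, requiring a minimum number of unique contributors before IxF can be read, can be sketched as a simple threshold gate. The class name and threshold below are assumptions; the slide's cryptographic data structures would enforce the same rule without trusting a central server.

```python
# Sketch of the minimum-unique-contributor gate: withhold the
# community matrix until at least k distinct users have contributed.
class CommunityMatrix:
    def __init__(self, k=5):
        self.k = k
        self.contributors = set()
        self.ixf = {}

    def contribute(self, user_id, personal_ixf):
        self.contributors.add(user_id)
        for item, f in personal_ixf.items():
            self.ixf[item] = self.ixf.get(item, 0) + f

    def read(self):
        # Low-entropy defense: with too few contributors, the aggregate
        # would reveal individual behavior, so refuse access.
        if len(self.contributors) < self.k:
            raise PermissionError("too few unique contributors")
        return dict(self.ixf)
```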

  15. Matrix Difference Attack [Diagram: an adversary computes the difference (IxF) − (IxF)′ between two snapshots of the community features to recover user U's IxB] • Solution space • Users can't control the "next hop" • Routing can hide the real source and destination
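The attack itself is easy to demonstrate, which is why the slide pushes the defense into routing: if the adversary can snapshot the community matrix just before and just after user U's contribution arrives, subtraction isolates that single contribution. The data values below are hypothetical.

```python
# Demonstration of the matrix difference attack on snapshots of the
# community IxF matrix; all values are hypothetical.
def diff(ixf_after, ixf_before):
    items = set(ixf_after) | set(ixf_before)
    return {i: ixf_after.get(i, 0) - ixf_before.get(i, 0) for i in items}

before = {"a": 10.0, "b": 4.0}
user_u = {"a": 1.0, "c": 2.0}            # U's private contribution
after = {"a": 11.0, "b": 4.0, "c": 2.0}  # snapshot after U contributes

leaked = {i: f for i, f in diff(after, before).items() if f != 0}
assert leaked == user_u  # U's personal features recovered exactly
```

Mixing contributions in flight (so no observer knows whose update landed between two snapshots) breaks the attribution step, not the subtraction.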

  16. Identity Integrity Attack [Diagram: an adversary controlling multiple identities contributes to the community features (IxF), then uses the difference (IxF) − (IxF)′ to recover user U's IxB] • Solution space • Registrar service • Blinded credentials • Attribute membership credentials
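The registrar-service bullet can be sketched as an issuer that hands out at most one pseudonymous credential per verified real-world identity, capping an adversary's ability to inflate the unique-contributor count with sybil identities. Blinded and attribute-membership credentials (which would also keep the registrar from linking a credential back to an identity) are beyond this sketch; all names here are assumptions.

```python
import secrets

# Sketch of a registrar service: one pseudonymous credential per
# verified identity, so sybil identities cannot satisfy the
# minimum-unique-contributor threshold.
class Registrar:
    def __init__(self):
        self.registered = set()        # verified real-world identities
        self.valid_credentials = set() # issued pseudonymous tokens

    def issue(self, real_identity):
        if real_identity in self.registered:
            raise ValueError("identity already holds a credential")
        self.registered.add(real_identity)
        cred = secrets.token_hex(16)
        self.valid_credentials.add(cred)
        return cred

    def verify(self, cred):
        # Contribution servers accept only registrar-issued credentials.
        return cred in self.valid_credentials
```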

  17. Next Steps • Collaborative filtering design • Behavior and feature inventories • Behavior->Feature mapping • Recommendation algorithm • Security/System Architecture • Protection requirements • Minimize trust required of system entities • Cryptographic mechanisms
