Usable Privacy and Security Quick Discussion – based on: http://cups.cs.cmu.edu/courses/ups-sp08/
Unusable security & privacy • Unpatched Windows machines compromised in minutes • Phishing web sites increasing by 28% each month • Most PCs infected with spyware (avg. = 25) • Users have more passwords than they can remember and practice poor password security • Enterprises store confidential information on laptops and mobile devices that are frequently lost or stolen
Grand Challenge “Give end-users security controls they can understand and privacy they can control for the dynamic, pervasive computing environments of the future.” - Computing Research Association 2003
After installing all that security and privacy software… Do you have any time left to get any work done?
Concerns may not be aligned “Users do not want to be responsible for, nor concern themselves with, their own security.” - Blake Ross • Security experts are concerned about the bad guys getting in • Users may be more concerned about locking themselves out
Typical password advice • Pick a hard to guess password • Don’t use it anywhere else • Change it often • Don’t write it down
Bank = b3aYZ Amazon = aa66x! Phonebill = p$2$ta1
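The advice above effectively asks people to memorize a distinct, random secret per site, which is exactly what humans are bad at and software is good at. A minimal sketch of how a password manager sidesteps the problem (the site names and vault structure are illustrative, not any real tool's API):

```python
import secrets
import string

def generate_password(length=16):
    """Generate a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return ''.join(secrets.choice(alphabet) for _ in range(length))

# One unique, unguessable password per account: trivial for software,
# hopeless for human memory -- hence the vault.
vault = {site: generate_password() for site in ("bank", "amazon", "phonebill")}
```

The user then remembers one master secret for the vault instead of one structured, reused mnemonic per site.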
How can we make secure systems more usable? • Make it “just work” • Invisible security • Make security/privacy understandable • Make it visible • Make it intuitive • Use metaphors that users can relate to • Train the user
One way to make it work: make decisions • Developers should not expect users to make decisions they themselves can’t make
“Present choices, not dilemmas” - Chris Nodder (in charge of user experience for Windows XP SP2)
Issues to consider • Privacy is a secondary task • Users of privacy tools often seek out these tools due to their awareness of or concern about privacy • Even so, users still want to focus on their primary tasks • Users have differing privacy concerns and needs • One-size-fits-all interface may not work • Most users are not privacy experts • Difficult to explain current privacy state or future privacy implications • Difficult to explain privacy options to them • Difficult to capture privacy needs/preferences • Many privacy tools reduce application performance, functionality, or convenience
Case study: Tor • Internet anonymity system • Allows users to send messages that cannot be traced back to them (web browsing, chat, p2p, etc.) • UI was mostly command line interface until recently • 2005 Tor GUI competition • CUPS team won phase 1 with design for FoxTor!
One-size-doesn’t-fit-all problem • Tor is configurable and different users will want to configure it in different ways • But most users won’t understand configuration options • Give users choices, not dilemmas • We began by trying to understand our users • No budget, little time, limited access to users • So we brainstormed about their needs, tried to imagine them, and developed personas for them
Persona Example 1 Jim is a current UG at CSM. Goals: • Be sure he’s on track to graduate in 4 years • Find some courses that are interesting • Get together with friends to study and have fun Other: Jim is taking a full course load and also working part time, so he’s always very busy. He also tends to be disorganized, so he keeps losing information and having to look it up again. He is a little shy and doesn’t know too many people in the department yet.
Persona Example 2 Susie is a parent researching schools for her son Bob, who will be graduating from HS soon. Goals: • She wants to find an environment that will be welcoming and stimulating for Bob • She thinks Bob may ultimately want to pursue graduate work, so she wants to be sure the school has faculty doing interesting research • She wants to find out how expensive the school is and what type of financial aid is available. Other: Susie works full time but considers her family to be a top priority. It’s very important to her for her son to be happy, so she’s willing to devote a fair amount of time to the task of selecting a university. The family has a computer at home, so she’s spending her evenings visiting websites to collect data. She’s comfortable surfing the web, but prefers websites that are logical and not too cluttered.
One-size-doesn’t-fit-all problem • The process led to the realization that our users had 3 categories of privacy needs • Basic, selective, critical • Instead of asking users to figure out complicated settings, most of our configuration involves figuring out which types of privacy needs they have
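The "choices, not dilemmas" idea above can be sketched as presets: the user answers one question about their privacy needs, and the tool maps the answer to a full configuration. The preset names follow the slide; the individual option names are hypothetical, not FoxTor's actual settings:

```python
# Hypothetical option names -- illustrating the preset idea, not real FoxTor config.
PRESETS = {
    "basic":     {"anonymize_all_traffic": False, "per_site_rules": False},
    "selective": {"anonymize_all_traffic": False, "per_site_rules": True},
    "critical":  {"anonymize_all_traffic": True,  "per_site_rules": False},
}

def configure(privacy_need):
    """Map one user-facing choice (basic/selective/critical) to a full config."""
    return PRESETS[privacy_need]
```

The user never sees the underlying options; picking a category is the entire configuration task.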
Privacy laws around the world • Privacy laws and regulations vary widely throughout the world • US has mostly sector-specific laws, with relatively minimal protections - often referred to as “patchwork quilt” • Federal Trade Commission has jurisdiction over fraud and deceptive practices • Federal Communications Commission regulates telecommunications • European Data Protection Directive requires all European Union countries to adopt similar comprehensive privacy laws that recognize privacy as fundamental human right • Privacy commissions in each country (some countries have national and state commissions) • Many European companies non-compliant with privacy laws (2002 study found majority of UK web sites non-compliant)
Some US privacy laws • Bank Secrecy Act, 1970 • Fair Credit Reporting Act, 1971 • Privacy Act, 1974 • Right to Financial Privacy Act, 1978 • Cable TV Privacy Act, 1984 • Video Privacy Protection Act, 1988 • Family Educational Rights and Privacy Act, 1974 • Electronic Communications Privacy Act, 1986 • Freedom of Information Act, 1966, 1991, 1996
US law – recent additions • HIPAA (Health Insurance Portability and Accountability Act, 1996) • When implemented, will protect medical records and other individually identifiable health information • COPPA (Children’s Online Privacy Protection Act, 1998) • Web sites that target children must obtain parental consent before collecting personal information from children under the age of 13 • GLB (Gramm-Leach-Bliley Act, 1999) • Requires privacy policy disclosure and opt-out mechanisms from financial service institutions
Voluntary privacy guidelines • Direct Marketing Association Privacy Promise http://www.thedma.org/library/privacy/privacypromise.shtml • Network Advertising Initiative Principles http://www.networkadvertising.org/ • CTIA Location-based privacy guidelines http://www.wow-com.com/news/press/body.cfm?record_id=907
Privacy policies • Policies let consumers know about site’s privacy practices • Consumers can then decide whether or not practices are acceptable, when to opt-in or opt-out, and who to do business with • The presence of privacy policies increases consumer trust What are some problems with privacy policies?
Privacy policy problems • BUT policies are often • difficult to understand • hard to find • take a long time to read • change without notice
Privacy policy components • Identification of site, scope, contact info • Types of information collected (including information about cookies) • How information is used • Conditions under which information might be shared • Information about opt-in/opt-out • Information about access • Information about data retention policies • Information about seal programs • Security assurances • Children’s privacy There is lots of information to convey -- but the policy should be brief and easy-to-read too! What is opt-in? What is opt-out?
Short Notices • Project organized by Hunton & Williams law firm • Create short version (short notice) of a human-readable privacy notice for both web sites and paper handouts • Sometimes called a “layered notice” as short version would advise people to refer to long notice for more detail • Now being called “highlights notice” • Focus on reducing privacy policy to at most 7 boxes • Standardized format but only limited standardization of language • Proponents believe highlights format may eventually be mandated by law • Alternative proposals from privacy advocates focus on check boxes • Interest internationally • http://www.privacyconference2003.org/resolution.asp • Interest in the US for financial privacy notices • http://www.ftc.gov/privacy/privacyinitiatives/ftcfinalreport060228.pdf
Checkbox proposal
WE SHARE [DO NOT SHARE] PERSONAL INFORMATION WITH OTHER WEBSITES OR COMPANIES.
• Collection (YES / NO):
• We collect personal information directly from you
• We collect information about you from other sources
• We use cookies on our website
• We use web bugs or other invisible collection methods
• We install monitoring programs on your computer
• Uses – we use information about you to (With Your Consent / Without Your Consent):
• Send you advertising mail
• Send you electronic mail
• Call you on the telephone
• Sharing – we allow others to use your information to (With Your Consent / Without Your Consent):
• Maintain shared databases about you
• Send you advertising mail
• Send you electronic mail
• Call you on the telephone
• Access: You can see and correct {ALL, SOME, NONE} of the information we have about you.
• Choices: You can opt out of receiving advertising mail, electronic mail, or telemarketing from {Us, Affiliates, Third Parties}.
• Retention: We keep your personal data for {Six Months, Three Years, Forever}.
• Change: We can change our data use policy {AT ANY TIME, WITH NOTICE TO YOU, ONLY FOR DATA COLLECTED IN THE FUTURE}.
Platform for Privacy Preferences Project (P3P) • Developed by the World Wide Web Consortium (W3C) http://www.w3.org/p3p/ • Final P3P1.0 Recommendation issued 16 April 2002 • Offers an easy way for web sites to communicate about their privacy policies in a standard machine-readable format • Can be deployed using existing web servers • Enables the development of tools (built into browsers or separate applications) that • Summarize privacy policies • Compare policies with user preferences • Alert and advise users
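The core job of a P3P user agent is the comparison step: read the site's machine-readable policy, match it against the user's stated preferences, and alert on mismatches. A minimal sketch of that idea, using a simplified vocabulary rather than the actual P3P 1.0 XML schema:

```python
# Simplified sketch of P3P-style policy checking. The practice names
# ("collects-email", etc.) are illustrative, not real P3P vocabulary.
def check_policy(policy, preferences):
    """Return the practices a site declares that the user has not accepted."""
    return [p for p in policy["practices"] if p not in preferences["accepted"]]

site_policy = {"practices": ["collects-email", "uses-cookies",
                             "shares-with-third-parties"]}
user_prefs = {"accepted": ["collects-email", "uses-cookies"]}

violations = check_policy(site_policy, user_prefs)
# A real user agent would now warn the user or block the site's cookies.
```

The point is that the comparison is mechanical once the policy is machine-readable; the hard usability problem is capturing `user_prefs` in the first place.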
Training people not to fall for phish • Laboratory study of 28 non-expert computer users • Asked to evaluate 10 web sites, take 15 minute break, evaluate 10 more web sites • Experimental group read web-based training materials during break, control group played solitaire • Experimental group performed significantly better identifying phish after training • People can learn from web-based training materials, if only we could get them to read them!
How do we get people trained? • Most people don’t proactively look for training materials on the web • Many companies send “security notice” emails to their employees and/or customers • But these tend to be ignored • Too much to read • People don’t consider them relevant
Embedded training • Can we “train” people during their normal use of email to avoid phishing attacks? • Periodically, people get sent a training email • Training email looks like a phishing attack • If person falls for it, intervention warns and highlights what cues to look for in succinct and engaging format P. Kumaraguru, Y. Rhee, A. Acquisti, L. Cranor, J. Hong, and E. Nunge. Protecting People from Phishing: The Design and Evaluation of an Embedded Training Email System. CyLab Technical Report. CMU-CyLab-06-017, 2006. http://www.cylab.cmu.edu/default.aspx?id=2253
Embedded training evaluation • Lab study compared two prototype interventions to standard security notice emails from eBay and PayPal • Existing practice of security notices is ineffective • Diagram intervention somewhat better • Comic strip intervention worked best • Interventions most effective when based on real brands
Examples • Ecommerce personalization systems • Concerns about use of user profiles • Software that “phones home” to fetch software updates or refresh content, report bugs, relay usage data, verify authorization keys, etc. • Concerns that software will track and profile users • Communications software (email, IM, chat) • Concerns about traffic monitoring, eavesdroppers • Presence systems (buddy lists, shared spaces, friend finders) • Concerns about limiting when info is shared and with whom
Issues to consider • Similar to issues to consider for privacy tools PLUS • Users may not be aware of privacy issues up front • When they find out about privacy issues they may be angry or confused, especially if they view notice as inadequate or defaults as unreasonable • Users may have to give up functionality or convenience, or spend more time configuring system for better privacy • Failure to address privacy issues adequately may lead to bad press and legal action
Provide way to set up default rules • Every time a user makes a new purchase that they want to rate or exclude, they have to edit profile info • There should be a way to set up default rules • Exclude all purchases • Exclude all purchases shipped to my work address • Exclude all movie purchases • Exclude all purchases I had gift wrapped
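The default rules above are naturally expressed as predicates applied to each new purchase, so the user states a rule once instead of editing the profile per item. A hedged sketch (the purchase field names `ship_to`, `category`, and `gift_wrapped` are hypothetical, not any real retailer's schema):

```python
# Hypothetical purchase-record fields, illustrating rules-as-predicates.
RULES = [
    lambda p: p["ship_to"] == "work",    # exclude purchases shipped to work
    lambda p: p["category"] == "movie",  # exclude all movie purchases
    lambda p: p["gift_wrapped"],         # exclude gift-wrapped purchases
]

def visible_in_profile(purchases, rules=RULES):
    """Keep only purchases that no exclusion rule matches."""
    return [p for p in purchases if not any(rule(p) for rule in rules)]

kept = visible_in_profile([
    {"ship_to": "home", "category": "book",  "gift_wrapped": False},
    {"ship_to": "work", "category": "book",  "gift_wrapped": False},
    {"ship_to": "home", "category": "movie", "gift_wrapped": False},
])  # only the first purchase remains visible
```

New purchases are filtered automatically as they arrive, which is exactly the "set it once" behavior the slide asks for.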
Remove excluded purchases from profile • Users should be able to remove items from profile • If purchase records are needed for legal reasons, users should be able to request that they not be accessible online