Security & Morality: A Tale of User Deceit
Models of Trust on the Web, Edinburgh, UK, May 2006
L. Jean Camp, C. McGrath, A. Genkina
Security & Morality: A Tale of User Deceit?
• Hypotheses about human trust behavior developed from social science
• Compared with implicit assumptions in common technical mechanisms
• Test computer-human trust behaviors
• Conclude with guidance for trust design
Design for Trust
• Start with human trust behaviors
• Trust is used for simplification
• Trust encompasses discrete technical problems: privacy, integrity, data security
• Trust embeds discrete policy problems: business behavior, customer service, quality of goods, privacy
Human and Computer Trust
Trust is approached differently by different disciplines:
• Social studies of human behavior: a micro approach
  • Experiments to evaluate how people extend trust
  • Game theory
  • Common assumption: information exposure == trust
• Philosophy: a macro approach
  • Trust is a need, with a high default to trust
  • Trust is a tool for simplification
  • Examines societies and cultural practices
Experimental Definition of Trust
• Coleman's three-part test: trust
  • enables something not otherwise possible
  • leaves the individual who trusts worse off if the trusted party acts in an untrustworthy manner
  • leaves individuals who trust better off if the trusted party acts in a trustworthy manner
• There is no constraint placed on the trusted party
• A time lag exists between the decision to trust and the outcome (see the payoff sketch below)
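These conditions are commonly operationalized in behavioral economics via the investment ("trust") game of Berg, Dickhaut, and McCabe: the trustor risks money with an unconstrained trustee and the outcome arrives only later. A minimal payoff sketch follows; the $10 endowment and 3x multiplier match the classic setup, but treat the exact numbers as illustrative assumptions.

```python
def trust_game(endowment: float, sent: float, returned_fraction: float):
    """Payoffs in a Berg-style investment game.

    The trustor sends `sent` of `endowment` to the trustee; the
    experimenter triples it in transit; the trustee later returns
    `returned_fraction` of what was received. The time lag and the
    absence of any constraint on the trustee are what make this a
    trust decision in Coleman's sense.
    """
    assert 0 <= sent <= endowment and 0 <= returned_fraction <= 1
    received = 3 * sent                      # the multiplier is the gain trust enables
    returned = returned_fraction * received
    trustor = endowment - sent + returned
    trustee = received - returned
    return trustor, trustee

# Trustworthy trustee: both parties beat the no-trust baseline of (10, 0).
print(trust_game(10, 10, 0.5))   # (15.0, 15.0)
# Untrustworthy trustee: the trustor is strictly worse off for having trusted.
print(trust_game(10, 10, 0.0))   # (0.0, 30.0)
```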
Trust & Individuation
• People interacting with a computer do not distinguish between computers as individuals, but rather respond to their experience with "computers"
• People begin too trusting
• People learn to trust computers
  • first observed on the net by Sproull among computer scientists in 1991
  • confirmed by later experiments
• Computers are perceived as moral agents
• People will continue to extend trust, so creating another source of trust does not defeat trusting behaviors
Research on Humans Suggests...
• Humans may not differentiate between machines
• Humans become more trusting of "the network"
• Humans begin with too much trust in computers
• Confirmed by philosophical macro observation
• Confirmed by computer security incidents
  • e-mail-based scams, viruses, and hoaxes
  • masquerade attacks
Three Hypotheses
• Do humans respond differently to human and computer "betrayals" in terms of forgiveness?
• Do people interacting with a computer distinguish between computers as individuals, or respond to their experience with "computers"?
• Does the tendency to differentiate between remote machines increase with computer experience?
Security & Morality: A Tale of User Deceit
• Hypotheses about human trust behavior developed from social science
• Compared with implicit assumptions in common technical mechanisms
• Test computer-human trust behaviors
• Conclude with guidance for trust design
H1: Response to Failure
• Do humans respond differently to human and computer "betrayals" in terms of forgiveness?
• Attacks which are viewed as failures are "ignored" or forgiven
• Technical failures are seen as accidents rather than design decisions
• May explain why people tolerate repeated security failures
H2: Differentiation
• When people interact with networked computers, they discriminate among distinct computers (hosts, websites), treating them as distinct entities, particularly in their readiness to extend trust and to secure themselves from possible harms
• People become more trusting over time
• People differentiate more, not less, with experience
• Do people learn to differentiate, or to trust?
• "Educate the user" may not work
Security & Morality: A Tale of User Deceit
• Hypotheses about human trust behavior developed from social science
• Compared with implicit assumptions in common technical mechanisms
• Test computer-human trust behaviors
• Conclude with guidance for trust design
The Experiment
• Developed three "life management" websites:
  • Elephantmine.com
  • Reminders.name
  • MemoryMinder.us
Initial Tests
• What information would you share with each site?
• Do you trust the site?
  • user-defined trust; no macro definition given
• MemoryMinder.us was rejected
  • people dislike lime green?
• The other two designs received similar evaluations
Two "Betrayal" Types
• One group faced a technical betrayal
  • another person's data is displayed: "John Q. Wilson"
  • date of birth, credit card number, social network data
• One group faced a moral betrayal
  • a change in the privacy policy is announced
  • collection of third-party information is correlated with compiled data
  • a very common policy: eBay, Facebook, MySpace
Three Step Process
• Users are introduced to the first site
  • sites are presented in the same order
• Users experience a betrayal
  • half the users experience a technical failure
  • half experience a privacy-policy change
  • both sets of users experience the failure upon departure from the first site
• Users then go to the second site
Findings: Differentiation
• Users respond to the first site's betrayal with a significant change in behavior with respect to the second site
  • users had, on average, seven years of experience with the Internet
  • computer experience was not at all significant
  • the second site was not seen as a "new" entity
• Cannot support the hypothesis that users differentiate
  • users do not enter each transaction with a new calculation of risk
Findings: Betrayal Type
• Stronger reaction to the privacy change
• Yet the technical failure indicated an inability to protect privacy
Security & Morality: A Tale of User Deceit
• Hypotheses about human trust behavior developed from social science
• Compared with implicit assumptions in common technical mechanisms
• Test computer-human trust behaviors
• Conclude with guidance for trust design
What To Conclude
• Assuming the human will act like the computer has been a core design problem
• Either remove assumptions about humans
• Or design computer security with social science in mind
Differentiation
• The tendency to differentiate between remote machines decreases with computer experience
  • more use results in more lumping
• Make better lumping
  • explains common logons/passwords, along with cognitive limits
  • "My Internet is down"
• Explicit DO NOT TRUST signals are needed (see the sketch below)
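As one illustration of what an explicit negative signal could look like, here is a minimal client-side sketch that consults a shared list of known-bad hosts and interrupts only on a match, so the user is never asked to differentiate among the rest. The list contents, function names, and warning flow are hypothetical illustrations, not part of the talk.

```python
# Hypothetical shared blocklist; in practice this would be a maintained feed.
KNOWN_BAD_HOSTS = {"phish.example.net", "cards4cheap.example.org"}

def check_host(hostname: str) -> bool:
    """Return True if it is acceptable to proceed; warn and block otherwise."""
    if hostname in KNOWN_BAD_HOSTS:
        print(f"DO NOT TRUST: {hostname} is on a known-bad list.")
        return False
    # Absence from the list is *not* a positive trust signal; it only
    # means no negative information is available -- lumping, done better.
    return True

if check_host("phish.example.net"):
    print("connecting...")
```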
Observations
• Users are bad security managers
  • PGP, P3P, passwords, ...
• Security should necessarily be a default
• Surveys illustrate a continuing confusion of privacy and security
  • either educate all net users, or
  • build upon the connection between the moral (privacy) and the technical (security)
Computer Security is Built for Machines
• Passwords
  • humans are a bad source of entropy (see the back-of-the-envelope sketch below)
• SSL
  • two categories: secure and not secure
  • requiring per-site differentiation does not enable human differentiation
  • every site should include a unique graphic with the lock
  • otherwise users trust all machines with the lock
  • SSL-secured phishing has already occurred
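To make the entropy point concrete, a quick back-of-the-envelope comparison: a uniformly random password achieves the full entropy of its character space, while the old NIST SP 800-63 appendix estimated human-chosen 8-character passwords at only a fraction of that. The NIST range below is quoted as a rough, assumed benchmark, not a measurement from this study.

```python
import math

def max_entropy_bits(alphabet_size: int, length: int) -> float:
    """Entropy of a password chosen uniformly at random over the alphabet."""
    return length * math.log2(alphabet_size)

# An 8-character password over the 94 printable ASCII symbols:
print(f"uniformly random: {max_entropy_bits(94, 8):.1f} bits")  # ~52.4 bits

# Human-chosen 8-character passwords are far from uniform; the old
# NIST SP 800-63 appendix put them at roughly 18-30 bits depending on
# composition rules -- treat that figure as an assumed rough estimate.
print("human-chosen:     ~18-30 bits (estimated)")
```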
PKI is Built for Machines
• Better lumping, not demands for user differentiation
• Different levels of key revocation are needed (see the reason-sensitive sketch below):
  • falsified initial credential: all past transactions suspect
  • change in status: future transactions prohibited
• Unrecognized hierarchy
  • messages are confusing
• No domain
  • no alert when moving to IP address space not connected to DNS
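X.509 CRLs already carry a reason code (e.g., keyCompromise vs. affiliationChanged); the sketch below shows how a verifier could act on that distinction in the spirit of the slide, invalidating history only when the original credential was bad. The types and policy here are illustrative assumptions, not a standard certificate-validation algorithm.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Optional

class RevocationReason(Enum):
    FALSIFIED_CREDENTIAL = "falsified_credential"  # cf. CRL keyCompromise
    CHANGE_IN_STATUS = "change_in_status"          # cf. CRL affiliationChanged

@dataclass
class Revocation:
    revoked_at: datetime
    reason: RevocationReason

def signature_still_valid(signed_at: datetime, rev: Optional[Revocation]) -> bool:
    """Reason-sensitive check for a signature made at `signed_at`.

    - No revocation: the signature stands.
    - Change in status: only signatures made after revocation are rejected;
      past transactions remain trustworthy.
    - Falsified initial credential: the key was never legitimate, so all
      past transactions are suspect and every signature is rejected.
    """
    if rev is None:
        return True
    if rev.reason is RevocationReason.CHANGE_IN_STATUS:
        return signed_at < rev.revoked_at
    return False  # FALSIFIED_CREDENTIAL taints the entire history

# Example: a status change invalidates the future but not the past.
rev = Revocation(datetime(2006, 5, 1), RevocationReason.CHANGE_IN_STATUS)
print(signature_still_valid(datetime(2006, 4, 1), rev))  # True
print(signature_still_valid(datetime(2006, 6, 1), rev))  # False
```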
Building for Trust
• Security technologies are not adopted
  • patching, PGP
• Security technologies do not address user conceptions of trust
  • patching: a more secure machine, with regular updates to Microsoft?
  • PGP: signed email without confidentiality, to most people
• Technologies linking security (competence) to privacy (beneficence) may prove more effective in trust building than security alone
Example Project
• Focused on individuals
  • computer-computer trust
  • computer-human trust
• Explicit "do not trust" signals