Protecting Privacy: Who Is Responsible? Clark Thomborson The University of Auckland 12 September, 2013
Designing for Privacy: Some Hard Questions • If a computerised system supports privacy, what should it do? What shouldn’t it do? • “Privacy” varies greatly, depending on the legal, cultural, individual, and organisational context. • Can we elicit privacy requirements from stakeholders before their privacy is violated? • Stakeholders define “what is required” (laws, preferences, complaints, …) – often in retrospect! • Can responsibility for privacy protection be shared fairly between government, private enterprise, and individuals? • Is this a complete list of stakeholders?
Asset-Based Privacy Analysis? • We usually start a security analysis by identifying assets, vulnerabilities, and risks. • Can you list your high-value “privacy assets”? • Personally identifiable information (PII): • financial, • medical, … • Is PII the only entry on your list of assets? • favourite movies? • IP addresses of the devices you use? • IP addresses of the devices in your vicinity?? • Is there a better way to think about privacy assets???
Westin’s “Four Basic States of Privacy” • Solitude • Confidence/Intimacy • Anonymity • Reserve (diagram labels: P−: “Don’t look”; P−: “Don’t show”)
Privacy Diagrams • Privacy diagrams are like use-case diagrams. • Stakeholders are icons or stick-figures. • Dashed line: an alias or persona • We show different “faces” to different audiences. • Pentagon: a security boundary • Read/write access is controlled • Circle: a socially-understood context • A subcircle below is a “commons”: visible to everyone in the enclosing circle. • A subcircle above is an enforcement agent: trusted to observe and control. P− is a prohibition; P+ is a permission; O+ is an obligation; O− is an exemption
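The notation on this slide can be sketched as a small data model. This is a hedged illustration, not part of the talk: the class names (`Stakeholder`, `Rule`, `Context`) are invented for the sketch, and the talk defines the notation only pictorially.

```python
from dataclasses import dataclass, field
from enum import Enum

class RuleKind(Enum):
    """The four rule markers used in the privacy diagrams."""
    PROHIBITION = "P-"   # e.g. "don't look", "don't show"
    PERMISSION = "P+"    # e.g. "enter with permission"
    OBLIGATION = "O+"    # e.g. "be hospitable"
    EXEMPTION = "O-"     # released from an obligation

@dataclass
class Stakeholder:
    name: str
    personas: list = field(default_factory=list)  # dashed-line aliases

@dataclass
class Rule:
    kind: RuleKind
    action: str
    holder: str  # the stakeholder the rule binds (or empowers)

@dataclass
class Context:
    """A socially-understood context (circle). A commons is visible to
    all members; an enforcement agent is trusted to observe and control."""
    name: str
    members: list
    rules: list

# Invented example: a reserve norm inside a "public space" context.
public_space = Context(
    name="public space",
    members=[Stakeholder("Trixi"), Stakeholder("passersby")],
    rules=[
        Rule(RuleKind.PROHIBITION, "show too much", holder="Trixi"),
        Rule(RuleKind.PROHIBITION, "look too closely", holder="passersby"),
    ],
)
```

A machine-readable form like this is one way the diagrams could support requirements elicitation: each circle, pentagon, and marker in a drawing maps to one record.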
Solitude as a Privacy Asset • Is solitude an important privacy asset for you? • What personal benefits do you gain? • Are there societal benefits from personal solitude? In what contexts? • Is it sometimes a liability? In what contexts? • How do you indicate that you’re “in solitude”? • How do you indicate that “visitors are welcome”? • Who threatens your solitude? • What is their motive? • Who (or what) do you trust to defend your solitude?
Intimacy as a Privacy Asset • We can distinguish a personal intimacy from a professional confidence. • Friendships are reciprocal: friends reveal intimate secrets. • Confidences are usually one-sided: a truster and a trustee. • Most of us value our friendships… • Do all friendships have a shared (intimate) secret at their core? • Are the identities of our friends a secret? • Cf. the Chatham House Rule: what is said may be reported, but not attributed. • What about acquaintances? • No secrets, no confidences… • Are their identities one of our privacy assets?
Confidence as a Privacy Asset • What confidential information do you reveal to which professionals? • Legal professionals • Health professionals • Financial professionals • Life coaches • … • Could you rank your confidence assets by value: low, medium, high? • Would your asset ranking be confidential to your privacy-risk analyst? • Note: confidences are a privacy risk to the truster, and a security risk to the trustee.
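The low/medium/high ranking the slide asks for can be sketched in a few lines. The asset names and their values below are invented examples, not claims from the talk.

```python
# Hypothetical inventory of confidence assets, ranked by value.
CONFIDENCE_ASSETS = {
    "legal advice sought": "high",
    "medical history": "high",
    "financial statements": "medium",
    "life-coaching notes": "low",
}

ORDER = {"low": 0, "medium": 1, "high": 2}

def ranked(assets):
    """Return asset names sorted from highest to lowest value.
    Python's sort is stable, so equal-valued assets keep their order."""
    return sorted(assets, key=lambda a: ORDER[assets[a]], reverse=True)

print(ranked(CONFIDENCE_ASSETS))
# -> ['legal advice sought', 'medical history',
#     'financial statements', 'life-coaching notes']
```

As the slide notes, even this ranking is sensitive: handing it to a privacy-risk analyst creates a new confidence, with the analyst as trustee.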
Anonymity as a Privacy Asset • What value do you place on being able to “hide in a crowd”? • Do you want your merchant to recognise you, so that they can give you personalised service? • Do you shop anonymously, at least on occasion, to estimate the cost/benefit of the personalised service you get from e.g. Amazon.com? • Is anonymity a social good, and if so, in what contexts? • Is anonymity a social harm, and if so, in what contexts? • Note: the EU’s Data Protection Directive treats anonymity as an instrumental right: major social benefits are gained if PII is well regulated, but PII regulation is not an end in itself.
Reserve as a Privacy Asset (P−: Don’t look; P−: Don’t show) • “Trixi earns her GAPE [“Greatest Advance in Privacy Erosion”] award for • using the Post Office foyer to tell friends and passersby all about Brett and his icky courting habits • at the same time as listening to her iPod, chewing gum, painting her fingernails and spraying her hair. • “Oh, and also laying out and inspecting the prosthetic and prophylactic contents of her handbag. • “Well, it was a public space and Trixi is a fee-paying member of the public, eh? • “Try not to wince at the thought.” [David Hill, “Public spaces not the place to air out one’s privates”, NZ Herald, 18 Jan 2013]
Who Benefits from Reserve? • Reserve is a social good • It’s not a matter of personal choice: society at large determines if we “show too much” • We risk ostracism or adverse comment if we don’t respect society’s norms of reserve • Reserve is a private good • For people who don’t want to “see too much” • For people who occasionally make a mistake, and appreciate others “looking away” while they cover up. • What forms of reserve do you appreciate? • What forms of reserve do you think are “good for society”?
Any other types of privacy asset? Solitude Confidence/Intimacy Anonymity Reserve P−: Don’t look P−: Don’t show
Group Privacy • Most societies define special privacy boundaries for certain types of groups. • Domestic privacy, in homes • Private religious ceremonies, in churches • Secret societies, e.g. in Masonic lodges • These are intimate spaces, with rituals of entry and exit, member-recognition, initiation, ….
Domestic Privacy, in the Quran Allah O+: be hospitable P+: offer Salaam P+: Enter with permission
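The rules on this slide use the P+/O+ markers from the privacy-diagram notation. A minimal sketch, recording them as (marker, rule) pairs; the helper names and the holder comments are assumptions for illustration.

```python
# The domestic-privacy rules from the slide, as (marker, action) pairs.
# "O+" marks an obligation, "P+" a permission.
DOMESTIC_PRIVACY_RULES = [
    ("O+", "be hospitable"),         # obligation on the household
    ("P+", "offer Salaam"),          # permission for the visitor
    ("P+", "enter with permission"), # permission for the visitor
]

def obligations(rules):
    """Actions that some party is obliged to perform."""
    return [action for marker, action in rules if marker == "O+"]

def permissions(rules):
    """Actions that some party is permitted to perform."""
    return [action for marker, action in rules if marker == "P+"]
```

Separating obligations from permissions this way makes the slide's point concrete: the home's privacy boundary is maintained by rules binding both the household and its visitors.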
Partial Answers to Our Hard Questions • If a computerised system supports privacy, what should it do? What shouldn’t it do? • Hmmm… this is complicated! Ask some stakeholders… • Can we elicit privacy requirements from stakeholders before their privacy is violated? • Maybe… but can we find representative stakeholders? Will they understand our privacy diagrams? • Can responsibility for privacy protection be shared fairly between government, private enterprise, and individuals? • Hmmm… don’t forget families and other socially-sanctioned private groups!
Privacy Diagrams Can Answer Some Questions • Privacy diagrams represent the architectural aspects of privacy: • “What privacy assets could be controlled by a computer?” • Privacy diagrams could be used for the elicitation of privacy requirements: • “What privacy assets should be controlled?” • Privacy diagrams could be used for the evaluation of privacy protections: • “What privacy assets are controlled, in which contexts?” • This is unpublished work. Please let me know what you think: • Do you understand these diagrams, or are they too abstract? • Have these diagrams changed the way you think about privacy?