Explore the complexities of privacy in the digital era, with insights on socio-technical systems, data protection basics, conflicting mindsets, and the social impact of big data. Understand the right to privacy amidst evolving technologies and regulatory responses. Delve into Fair Information Practice Principles and OECD Privacy Principles guiding data quality, collection, purpose specification, security, openness, access, accountability, and more. Enhance your understanding of privacy protection in a data-driven world.
OII Internet Leadership Academy: Privacy • Ben Zevenbergen, Oxford Internet Institute
Privacy… • “Privacy is a value so complex, so entangled in competing and contradictory dimensions, so engorged with various and distinct meanings, that I sometimes despair whether it can be usefully addressed at all.” • Robert Post (Law Professor)
Privacy… • “Perhaps the most striking thing about the right to privacy is that nobody seems to have any very clear idea what it is.” • Judith Jarvis Thomson (Philosopher)
Privacy… • “Privacy law is not the product of logic. But neither is it the product of ‘experience’ or of supposed ‘felt necessities’ that are shared in all modern societies. It is the product of local social anxieties and local ideals.” • James Q. Whitman (Law Professor)
Law in the books ≠ Law in action • “In data protection law, the gap between law in the books and law in action is particularly glaring […]” • “[…] this panoply of information practices, for the most part, proceeds under the halo of legality, quite literally, evoking gasps of disbelief among the newly informed.”
Sociotechnical system • “[…] information increasingly mediates our significant activities and relationships.”
Sociotechnical system • “[…] the sources of privacy threats are socio-technical systems, that is to say, technologies embedded in particular environments shaped by social, economic, and political factors and practices and put to specific purposes.”
Sociotechnical system • “The explosive growth of socio-technical information systems, […] not only divert information flows from one path to another and one recipient to another, or others, but also may reconfigure ontologies, yield new categories of information, and new types of actors and modes of dissemination. Such changes may call for the reconsideration of entrenched norms and development of norms where none previously may have existed.”
Sociotechnical system • “[…] privacy by design is more than just a matter of technological design, […] avoiding falling into techno-centric solutions to a sociotechnical problem.”
Sociotechnical system • “As socio-technological developments raise new regulatory challenges […] the regulatory response is to include these in the data protection framework. This, however, requires stretching the concept of personal data (sometimes to the point of breaking, or perhaps rather of becoming void of meaning)[…]”
Information & data behaviour • ‘Right to be forgotten’ case: • “You can't make digital information be deleted simply because you believe that's what happens to information. The whole concept of information has changed.” • Digital information mediates our lives • It’s dynamic, instead of static
The social impact of “big data” • “[…] an algorithm is only as good as the data it works with. Data mining can inherit the prejudices of prior decision-makers or reflect the widespread biases that persist in society at large. Often, the “patterns” it discovers are simply preexisting societal patterns of inequality and exclusion. Unthinking reliance on data mining can deny members of vulnerable groups full participation in society. […]” • S. Barocas & A. D. Selbst, Big Data’s Disparate Impact.
Conflicting mindsets • Engineering mindset: efficiency, data gathering/analysis, scalability • Legal/ethical/social/civil-society mindset: civil liberties, freedom, etc. • What does the Internet mean for us? • Worries about decontextualisation
Conflicting mindsets • Political philosophy question: • How can we think about freedom in a technically mediated world? • How can we regulate our conception of privacy and freedom? • Are law and policy sufficient and/or vital? • If so, how can law be implemented, enforced, made meaningful, etc.?
Conflicting mindsets • “With greater conceptual clarity in understanding the meaning and value of privacy, we can better tackle the difficult task of protecting privacy in the Information Age.”
Data protection basics Fair Information Practice Principles, OECD Privacy Principles • Data quality – relevant, accurate & up-to-date • Collection – limited, lawful & fair; with consent or knowledge • Purpose specification at the time of collection • [Notice of purpose and rights at the time of collection implied] • Uses & disclosures limited to the purposes specified, or compatible ones • Security through reasonable safeguards • Openness regarding personal data practices • Access – individual right of access • Correction – individual right of correction • Accountability – a data controller responsible for compliance • Data minimisation – collect no more than is needed • Data retention limits – tied to the purpose • Sensitive information – religion, political opinions, sexual orientation, etc.
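Several of these principles read as engineering requirements as much as legal rules. Below is a minimal, hypothetical Python sketch of how purpose specification and data minimisation might be enforced at the point of collection; the purposes, field names and policy structure are invented for illustration and are not drawn from any particular law or from the slides.

```python
# Hypothetical sketch: enforce purpose specification and data minimisation
# at collection time. All purposes and field names are invented examples.

DECLARED_PURPOSES = {
    "order_fulfilment": {"name", "shipping_address", "email"},
    "newsletter": {"email"},
}

def minimised_collection(purpose: str, requested_fields: set[str]) -> set[str]:
    """Return only the fields permitted for the stated purpose."""
    allowed = DECLARED_PURPOSES.get(purpose)
    if allowed is None:
        # Purpose specification: no collection without a declared purpose.
        raise ValueError(f"No purpose specified: {purpose!r}")
    excess = requested_fields - allowed
    if excess:
        # Data minimisation: fields outside the purpose are never collected.
        print(f"Dropping fields outside declared purpose: {sorted(excess)}")
    return requested_fields & allowed

# A sign-up form asking for more than the newsletter purpose needs:
print(minimised_collection("newsletter", {"email", "date_of_birth", "phone"}))
# -> {'email'}
```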
Data protection basics • Threshold for personal data (EU) and personally identifiable information (US) • “Once information qualifies as identified or identifiable, it falls under the data protection regime. At that moment, a full suite of obligations and protections is triggered.”
United States Privacy Law • “In the United States, privacy law focuses on redressing consumer harm and balancing privacy with efficient commercial transactions.” • Sectoral approach: • Inherently contextual, • But incomplete regulatory maze, • Protected data types differ per sector.
European Privacy Law • Single & comprehensive framework • Focus on data subject ‘control’ • Three fallacies (B-J Koops) • Informational self-determination • Too much faith in controller • Comprehensive approach
European Privacy Law • “What the statutes describe and how the courts interpret this has usually only a marginal effect on data-processing practices.” • “[…] diverges from the reality of 21st-century data-processing practices.”
European Privacy Law • Fallacy 1: Informational self-determination • Consent as the main legal ground has no real practical meaning, • Practical vs. meaningful consent, • Difficult to enforce, • Cannot be used for many government information requirements.
European Privacy Law • Fallacy 2: Faith in the controller • The goals of the law are not achieved in practice, • Ex ante ‘data protection by design’ tends to produce legal compliance checklists, • Data protection then lives on mainly as burdensome bureaucracy, • The law’s logic and rationale must be understood, rather than met with a compliance mindset.
European Privacy Law • Fallacy 3: Comprehensive • Law in the books vs. law in action • Data protection laws did little to curb ‘big data’ developments • Data minimisation particularly important principle • But can we really say that it is meaningfully practiced?!
Instability of current system • Differences raise: • Compliance costs, • Technical challenges for software and service design, • Bilateral trade issues. • “It is clear that if EU citizens do not have the same level of protections as the US citizens, because of the practices of the US intelligence services and the lack of effective protections, they will become the first victims of these systems.”
Policy vacuum? • Given the international instability and 21st-century data-processing techniques, a policy vacuum appears, • How do we conceptualise and formulate good policies?
Information Privacy Theory • Some examples of how scholars currently think about privacy, • Interrelated, not necessarily mutually exclusive, • Brief overviews of some of the most important points.
Information Privacy Theory • “The right to be let alone” (Warren and Brandeis, 1890), • “The claim of persons to determine for themselves when, how, and to what extent information about them is communicated to others” (Westin 1967), • “[...] in the context of modern data processing, [...] respect the capacity of the individual to determine in principle the disclosure and use of his/her personal data. Limitations to this informational self-determination are allowed only in case of overriding public interest” (German Constitutional Court, 1983), • “Information privacy theory constantly evolves with the introduction and implementation of new information technologies” (Solove, 2012).
Contextual integrity – Helen Nissenbaum • “[…] holds the source of this anxiety to be neither in control nor secrecy, but appropriateness.” • “Specifically, technologies, systems, and practices that disturb our sense of privacy are those that have resulted in inappropriate flows of personal information.” • “Inappropriate information flows are those that violate context specific informational norms […]”
Contextual integrity – Helen Nissenbaum • Violation of information privacy can occur when information moves across contexts. • Context-relative informational norms, where the flow and use of specific information is considered to be inappropriate: • Actors (subject of information, capacity of recipient and power-balance with regards to the sender); • Attributes (data types of information); • Transmission principles (constraints/rules under which information flows).
Contextual integrity – Helen Nissenbaum • These parameters should be imagined as “[…] juggling balls in the air, moving in sync: contexts, subjects, senders, receivers, information types, and transmission principles.”
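The privacy-engineering literature has tried to make this framework operational. Purely as an illustration (not Nissenbaum’s own formalisation), a minimal Python sketch of how a context-relative informational norm, with its actors, attribute and transmission principle, could be represented and checked against a concrete flow; all contexts, roles and norms below are invented.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """One concrete information flow, described by Nissenbaum's parameters."""
    context: str                  # e.g. "healthcare"
    sender: str                   # capacity in which the sender acts
    recipient: str                # capacity of the recipient
    subject: str                  # whom the information is about
    attribute: str                # data type, e.g. "diagnosis"
    transmission_principle: str   # constraint under which the flow occurs

# Invented context-relative norms: which (sender, recipient, principle)
# combinations are appropriate for a given context and attribute.
NORMS = {
    ("healthcare", "diagnosis"): {("patient", "physician", "confidentiality")},
    ("workplace", "salary"): {("employer", "employee", "need to know")},
}

def violates_contextual_integrity(flow: Flow) -> bool:
    """A flow violates contextual integrity if no entrenched norm permits it."""
    permitted = NORMS.get((flow.context, flow.attribute), set())
    return (flow.sender, flow.recipient, flow.transmission_principle) not in permitted

# A diagnosis flowing from a physician to an employer breaks the healthcare norm.
leak = Flow("healthcare", "physician", "employer", "patient",
            "diagnosis", "commercial exchange")
print(violates_contextual_integrity(leak))  # True
```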
Social justification – Daniel Solove (based on John Dewey) • “Framing privacy exclusively in individualistic terms often results in privacy being undervalued in utilitarian balancing […]” • “[…] rights should be valued based on ‘the contribution they make to the welfare of the community.’”
Social justification – Daniel Solove (based on John Dewey) • “Privacy is a set of protections against a related set of problems. These problems are not all related in the same way, but they resemble each other.” • “[…] pluralistic concept with social value.” • “[…] itself a form of social control that emerges from the norms and values of society.”
Social justification – Daniel Solove (based on John Dewey) • “[…] the value of protecting the individual is a social one.” • “A society without privacy protection would be suffocating, and it might not be a place in which most would want to live.” • “Even when it protects the individual, it does so for the sake of society.”
Engineering Privacy – S. Gürses, C. Troncoso, C. Diaz • Legal privacy concepts are too vague, • Need to activate them in systems: • Elicit and analyse privacy concerns, • Translate into functional requirements, • Develop designs, implement, test.
Engineering Privacy – S. Gürses, C. Troncoso, C. Diaz • The state of the art in computing defies the imagination of average persons, • Electronic systems may provide the same functionality, • But still offer more in terms of anonymity, • Protection may not be complete, but it requires far more effort to breach.
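As a hypothetical illustration of that point (not a technique taken from the paper), keyed pseudonymisation lets a system keep the same functionality, such as counting distinct users, while raising the effort needed to re-identify anyone; the key handling and data below are invented, and the protection is deliberately not presented as complete.

```python
import hashlib
import hmac

# Hypothetical sketch: replace raw identifiers with a keyed hash (HMAC) so the
# system can still link records per user, but re-identification requires the
# secret key or auxiliary data. A higher barrier, not a guarantee.

SECRET_KEY = b"store-and-rotate-this-key-separately"  # invented placeholder

def pseudonym(user_id: str) -> str:
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

events = [
    ("alice@example.org", "login"),
    ("bob@example.org", "login"),
    ("alice@example.org", "purchase"),
]

# Same functionality (per-user counts) without storing raw email addresses.
counts: dict[str, int] = {}
for user, _action in events:
    key = pseudonym(user)
    counts[key] = counts.get(key, 0) + 1

print(len(counts))  # 2 distinct (pseudonymous) users
```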
Engineering Privacy – S. Gürses, C. Troncoso, C. Diaz • Computational techniques + comprehension of social, political, and economic conceptions of privacy (and surveillance) • Necessary in order to grasp the problems for systems design. • The combination cannot be replaced by mere technical improvements.
PII 2.0 – P.M. Schwartz & D.J. Solove • Three tiers, instead of two: • (1) identified, • (2) identifiable – not all laws apply, unless…, • or (3) non-identifiable person
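A minimal, hypothetical sketch of how three tiers might be expressed in code; the risk thresholds and the obligations attached to each tier are invented for illustration, not taken from Schwartz & Solove’s proposal.

```python
from enum import Enum

class Tier(Enum):
    IDENTIFIED = "identified"              # full suite of obligations
    IDENTIFIABLE = "identifiable"          # a reduced set of obligations
    NON_IDENTIFIABLE = "non-identifiable"  # largely outside the regime

# Invented mapping: which obligations attach to which tier.
OBLIGATIONS = {
    Tier.IDENTIFIED: {"notice", "consent", "access", "security", "data quality"},
    Tier.IDENTIFIABLE: {"security", "data quality"},
    Tier.NON_IDENTIFIABLE: set(),
}

def classify(identification_risk: float) -> Tier:
    """Toy thresholds: classify a record by its estimated re-identification risk."""
    if identification_risk >= 0.9:
        return Tier.IDENTIFIED
    if identification_risk >= 0.1:
        return Tier.IDENTIFIABLE
    return Tier.NON_IDENTIFIABLE

print(classify(0.5), OBLIGATIONS[classify(0.5)])
# -> Tier.IDENTIFIABLE with {'security', 'data quality'} (set order may vary)
```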
Path Dependency? • Collingridge Dilemma: • “When change is easy, the need for it cannot be foreseen; when the need for change is apparent, change has become expensive, difficult, and time-consuming.” • D. Collingridge, The Social Control of Technology
Discussion • What will freedom mean? • How do we regulate it? • Is there a need for change? • Are we on the right path? • If not, how do we get to the right path?