Privacy Technology: Can we afford to wait for the future?
CACR 3rd Annual Privacy and Security Workshop, November 7-8, 2002
Peter Hope-Tindall, Privacy Architect (pht@dataPrivacy.com)
Privacy by Design™
Privacy
• “the right to exercise control over your personal information.” (Ann Cavoukian)
• “the right to be let alone” (Warren & Brandeis)
• “Privacy is at the heart of liberty in the modern state.” (Alan Westin)
Defining Privacy • The traditional legal and societal test has been framed around the “reasonable expectation of privacy.” • Often used to justify the removal of privacy. • Fails to fully capture and express the complex nature of privacy.
Example: CCTV in Public Spaces • Most commentators would suggest we have “no reasonable expectation of privacy” in public • This is used as justification for CCTV and other tracking. It is not that simple. In a public place I am: • Observable – with an expectation that I will be observed • Anonymous/Pseudonymous – with an expectation that I will remain so • May or may not be Linkable
Marketplace • Different definitions of Privacy • Privacy confused with Security (sometimes intentionally) • Privacy as a Marketing tool • Must give up Privacy to improve Security • Technology solutions looking for a problem • Snake Oil • Little in the way of Standards for Products and Services • Nothing in the way of Certification & Testing
Present-day Problems • Traditional IT dogma has encouraged the collection of information. • Opportunistic database design • Driven by hardware/software limitations of the past • Creates other problems (if you collect it, you need to protect it!) • Privacy-aware IT will discourage the collection of information. • Minimalist database and system design • Justification required for each item collected (see the sketch below)
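A minimal sketch of the contrast, assuming a hypothetical order-taking system; the field names and the justification rule are illustrative only, not from the original deck:

```python
# Hypothetical illustration: opportunistic vs. minimalist collection.

# Opportunistic design collects everything "while we're at it";
# every field collected is a field that must then be protected.
OPPORTUNISTIC_SCHEMA = [
    "name", "date_of_birth", "home_address", "phone",
    "income", "marital_status", "purchase",
]

# Minimalist design records a justification for each field, tied to the
# stated purpose; anything without one is simply never collected.
MINIMALIST_SCHEMA = {
    "purchase": "needed to fulfil the order",
    "shipping_address": "needed to deliver the order",
}

def justify_collection(field: str) -> str:
    """Return the recorded justification, or refuse the collection."""
    if field not in MINIMALIST_SCHEMA:
        raise ValueError(f"no justification on file for collecting {field!r}")
    return MINIMALIST_SCHEMA[field]
```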
Security and Privacy
• Privacy: data protection – FIPs (not FIPS)
• Security: authentication, data integrity, confidentiality, access controls, non-repudiation
n.b. FIPs: Fair Information Practices; FIPS: Federal Information Processing Standards
Security vs. Privacy
Privacy:
• Accountable to the data subject.
• Capabilities-based assessment (is it possible?)
• Access and use controls defined by use limitation, the consent of the data subject, and legislation.
• Protecting against outsiders, insiders and the system owner.
Security:
• Accountable to the President/CEO/Board of Directors.
• Risk-based assessment (how likely is it?)
• Access and use controls defined by the system owner.
• Has been focused on protecting against outsiders.
Different Approaches to Privacy: Build in elements of personal Consent and Control • Central Repository/Decision Model – a rule-based or heuristic Privacy Model; EPM • Divide and Conquer – strategic pseudonymisation/anonymisation (a sketch follows below) • Smart Hardware – Privacy Rules Embedded in Hardware • Smart Data – Encapsulate Methods inside the data
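A minimal sketch of the “divide and conquer” approach, assuming a hypothetical two-store design in which identity attributes and transactional attributes are joined only through a keyed pseudonym; the key handling and all names are illustrative:

```python
import hashlib
import hmac

# Identity data and transactional data live in separate stores, joined
# only through a keyed pseudonym; the key is an assumption here and
# would be held by a dedicated pseudonymisation service in practice.
PSEUDONYM_KEY = b"key-held-by-the-pseudonym-service"

def pseudonym(true_id: str) -> str:
    """Derive a stable pseudonym from a true identifier with a keyed MAC."""
    return hmac.new(PSEUDONYM_KEY, true_id.encode(), hashlib.sha256).hexdigest()

identity_store: dict = {}     # pseudonym -> identity attributes (tightly held)
transaction_store: list = []  # transactions carry only the pseudonym

def register(true_id: str, name: str) -> None:
    identity_store[pseudonym(true_id)] = {"name": name}

def record_transaction(true_id: str, amount: float) -> None:
    transaction_store.append({"who": pseudonym(true_id), "amount": amount})

register("SIN-123-456-789", "Alice")
record_transaction("SIN-123-456-789", 19.95)
# The transaction store alone reveals behaviour but not identity;
# re-identification requires both stores and the key.
```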
Privacy Enhancing Technologies • Anonymization/Pseudonymization Tools • Proxies / Intelligent Agents • Firewalls / Filters • Privacy Labels • Onion Routing • Policy Tools • Encryption Tools & Services
What is Privacy Architecture? • Initial View (diagram: Privacy and Security shown alongside the Application, Technology, Data and Network architectures)
Reality (diagram: Privacy, Security, Application, Technology, Data and Network architectures)
Privacy Framework/Strategy • Policy (PIA) • Technology (Privacy Architecture)
Privacy Framework • Summary of legislation, practices, directives and policies; high-level overview of the proposed system • Customized • Best practices • The document can be used as an early demonstration of good faith and approach • Privacy chapter in the RFP
PIA: Privacy Impact Assessment • Diagnostic Tool • Identifies Issues • May Respond to Issues with non-technical solutions • May identify Issues to be resolved in Privacy Architecture • Active and Passive: Introduce elements of individual consent and control
Privacy Architecture • Diagnostic Tool • Identifies Issues & Options • May Respond to Issues with technical solutions • May identify Issues to be resolved in PIA/Policy • Active and Passive: Introduce elements of individual consent and control
What is Privacy Architecture? • Allow technical privacy problems identified in other architectures to be overcome. • Bring together the privacy components of all architectures in a single Privacy Chapter in the design book. (This can then be presented as the ‘Technical Privacy Design’ of an entire project.) • Look for opportunities (technical, in an active manner) for the introduction of privacy-enhancing components, which will tend to introduce elements of consent and individual control into the technical architecture.
What is Privacy Architecture? • Look for opportunities (technical, in a responsive manner) for the introduction of compensating components: in response to issues raised during conceptual and logical design, to issues identified in a PIA, and to policy decisions made. • Provide privacy oversight and expertise to the architectural development sessions and the definition of terms, and participate in the foundational grounding of all of the architecture areas.
What is Privacy Architecture? Summary • Technical Privacy Leadership • Focal Point for Privacy • Responsive Role • Active Role • Educational Role
Problems with the Traditional PIA • Often encourages a ‘compliance mentality’ • The point of pain may become the point of no solution • Risk that issues may be reported and then forgotten • Emphasizes policy and legislative solutions, not technical solutions • Problems integrating with the IT Architecture group
How do we measure success? • Identity – measures the degree to which information is personally identifiable. • Linkability – measures the degree to which data tuples or transactions are linked to each other. • Observability – measures the degree to which identity or linkability may be affected by the use of a system. With thanks and apologies to the Common Criteria
Identity (nymity) • Measures the degree to which information is personally identifiable. The spectrum runs from least to most identifiable: • Anonymity – “without name”; the quality or state of being unknown • Non-Reversible Pseudonymity / Reversible Pseudonymity – from Greek pseudonumon, neuter of pseudonumos, “falsely named” • Verinymity – from Latin verus, “true”; truly named
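One way to make the nymity spectrum concrete in code; the ordering comes from the slide, while the numeric scale and the comparison helper are assumptions added for illustration:

```python
from enum import IntEnum

class Nymity(IntEnum):
    """The nymity spectrum, ordered from least to most identifiable."""
    ANONYMITY = 0                    # without name; state of being unknown
    NON_REVERSIBLE_PSEUDONYMITY = 1  # false name; cannot be mapped back
    REVERSIBLE_PSEUDONYMITY = 2      # false name; mapping back is possible
    VERINYMITY = 3                   # true name

def more_private(a: Nymity, b: Nymity) -> Nymity:
    """Return whichever design point leaks less identity."""
    return min(a, b)

assert more_private(Nymity.VERINYMITY, Nymity.ANONYMITY) is Nymity.ANONYMITY
```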
Linkability • This metric requires n data elements, where n > 1. • Measures the degree to which data elements are linked to each other. (Identity measurement can be thought of as the degree to which data elements are linkable to the verinym, or true name, of the data subject.) • Unlinkability: it cannot be determined which sets of transactions belong with each other. • Full Linkability: it may be fully determined which sets of transactions belong with each other (for example, transactions belonging to the same individual).
Linkability • The requirements for unlinkability are intended to protect the user against the profiling of operations. For example, when a telephone smart card carries a unique number, the telephone company can determine the behaviour of the user of that card. Hiding the relationship between different invocations of a service or accesses of a resource prevents this kind of information gathering. • Unlinkability requires that different operations cannot be related. The relationship can take several forms: the user associated with the operation, the terminal which initiated the action, or the time the action was executed. • The primary solution to linkability is generally the token-based approach, with an awareness of other factors (time, location, message contents – which we refer to as observability) that could also tend to allow transactions to be linked. In addition, approaches such as message padding and ‘salting’ are employed to prevent data matches (see the sketch below).
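A sketch of the token-based approach and of salting, using the telephone-card example from the slide; this illustrates the general technique under assumed record layouts, not any particular product:

```python
import hashlib
import secrets

def linkable_record(card_number: str, callee: str) -> dict:
    # One fixed identifier on every call record: the full set of records
    # for a card can be tied together into a behavioural profile.
    return {"card": card_number, "callee": callee}

def unlinkable_record(callee: str) -> dict:
    # A fresh random token per transaction: individual records can no
    # longer be related to one another through the identifier.
    return {"token": secrets.token_hex(16), "callee": callee}

def salted_digest(value: str) -> str:
    # Salting a stored field defeats pre-computed (dictionary) matches:
    # the same value produces a different digest in every record.
    salt = secrets.token_bytes(16)
    return salt.hex() + ":" + hashlib.sha256(salt + value.encode()).hexdigest()
```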
Observability • Measures the degree to which identity or linkability may be affected by the use of a system. • Non-Observability: nothing can be inferred from the record of the use of a system; no record is made of the use of resources, location or transactions. • Full Observability: identity or linkability can be inferred from the record of the use of a system; a full audit record is made of the use of resources, location or transactions.
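A sketch of the two poles, assuming a hypothetical resource-access system: the fully observable record captures who, what and when, while the low-observability variant keeps only an anonymous tally; whether aggregates satisfy a given audit requirement is a separate design decision:

```python
from collections import Counter
from datetime import datetime, timezone

usage_counts: Counter = Counter()

def log_full(user: str, resource: str) -> dict:
    # Full observability: identity, resource and time are all recorded,
    # so identity and linkability can be inferred from the audit trail.
    return {"user": user, "resource": resource,
            "at": datetime.now(timezone.utc).isoformat()}

def log_minimal(resource: str) -> None:
    # Low observability: only an anonymous per-resource tally is kept;
    # nothing about any individual use can be inferred from it.
    usage_counts[resource] += 1
```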
(diagram: the three metrics – Identity, Linkability, Observability)
Target… • Decrease Identity • Decrease Linkability • Decrease Observability
De-identify De-Link
Let’s Simplify • Simple artifacts that can be utilized anywhere within the architecture (hypothetical interfaces are sketched below): • De-identification Service • De-linking Service • De-observability Service • Consent Collection Service • Consent Verification Service
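Hypothetical interfaces for these artifacts; the names and signatures below are assumptions sketched for illustration, not a published API:

```python
from abc import ABC, abstractmethod

class DeIdentificationService(ABC):
    @abstractmethod
    def de_identify(self, record: dict) -> dict:
        """Strip or transform the fields that make a record identifiable."""

class DeLinkingService(ABC):
    @abstractmethod
    def de_link(self, records: list[dict]) -> list[dict]:
        """Remove the fields that allow records to be tied together."""

class DeObservabilityService(ABC):
    @abstractmethod
    def scrub_audit(self, event: dict) -> dict | None:
        """Reduce what a usage record reveals; None drops it entirely."""

class ConsentCollectionService(ABC):
    @abstractmethod
    def collect(self, subject_id: str, purpose: str) -> bool:
        """Ask the data subject for consent to a stated purpose."""

class ConsentVerificationService(ABC):
    @abstractmethod
    def is_consented(self, subject_id: str, purpose: str) -> bool:
        """Check that consent for this purpose is on record."""
```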
Summary • Objective metric • Encourages a multi-discipline approach • Allows the privacy success of new measures to be quantified, even with today’s non-optimal technology • Allows the privacy impact of new measures to be minimized • Allows iteration and improvement
Success Considerations • Open discussion – comes naturally to technologists, but not always to government or liability-conscious companies • Technology is not evil, despite what some would have us believe • Statutory protection • Develop the best technology and the best policy • Search for improvement • It’s not easy: privacy without tools/technologies is hard • Technology, law and policy/practices: we need all three!
Concerns • Lawful Access – public safety & privacy • Privacy-sensitive projects • Infrastructure with surveillance opportunity • Smart Cards/PKI • Biometrics • Data aggregation (physical or logical) • Federated data warehouse • Where auditability requires identity, Reversible Pseudonymity is an option • Cryptographic key for identity resolution in the custody of an oversight body (a sketch follows below)
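A sketch of reversible pseudonymity with the resolution key in the custody of an oversight body; it uses the third-party cryptography package's Fernet construction as a stand-in, and the key handling is deliberately simplified:

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# The pseudonym is the encryption of the true identifier; only the key
# custodian (the oversight body) can perform identity resolution. In a
# real split the operational system would encrypt to the custodian's
# public key; this symmetric sketch keeps both halves in one place.
oversight_key = Fernet.generate_key()

def make_pseudonym(true_id: str) -> bytes:
    return Fernet(oversight_key).encrypt(true_id.encode())

def resolve_identity(pseudonym: bytes) -> str:
    # Invoked only by the oversight body, e.g. when auditability
    # genuinely requires identity.
    return Fernet(oversight_key).decrypt(pseudonym).decode()

p = make_pseudonym("driver-licence-A1234")
assert resolve_identity(p) == "driver-licence-A1234"
```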
Recommendations • Build an accurate data and system model • Attempt to align privacy and security • PETs • Honest threat models • Make anonymity and pseudonymity the default wherever possible • In case of impasse (the choice of last resort): ensure that privacy-invasive security actually helps • Raise the bar
Interesting Technology • Biometric Encryption • Digital Credentials – Stefan Brands • www.credentica.com • “PKI Lite” – PKI primitives without requiring all of the trust and cross-certification questions to be answered.
Resources • http://www.privacyarchitecture.com • Stefan Brands, “Rethinking Public Key Infrastructures and Digital Certificates: Building in Privacy,” MIT Press, August 2000, ISBN 0-262-02491-8 • http://www.ipc.on.ca • Roger Clarke: http://www.anu.edu.au/people/Roger.Clarke/
Contact Information Peter Hope-Tindall dataPrivacy Partners Ltd. 5744 Prairie Circle Mississauga, ON L5N 6B5 Phone: +1 (416) 410-0240 E-Mail: pht@dataprivacy.com