Privacy Technology Analysis and Mechanisms David Chaum
Privacy is fundamentally important!!! • Is essential for democracy • Needed for participation without fear of retribution • Is a fundamental human right
OUTLINE—“Privacy Technology” • Analysis: Policy, Economic • Solution Mechanisms: Legal, Technological
Policy Analysis The actors and macro considerations
Hierarchy of IT Needs of Humans • Self-Worth—relation to: artificial intelligence, etc. • Privacy—identity, credential & role protection • Interaction—communication, exploration, commerce • Security—uptime, robustness, no hacking • Processing—storage, interface, crunching Maslow’s Hierarchy of Needs
Economic Analysis These days, everybody’s an economist!
Monetizing privacy • Various schemes proposed (even 20+ years ago) • Consumers pay for privacy protection services • Consumers are paid for use of their privacy-related data • A brokerage of privacy related data
Imbalance in desire for privacy/data • Individuals discount the present value of privacy protection in transactions • Explains consumers’ anomalous behavior when confronted with cost or inconvenience • Practices and potential dangers are unknown to them • Organizations value personal data • Overestimate the future potential of data • Discount their own exposure • An organization is not very concerned about dangers posed to consumers that it is not accountable for
Imbalance in size/power of entities • Organizations have lots of leverage • There are few sources of mass products and services • Consumers don’t have much choice for many products or services • High relative cost for consumers of changing practices
Legal mechanisms Powerful but don’t work well directly
Legal mechanisms—evolution • Originally based on codifying legitimate expectation of privacy • People should be able to review and amend data • No erosion of privacy due to technology • Best privacy protection practical
Legal mechanisms—capabilities • Accountability after the fact is ineffective • Hardly able to address • Covert/clandestine abuse • Abuse of public or leaked data • Corporate shield • Undoing damage done to people • Can cause creation and use of infrastructure
Technological Mechanisms The directly-effective mechanism
Locus of privacy-related control—the critical architectural choice: organization vs. infomediary vs. individual
Locus of control—Three choices: • At organizations • Weak benefit/effect for consumers • Clandestine abuse, leaks, reversibility… • Mollify/diffuse the issue – prevent effective solutions • At an intermediary • Create infrastructure with single point of failure • Full cost but little true benefit • Dangerous concentration • At the individual • Privacy technology – the only good solution
Old paradigm—assumptions/model proven false! • Believed to be a zero-sum game, privacy v. security • ID believed needed for security against abuse by individuals • ID believed only way to organize data
New paradigm • Individuals provide organizations with minimum sufficient information and proof of its correctness
Privacy Technology Win-Win break of the believed tradeoff
Feasibility of a comprehensive solution set has been proven • Payments—eCash payments deployed by major banks on 4 continents • Communication—Mix nets, onion routing, etc. have been widely deployed • Credentials—mechanisms implemented on cards and by IBM
Benefits to organizations (micro) • Reduced exposure/liability • Better data • Cleaner because less deception and garbage • More willingness to provide data because of protections • All organizations get the data; level playing field • Better public image (?) – probably wrong!
Not easy to get there from here • Requires lots of users (hard to be anonymous alone!) • Difficult to get the system “primed” • Consumers don’t want to pay costs • Organizations tend to resist change
Really an “infrastructure issue” • Pseudonymity / Anonymity only “in numbers” (as mentioned) • Communication infrastructure can nullify protections • Way to share data pseudonymously is infrastructure
CONCLUSION A “Privacy Technology” infrastructure is the way to go and would be hugely beneficial
[Chart: Kinds of Privacy for Payments—payment systems plotted by technology/time (horizontal) vs. privacy/consumer-control (vertical), across four bands: no privacy, false privacy, organization-controlled privacy, and consumer-controlled privacy. Labels include: protection only from merchant; credit cards on the Internet; eCash™; buy/reload card without identification; advertise consumer privacy; stored-value cards; pre-paid phone cards; government payments, e.g. transfer-order systems; bank notes & coins.]
[Chart: Consumer Payments Market Space—payments mapped as scheduled vs. irregular and low value vs. high value, with a threshold around $10.]
Electronic Cash • You can buy a digital “bearer” instrument from a bank with funds in your account • You can pay by giving the instrument to the payee, who deposits it to an account
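The bearer instrument above rests on Chaum-style blind signatures, the primitive underlying eCash: the bank signs a coin without ever seeing it. A minimal sketch with textbook-sized RSA parameters; all the numbers below are illustrative assumptions and far too small to be secure.

```python
# Toy RSA blind signature, the primitive behind eCash.
# Textbook-sized parameters for illustration only -- NOT secure.
p, q = 61, 53
n = p * q                        # RSA modulus
phi = (p - 1) * (q - 1)
e = 17                           # bank's public exponent
d = pow(e, -1, phi)              # bank's private exponent

m = 99                           # stands in for the coin's serial number
r = 23                           # blinding factor chosen by the customer

# Customer blinds the coin before sending it to the bank.
blinded = (m * pow(r, e, n)) % n

# Bank signs the blinded value without ever seeing m.
signed_blinded = pow(blinded, d, n)

# Customer unblinds: this yields the bank's signature on m itself.
signature = (signed_blinded * pow(r, -1, n)) % n

# Anyone can verify the signature against the bank's public key.
assert pow(signature, e, n) == m
```

Because the bank only ever saw the blinded value, it cannot later link the spent coin back to the withdrawal, which is exactly the untraceability property of the bearer instrument.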
Privacy and Control over Payments • Nobody can learn without your cooperationwho you pay, how much you pay, or when • You can always prove who received any payment, for how much, and when • Payments can only be made by you and they cannot be stopped by others
Credential Mechanisms • You deal with each organization under a distinct “digital pseudonym”—a public key whose corresponding private key only you know • You obtain a “credential” as a digital signature formed on one of your digital pseudonyms • You answer the queries you choose to by proving you have sufficient credentials
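The pseudonym-plus-signature idea can be sketched as follows. This is a simplified stand-in (plain RSA signing of a pseudonym value), not the full blind, per-organization credential protocol; the issuer's parameters and the pseudonym value are illustrative assumptions.

```python
# Hedged sketch: a pseudonym is a public key, and a credential is the
# issuer's digital signature formed on that pseudonym.
# Toy textbook RSA for the issuer -- illustration only, NOT secure.
p, q = 61, 53
n = p * q
phi = (p - 1) * (q - 1)
e = 17                           # issuer's public verification key
d = pow(e, -1, phi)              # issuer's private signing key

pseudonym = 1234                 # stands in for the user's public key (< n)

# Issuer grants a credential by signing the pseudonym.
credential = pow(pseudonym, d, n)

# Any organization can check the credential against the issuer's key.
def has_credential(pseudonym, credential, e, n):
    return pow(credential, e, n) == pseudonym

assert has_credential(pseudonym, credential, e, n)
```

In the real mechanism the signature is issued and shown in blinded, transformed form, so the same credential can be demonstrated under different pseudonyms without linking them; the sketch only shows the sign-and-verify core.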
Wallet with Observer • A tamper-resistant chip, issued by a trusted authority, is carried by the individual • But the chip can only talk to the outside world through the person’s PC/PDA • The two devices perform a multiparty computation and thus speak to the outside world with a common voice
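One way to picture the “common voice” is joint signing from split key shares: neither device can act alone, and each output requires both to cooperate. A toy sketch using additive sharing of an RSA signing exponent—an illustration of the idea only, not Chaum's actual observer protocol; all parameters are assumptions.

```python
import random

# Toy textbook RSA -- illustration only, NOT secure.
p, q = 61, 53
n = p * q
phi = (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)

# The signing exponent is split: the observer chip holds one share,
# the person's PC/PDA holds the other. Neither share alone reveals d.
d_observer = random.randrange(1, phi)
d_pc = (d - d_observer) % phi

m = 42                            # stands in for the message to sign

# Each device computes a partial signature with its own share...
partial_obs = pow(m, d_observer, n)
partial_pc = pow(m, d_pc, n)

# ...and only their combination yields the wallet's single signature.
signature = (partial_obs * partial_pc) % n
assert pow(signature, e, n) == m
```

The routing constraint on the slide (the chip speaks only through the PC) means the PC can refuse to forward anything the user did not approve, while the chip can refuse to co-sign anything that breaks the authority's rules—each device checks the other.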
How untraceable sending works • Messages 1–4 enter a mix network • The “mix” server decrypts and re-orders its inputs
[Diagram: prevents tracing message 2 back through the mix]
Cascade of three Mixes—Server 1 (PK1), Server 2 (PK2), Server 3 (PK3)
Encryption of message: Ciphertext = EPK1[EPK2[EPK3[message]]]
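The layering above can be sketched directly. A toy reversible cipher stands in for real public-key encryption under PK1–PK3 (real mixes use hybrid public-key cryptography); the keys and message below are illustrative assumptions.

```python
# Hedged sketch of the layered "onion" a sender prepares for a
# three-mix cascade: Ciphertext = EPK1[EPK2[EPK3[message]]].
def E(key, data):
    # Toy "encryption" under one mix's key: tag with the key byte,
    # then XOR every payload byte with it.
    return bytes([key]) + bytes(b ^ key for b in data)

def D(key, data):
    # Toy "decryption": only the holder of the matching key strips a layer.
    assert data[0] == key, "wrong key for this layer"
    return bytes(b ^ key for b in data[1:])

k1, k2, k3 = 0x17, 0x2A, 0x55        # keys of Servers 1, 2, 3

message = b"meet at noon"
# Innermost layer is for Server 3; Server 1's layer goes on last.
ciphertext = E(k1, E(k2, E(k3, message)))

# Servers 1, 2, 3 each strip exactly one layer, in cascade order.
assert D(k3, D(k2, D(k1, ciphertext))) == message
assert ciphertext != message
```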
Processing the messages—Server 1, Server 2, and Server 3 each decrypt and permute the batch (m1, m2, m3) in turn
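The decrypt-and-permute step can be sketched as follows: each mix strips one layer from every message in its batch, then re-orders the batch so an observer cannot match an output to its input. The XOR function is again a toy stand-in for real decryption; keys and messages are assumptions.

```python
import random

def strip_layer(key, data):
    # Toy stand-in for decrypting one layer under a mix's private key.
    return bytes(b ^ key for b in data)

def mix_server(key, batch):
    # Decrypt one layer of every message, then shuffle the batch:
    # the permutation is what breaks input/output linkability.
    out = [strip_layer(key, m) for m in batch]
    random.shuffle(out)
    return out

k1, k2, k3 = 0x17, 0x2A, 0x55
msgs = [b"m1", b"m2", b"m3"]

# Each sender pre-wraps a message in three layers (Server 1's outermost).
batch = [strip_layer(k1, strip_layer(k2, strip_layer(k3, m))) for m in msgs]

for key in (k1, k2, k3):             # Servers 1, 2, 3 process in cascade
    batch = mix_server(key, batch)

# All messages emerge intact, but their ordering reveals nothing.
assert sorted(batch) == sorted(msgs)
```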
[Diagram: tracing prevented by any one mix—even a single honest server among Servers 1–3 hides which input corresponds to m3]
NSDD 145 & Data Mining—IAO: The Information Awareness Office (IAO) develops and demonstrates information technologies and systems to counter asymmetric threats by achieving total information awareness useful for preemption, national security warning and national security decision-making. John Poindexter, national security adviser to former President Reagan, is the director of the new agency. He was a controversial figure both for his role in the Iran-contra scandal and for his efforts to assert military influence over commercial computer security technologies.