Explore the requirements and challenges of anonymity, unobservability, pseudonymity, and identity management in the context of an Ambient Intelligence (AmI) world. This presentation discusses the protection goals, threats, and protection measures against unauthorized access, modification, and withholding of information. It also argues that modern societies are organized by cooperation on the basis of mutual distrust rather than trust.
Anonymity, unobservability, pseudonymity and identity management requirements for an AmI world. Andreas Pfitzmann, Dresden University of Technology, Department of Computer Science, D-01062 Dresden. Phone: 0351/463-38277, e-mail: pfitza@inf.tu-dresden.de, http://dud.inf.tu-dresden.de/
Excerpts from: Treaty Establishing a Constitution for Europe Article I-2 The Union's values The Union is founded on the values of respect for human dignity, freedom, democracy, equality, the rule of law and respect for human rights, including the rights of persons belonging to minorities. ... Article I-3 The Union's objectives 2. The Union shall offer its citizens an area of freedom, security and justice without internal frontiers, and an internal market where competition is free and undistorted.
Excerpts from: Treaty Establishing a Constitution for Europe Article II-68 Protection of personal data Everyone has the right to the protection of personal data concerning him or her. Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.
Distrust is the basis Cooperation on the basis of mutual distrust (e.g. separation of powers, checks and balances) is the basis of organizing modern societies, not trust.
Threats and corresponding protection goals for authorized users
Threats:
1) unauthorized access to information → protection goal: confidentiality
2) unauthorized modification of information → protection goal: integrity
3) unauthorized withholding of information or resources → protection goal: availability
(Integrity corresponds roughly to partial correctness, integrity plus availability to total correctness; total correctness ≥ partial correctness.)
This is no classification, but it is pragmatically useful. Example: unauthorized modification of a program.
1) cannot be detected, but can be prevented; cannot be reversed.
2) + 3) cannot be prevented, but can be detected; can be reversed.
Distrust is the basis, revisited
Cooperation on the basis of mutual distrust (e.g. separation of powers, checks and balances) is the basis of organizing modern societies, not trust.
Cf. confidentiality vs. integrity/availability: for confidentiality you cannot check whether your trust has been justified even after the fact, whereas for integrity and availability you can check whether your trust has been justified.
Transitive propagation of errors and attacks (diagram). Symbol explanation: "machine X executes program Y"; "A used B to design C" (A, B, C computers or programs). Along these relations, errors and attacks propagate transitively.
Trojan horse (diagram): receiving universal commands over a universal (covert) input channel, a Trojan horse can cause unauthorized disclosure of information over a (covert) output channel, unauthorized modification of information via write access, and unauthorized withholding of information or resources via write access, non-termination and resource consumption.
Protection against whom? (includes user, operator, service and maintenance ... of the system used)
Laws and forces of nature:
- components are growing old
- excess voltage (lightning, EMP), voltage loss
- flooding (storm tide, break of water pipe)
- change of temperature
...
→ fault tolerance
Human beings:
- outsider
- user of the system
- operator of the system
- service and maintenance
- producer of the system
- designer of the system
- producer of the tools to design and produce
- designer of the tools to design and produce
- producer of the tools to design and produce the tools to design and produce
- designer ...
→ Trojan horse: universal, transitive
Which protection measures against which attacker?
(Protection concerning: to achieve the intended / to prevent the unintended.)
• designer and producer of the tools to design and produce: intermediate languages and intermediate results, which are analyzed independently
• designer of the system: see above + several independent designers
• producer of the system: independent analysis of the product
• service and maintenance: control as if a new product, see above
• operator of the system: restrict physical access, restrict and log logical access
• user of the system: physical and logical restriction of access
• outsiders: protect the system physically and protect data cryptographically from outsiders
In addition: physical distribution and redundancy; unobservability, anonymity, unlinkability: avoid the ability to gather "unnecessary data".
Multilateral security • Each party has its particular protection goals. • Each party can formulate its protection goals. • Security conflicts are recognized and compromises negotiated. • Each party can enforce its protection goals within the agreed compromise. Security with minimal assumptions about others
Protection Goals: Sorting
Prevent the unintended – content: Confidentiality, Hiding; circumstances: Anonymity, Unobservability.
Achieve the intended – content: Integrity, Availability; circumstances: Accountability, Reachability, Legal Enforceability.
Protection Goals: Definitions
Confidentiality ensures the confidentiality of user data when they are transferred. This assures that nobody apart from the communicants can discover the content of the communication.
Hiding ensures the confidentiality of the transfer of confidential user data. This means that nobody apart from the communicants can discover the existence of confidential communication.
Anonymity ensures that a user can use a resource or service without disclosing his/her identity. Not even the communicants can discover the identity of each other.
Unobservability ensures that a user can use a resource or service without others being able to observe that the resource or service is being used. Parties not involved in the communication can observe neither the sending nor the receiving of messages.
Integrity ensures that modifications of communicated content (including the sender's name, if one is provided) are detected by the recipient(s).
Accountability ensures that sender and recipients of information cannot successfully deny having sent or received the information. This means that communication takes place in a provable way.
Availability ensures that communicated messages are available when the user wants to use them.
Reachability ensures that a peer entity (user, machine, etc.) either can or cannot be contacted depending on user interests.
Legal enforceability ensures that a user can be held liable to fulfill his/her legal responsibilities within a reasonable period of time.
Correlations between protection goals (diagram): Confidentiality, Hiding, Anonymity, Unobservability, Integrity, Accountability, Reachability, Legal Enforceability and Availability are connected by edges labelled "implies", "+" (strengthens) and "–" (weakens).
Golden rule: correspondence between organizational and IT structures. Since the tamper-resistance of hardware is anything but good and organizations are far from perfect at keeping secrets, personal data should be gathered, processed and stored, if at all, by IT in the hands of the individual concerned.
Superposed sending (DC-network)
D. Chaum 1985 for finite fields; A. Pfitzmann 1990 for abelian groups.
Each user station contains a pseudo-random bit-stream generator and a modulo-16 adder. Stations share pairwise keys (K12, K13, K23) whose values are unknown to the attacker. Every station adds, digit-wise modulo 16, its message and its keys (one side of each pair uses the negated key; stations without a real message send 00000) and outputs the result: station 1 outputs M1 + K12 + K13 = 3A781 + 2DE92 + 4265B = 99B6E, station 2 outputs 00000 + (-K12) + K23 = E327E + 67CD3 = 4AE41, station 3 outputs 00000 + (-K13) + (-K23) = CEAB5 + A943D = 67EE2. Adding all outputs cancels the keys and yields M1 + M2 + M3 = 3A781; the same mechanism also provides anonymous access.
Anonymity of the sender: if stations are connected by keys the value of which is completely unknown to the attacker, tapping all lines does not give him any information about the sender.
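A minimal sketch of one round of superposed sending follows, assuming the binary (XOR) special case instead of the slide's digit-wise modulo-16 addition; the function and variable names are illustrative and not taken from the original.

import secrets

def dc_net_round(messages, n_bits=20):
    """One round of binary superposed sending (XOR special case).

    Every pair of stations shares a random key unknown to the attacker;
    each station publishes its message XORed with all keys it shares.
    XORing all published values cancels every key (each appears twice)
    and reveals only the XOR of all messages, not who sent what.
    """
    n = len(messages)
    mask = (1 << n_bits) - 1
    keys = {(i, j): secrets.randbits(n_bits)
            for i in range(n) for j in range(i + 1, n)}

    outputs = []
    for i, m in enumerate(messages):
        out = m & mask
        for (a, b), k in keys.items():
            if i in (a, b):
                out ^= k            # superpose the shared keys
        outputs.append(out)

    combined = 0
    for out in outputs:
        combined ^= out             # keys cancel pairwise
    return outputs, combined

if __name__ == "__main__":
    msgs = [0x3A781, 0x00000, 0x00000]   # only station 0 sends a message
    outputs, combined = dc_net_round(msgs)
    assert combined == msgs[0] ^ msgs[1] ^ msgs[2]
    print([hex(o) for o in outputs], hex(combined))

The slide's variant works the same way over any abelian group (here, digit-wise addition modulo 16), with the second station of each key pair adding the negated key.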
Protection of the communication relation: MIX network
D. Chaum 1981 for electronic mail.
Each sender encrypts its message in layers for the MIXes it will traverse, e.g. c1(z4, c2(z1, M1)), c1(z5, c2(z2, M2)), c1(z6, c2(z3, M3)), where ci denotes encryption with the public key of MIXi, the zi are random values, and di(ci(zi, Mi)) = (zi, Mi). MIX1 batches incoming messages, discards repeats and outputs c2(z1, M1), c2(z2, M2), c2(z3, M3); MIX2 again batches, discards repeats and outputs M1, M2, M3, so that an observer cannot link senders to recipients.
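A structural sketch of this processing, under the assumption that toy stand-ins are acceptable: c and d below model ci/di as nested tuples rather than real public-key encryption (the slide's di(ci(z, M)) = (z, M); here z is simply dropped), and all names are illustrative.

import random

def c(mix_id, z, payload):
    # Toy stand-in for encryption with the public key of MIX mix_id.
    return ("enc", mix_id, z, payload)

def d(mix_id, ciphertext):
    # Toy stand-in for decryption by MIX mix_id.
    tag, mid, z, payload = ciphertext
    assert tag == "enc" and mid == mix_id, "wrong layer for this MIX"
    return payload

def mix(mix_id, batch, seen):
    """Batch, discard repeats, strip one layer, output in changed order."""
    out = []
    for ct in batch:
        if ct in seen:            # discard repeats (replay protection)
            continue
        seen.add(ct)
        out.append(d(mix_id, ct))
    random.shuffle(out)           # break the input/output order
    return out

if __name__ == "__main__":
    messages = ["M1", "M2", "M3"]
    # Each sender wraps its message first for MIX2, then for MIX1,
    # using fresh random values z (here just the numbers from the slide).
    inputs = [c(1, z1, c(2, z2, m))
              for z1, z2, m in zip((4, 5, 6), (1, 2, 3), messages)]
    after_mix1 = mix(1, inputs, seen=set())       # yields the c2(z, M) layer
    after_mix2 = mix(2, after_mix1, seen=set())   # yields the plaintexts
    print(after_mix2)

A real MIX cascade would use actual public-key encryption with fresh randomness per layer so that incoming and outgoing ciphertexts are unlinkable; this sketch only shows the batching, replay discarding and reordering.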
Identity management
• Privacy-enhancing identity management is only possible w.r.t. parties which do not get globally unique identifiers (GUIDs) anyway, be it from
• the communication network (e.g. network addresses),
• the user device (e.g. serial numbers, radio signatures), or even
• the user him/herself (e.g. by biometrics).
Personal identifier
The same identifier 845 is used towards every party: 845 authorizes A, A notifies 845, 845 pays B €, B certifies 845, C pays 845 €. All of these transactions are therefore linkable to the same person.
Role-relationship pseudonyms and transaction pseudonyms
Different pseudonyms are used towards different parties and for single transactions: 762 authorizes A and A notifies 762; 451 pays B € and B certifies 451; B certifies 314 and C pays 314 €. The individual relationships and transactions are thus no longer linkable to one identifier.
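As an illustration that is not part of the original slides, a user's trusted device could derive such pseudonyms roughly as follows; role_relationship_pseudonym and transaction_pseudonym are hypothetical helpers.

import hashlib
import hmac
import secrets

MASTER_SECRET = secrets.token_bytes(32)   # known only to the user's device

def role_relationship_pseudonym(role: str, partner: str) -> str:
    """Stable towards one partner in one role, unlinkable across partners."""
    mac = hmac.new(MASTER_SECRET, f"{role}|{partner}".encode(), hashlib.sha256)
    return mac.hexdigest()[:6]

def transaction_pseudonym() -> str:
    """Fresh random value for every single transaction."""
    return secrets.token_hex(4)

if __name__ == "__main__":
    print(role_relationship_pseudonym("patient", "A"))    # reused towards A
    print(role_relationship_pseudonym("customer", "B"))   # different towards B
    print(transaction_pseudonym())                        # one-time, e.g. towards C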
Pseudonyms: Linkability in detail Distinction between: 1. Initial linking between the pseudonym and its holder 2. Linkability due to the use of the pseudonym in different contexts
Pseudonyms: Initial linking to holder
Public pseudonym: the linking between the pseudonym and its holder may be publicly known from the very beginning (e.g. a phone number with its owner listed in public directories).
Initially non-public pseudonym: the linking between the pseudonym and its holder may be known by certain parties (trustees for identity), but is not public, at least initially (e.g. a bank account with the bank as trustee for identity, a credit card number, ...).
Initially unlinked pseudonym: the linking between the pseudonym and its holder is – at least initially – not known to anybody (except the holder) (e.g. biometric characteristics, DNA, as long as no registers exist).
Pseudonyms: Use in different contexts => partial order
• person pseudonym: number of an identity card, social security number, bank account
• role pseudonym: pen name, employee identity card number
• relationship pseudonym: customer number
• role-relationship pseudonym: contract number
• transaction pseudonym: one-time password, TAN
A → B stands for "B enables stronger anonymity than A"; going down this list, the pseudonyms enable increasingly strong anonymity.
Summing up
Requirements for a multilaterally secure and privacy-enabling AmI world:
• Make sure that others cannot gather "unnecessary data" (just not gathering it is not enough, as history tells us).
• Since trust in foreign infrastructures w.r.t. confidentiality properties (e.g. privacy) will be very limited at best, each human should have his/her trusted device(s) to provide for his/her security. This device might act in an ambient way in the interests of its owner.
• Communication of humans with their ICT-environment should be by means of their trusted device only.
• Develop trusted devices which have no identifying radio signature.
• Minimize sensor abilities w.r.t. sensing foreign human beings directly.
Terminology and further reading http://dud.inf.tu-dresden.de/Anon_Terminology.shtml