Delve into the legal landscape of data protection and traceability in telcos driving distributed systems, highlighting challenges and possible solutions in data security and privacy enforcement. Explore the evolving intersection of law, regulation, and technology, addressing complexities in data collection and anonymization. Engage in discussions on information law roles, orchestration services, resilience, and the balance between centralization and decentralization. Gain insights on traceability of personal/non-personal data, differential privacy, and data analytics countermeasures. Explore the impact of temporal and physical data characteristics, privacy in decentralized systems, and the traceability trade-offs. Examine the economic aspects of accessing individual data records.
Law, Regulation, Traceability
3 Groups' Notes from 3 Meetings
Notes taken by Burkhard Stiller, adapted in full by the group members
Indra Spiecker, Christoph Sorge, Burkhard Stiller, Edgar Weippl
Discussion and Start
• Telcos as drivers for distributed systems, heavily regulated
• Law, data protection, data security & privacy: little regulation so far, a normative void
• E.g., opt-in/opt-out as a technical option vs. the legal basis of data privacy
• Huge enforcement deficit; jurisdiction for supervision and court control is not marked out explicitly
• Regulation and its legal basis become more "relevant" today
• Big Data erodes the purpose binding of classical data protection: the rule of law tied data collection to dedicated targets, but Big Data turns that around
• Anonymization does not work in full
• Some people take an interest in de-anonymizing public data sets purely as a technical/theoretical challenge
Continued Discussion (1)
• Information law: roles and problems
• Normative/abstract vs. enumerative/declarative views: which data are to be made accessible?
• Computer science and technical data protection are not coherent with legal normative standards
• Legal approach under uncertainty/risk: if results cannot be checked, the "process" could be controlled instead
• Is opt-out a reliable approach for all services? Use of anonymization services? Requirement of a privacy/security-friendly service (Facebook example)
• Facebook: a "simple" approach, but Facebook does not allow for "unevaluated" participation; this leads to competition only in the case of network effects
• Assisted driving for rental cars: could "manipulated" data (added noise) be used to reach a "standardized" profile? (a sketch follows below)
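As a rough illustration of the noise-inclusion idea raised in the last bullet, the following sketch blends recorded speed samples toward a hypothetical "standard" driver profile; the blend factor, target mean, and noise level are illustrative assumptions, not values from the discussion.

```python
import random

def standardize_profile(speed_samples, target_mean=50.0, blend=0.5, noise_std=2.0):
    """Blend recorded speed samples toward an assumed 'standard' profile and
    add Gaussian noise, so the reported trace reveals less about the
    individual driving style. All parameters are illustrative only."""
    return [
        (1.0 - blend) * s + blend * target_mean + random.gauss(0.0, noise_std)
        for s in speed_samples
    ]

# Example: a rather aggressive trace is pushed toward the standard profile.
recorded = [72.0, 95.0, 88.0, 40.0, 110.0]
print(standardize_profile(recorded))
```

Whether such a transformation would be acceptable to the data collector, or legally permissible, was exactly the open question in the discussion.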
Continued Discussion (2)
• Are laws at an abstract level, specifically IT security legislation, possible in general?
• Are orchestrated services a danger to resilience? Mobility car sharing depends on an operational mobile network and other decentralized network services (Vienna example)
• Smart Grids show centralized dependencies on decentralized components; solutions for decentralized control are in place and theoretically operational
• Data and compliance checks of file formats and their side effects
• Systemic effects? Are insights from the financial markets domain useful?
• Complexity in distributed networks, in their supported applications, and in their users' behavior
• Psychological effects?
Summary
• Resilience: central vs. decentralized systems
• Relation between law and technology: driver and follower
• Abstract laws tend to address IT requirements "better"; enforcement and decision support are required
• Jurisdictional hierarchy/complexity: international, European, and national perspectives
Thilo Ewald, Indra Spiecker, Christoph Sorge, Burkhard Stiller, Gene Tsudik
Traceability of Personal/Non-personal Data in Service Provision
• Discussion in the context of individual identification; relation to Big Data
• Is there a way to "irritate" data analytics as a countermeasure against identifying an individual? Self-protection?
• Differential privacy tries to achieve that, depending on the structure of the data themselves (see the sketch below)
• Individual applications define the prerequisites; for general applications this seems impossible
• Additional context information may result in new outcomes
• Correlation vs. statistical analysis/probabilistics
• Temporal (time stamps) and physical (objects) characteristics: both types of information are of extreme value
• Virtuality tends to be able to provide "more" anonymity
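To make the differential-privacy point above concrete, here is a minimal sketch of the Laplace mechanism for a counting query; the epsilon value, the predicate, and the toy records are illustrative assumptions, not part of the notes.

```python
import random

def dp_count(records, predicate, epsilon=0.5):
    """Release a differentially private count. A counting query has
    sensitivity 1, so Laplace noise with scale 1/epsilon suffices; the
    difference of two Exp(epsilon) draws is exactly Laplace(0, 1/epsilon)."""
    true_count = sum(1 for r in records if predicate(r))
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Example: a noisy count of "smokers" in a small record set; the presence or
# absence of any single individual changes the output distribution only by a
# factor bounded by exp(epsilon).
records = [{"smoker": True}, {"smoker": False}, {"smoker": True}]
print(dp_count(records, lambda r: r["smoker"]))
```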
Traceability (1)
• Trade-offs: hygiene and money? Cash is not clean, but anonymous (Japanese washing machines for gift money)
• New information may make different noise relevant: noise can be added to existing medical records, but adding "new" diseases changes the statistics
• Correctness, authenticity, and accuracy are not "known" in general in Big Data
• Chaff is used in communications to hide interactions
• Smart Grids generate additional data and messages to change the granularity (aggregation in time); see the sketch below
• Local storage can "hide" current usage information unless the storage unit is empty; autonomy increases, too. This "solves" privacy concerns to a certain extent
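A minimal sketch of the "aggregation in time" idea for smart-meter data mentioned above, assuming readings arrive as (timestamp, kWh) pairs; the 15-minute window is an illustrative choice, not a value from the discussion.

```python
from collections import defaultdict

def aggregate_readings(readings, window_s=900):
    """Coarsen smart-meter granularity by summing consumption per time
    window (default 15 minutes), so individual appliance-level events are
    no longer visible in the reported data."""
    buckets = defaultdict(float)
    for ts, kwh in readings:                 # ts in seconds since epoch
        buckets[ts - ts % window_s] += kwh   # map to the start of its window
    return sorted(buckets.items())

# Example: five fine-grained readings collapse into two 15-minute values.
readings = [(0, 0.02), (300, 0.05), (600, 0.01), (900, 0.40), (1200, 0.03)]
print(aggregate_readings(readings))
```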
Traceability (2)
• Decentralization: better for privacy, harder for security
• Overlay networks and the balancing of autonomy vs. secrecy; trade-off: autonomy decouples from central control and guidance, at the cost of less efficient operations
• How to identify the source of information? Origin, tracing of legal/illegal inclusion, processing
• Databases: provenance? (see the sketch below)
• It is imaginable that devices carry a "TPM" (Trusted Platform Module)
• Collusion countermeasures may be possible for "manual" communication behavior: changes made on purpose vs. errors happening
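One way the provenance question above could be approached is a hash-chained provenance log per data record; the field names and the use of SHA-256 are assumptions for illustration (a device TPM could additionally sign each entry, which is not shown here).

```python
import hashlib, json

def add_provenance(chain, actor, action):
    """Append a provenance entry whose hash covers the previous entry's hash,
    so later tampering with the recorded origin or processing history
    becomes detectable."""
    prev_hash = chain[-1]["hash"] if chain else ""
    entry = {"actor": actor, "action": action, "prev": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    chain.append(entry)
    return chain

# Example: origin plus two processing steps of a single data record.
chain = []
add_provenance(chain, "sensor-42", "collected raw record")
add_provenance(chain, "telco-A", "aggregated per cell")
add_provenance(chain, "analytics-B", "joined with billing data")
print(chain[-1]["hash"])
```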
Economics
• Is there a business model for accessing (individual) data records in a Big Data set?
• Tracing the origin of (individual) data; a price for using information from a Big Data set; the purpose or goal has changed
• "1 c" received back from the data collector for every 1 $ spent
• Efficient breach of contract is very well understood in the US, in contrast to the EU?
• There are no means to control the breach of a contract in general
• Consumer protection agencies tend to have the right to examine the details of contracts
Law
• There is a "right" to store correct data in a system
• Google: using the search for free, is that an implicit contract between the searcher and Google? Thus, is the use of the data of the search request "legal"?
• Monopoly: a solution via antitrust and consumer protection law; changes of contractual conditions from the provider's side
• Company control is not fully established in Germany, as "only" e-government checks are prioritized in a number of states
• Hamburg is the exception, with 12 people working on checking larger, worldwide company activities
Sean Smith, Christoph Sorge, Indra Spiecker, Burkhard Stiller
Law and Regulation
• Law and regulation may determine an expected set of mandatory guidelines; standardization forms guidelines, too
• Desired goals from the technology side: certainty, clear normative standards, foreseeability, a final decision maker
• EC directives and regulations: directives define the goals for Member States while the means are left free; regulations are directly binding in all Member States
• Legal instruments: ex-ante or ex-post regulation; numbers, thresholds, numeric values, standards
• Definitions: different understandings of technology and law (example: German media law)
Privacy Aspects
• Storing Web server logs and telecommunication service data: definitions differ between German and Swiss law
• Anonymization has a history of not working: the Netflix case, US health records, and a German case in the 1980s all linked the anonymized data set with an additional data set (see the sketch below)
• US Privacy Act, including the feasibility of computation (which changes over time)
• Law and the reality of computers do not always match; law and NIST standards tend to interrelate and need testing, which is, in general, hard to maintain
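The linkage failure mode behind the Netflix and health-record examples can be sketched in a few lines; the quasi-identifiers (birth year, ZIP code) and the toy records are purely illustrative.

```python
def link(anonymized, auxiliary, keys=("birth_year", "zip")):
    """Re-identify records in a 'de-identified' data set by joining it with
    a public auxiliary data set on quasi-identifiers."""
    index = {tuple(a[k] for k in keys): a["name"] for a in auxiliary}
    return [
        {**r, "name": index[tuple(r[k] for k in keys)]}
        for r in anonymized
        if tuple(r[k] for k in keys) in index
    ]

# 'Anonymized' medical data vs. a public register with names.
medical = [{"birth_year": 1962, "zip": "8004", "diagnosis": "asthma"}]
register = [{"birth_year": 1962, "zip": "8004", "name": "A. Sample"}]
print(link(medical, register))   # the diagnosis is re-attached to a name
```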
Law and Technology (1)
• Technology changes weekly; standards updates take time, too; updates of law and regulation take even more time
• The pace setters may differ: executive level, parliament
• Changes and updates of requirements are not "nice" and need to be avoided
• Governing and enforcement are still needed: foreseeability, stability, ...
• Do lawmakers and IT experts interact to make "good" laws? Lawmakers shall not be puppets of IT, but follow a democratic approach and processes; normative conflicts remain
• Time is crucial in expert hearings on laws, where experts get only a short time slot for feedback on an entire draft law
Law and Technology (2)
• Google Earth example: when does a picture contain personal data? A scale of 1:10,000 and smaller is taken to contain no personal data, which corresponds to more than 40 cm per pixel (see the calculation below)
• Bundesnetzagentur example: a price cap model and other models [consumer basket] can be used, but in practice only one model is applied; a contradiction of the legal rule?
• Numbers and thresholds are very good to have in laws/regulations, but they are hard to find and may not apply in all real cases
• Privacy conflicts among the stakeholders involved (each with their own interests): state, general public, terrorism, vendors, researchers
• Laws determine a cultural "standard"; enforcement may not prevent people from making it happen
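A back-of-the-envelope check of the stated 1:10,000 vs. 40 cm-per-pixel correspondence, under the assumption (not in the notes) that one pixel covers about 0.04 mm on the rendered map.

```python
# Hypothetical check of the 1:10,000 <-> 40 cm/pixel relation cited above.
pixel_on_map_mm = 0.04          # assumed pixel size on the 1:10,000 map
scale_denominator = 10_000      # map scale 1:10,000
ground_per_pixel_cm = pixel_on_map_mm * scale_denominator / 10  # mm -> cm
print(ground_per_pixel_cm)      # 40.0 cm of ground covered by one pixel
```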
Guaranteeing Security
• An IT security act is being planned in Germany: disclosing attacks or leakages
• The US approach seems to address the health sector at this stage for guaranteeing security; US judicial decisions in that sense seem to have unintended effects (Oracle vs. Google case)
• Legislators may not be as trustworthy as in elder generations, which tends to be similar in some judge appointment cases
• The H. Clinton case of using a non-authorized, private device instead of a government-approved one
Conflicts
• Selected normative conflicts between different nations and regions: the handling of health data between different US states; different interpretations of similar laws in different German states
• The US knows the principle of "discovery", which is unknown in Europe; e-discovery and TTIP (Transatlantic Trade and Investment Partnership) result in a US-European conflict
• Encryption: key escrow issues (export regulations), Steven Levy's book "Crypto"
Political Standards and Law Effects
• The DES encryption story from NIST: backdoor issues
• AES (a Belgian design) was developed in an open process
• A recent NIST competition resulted in a European winner; after adoption by NIST, the parameters were restricted to certain settings, generating an outcry from the community