
Policy Auditing over Incomplete Logs: Theory, Implementation and Applications


Presentation Transcript


  1. Policy Auditing over Incomplete Logs: Theory, Implementation and Applications. Deepak Garg (MPI-SWS; work done at Carnegie Mellon University), Limin Jia and Anupam Datta (Carnegie Mellon University)

  2. Privacy • Personal information is collected and used to provide services (census statistics, health care & financial services, targeted advertisements, social network services) • Organizational goals and individual privacy goals conflict

  3. Privacy Legislation • Regulates use and disclosure of personal information, but allows organizational work to proceed • Examples: HIPAA (medical information), GLBA (financial information)

  4. Research Goal • Build a principled system for enforcing practical privacy policies • Non-trivial due to the complexity of practical privacy policies

  5. Example from HIPAA Privacy Rule "A covered entity may disclose an individual's protected health information (phi) to law-enforcement officials for the purpose of identifying an individual if the individual made a statement admitting participating in a violent crime that the covered entity believes may have caused serious physical harm to the victim." Concepts in privacy policies: • Actions: send(p1, p2, m) • Roles: inrole(p2, law-enforcement) • Data attributes: attr_in(prescription, phi) • Temporal constraints: in-the-past(state(q, m)) • Purposes: purp_in(u, id-criminal) • Beliefs: believes-crime-caused-serious-harm(p, q, m) Actions, roles, data attributes and temporal constraints are black-and-white concepts; purposes and beliefs are grey concepts. Pre-emptive enforcement (access control or runtime monitoring) does not suffice for the grey concepts; enforcement must rely on after-the-fact audit.

  6. Audit-Based Approach [diagram] Prior work: the privacy law is encoded as a computer-readable privacy policy in first-order logic. Personal information collected by organizational databases forms the organizational audit log. This paper: an audit algorithm that checks the organizational audit log against the policy to detect policy violations.

  7. Outline of Talk • Introduction • Overview of Audit Algorithm • Details of Audit Algorithm • Formal Properties • Conclusion

  8. Challenge for Enforcement: Audit Logs are Incomplete • Future: logs store only past and current events. Example: timely data-breach notification refers to a future event • Subjective: logs contain no "grey" information. Example: they may not record evidence for purposes and beliefs • Spatial: remote logs may be inaccessible. Example: logs distributed across different departments of a hospital

  9. Abstract Model of Incomplete Logs • Model all incomplete logs uniformly as 3-valued structures • Define semantics (meanings of formulas) over 3-valued structures
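To make the 3-valued model concrete, here is a minimal Python sketch (not from the paper) of an incomplete log as a partial assignment of truth values to ground atoms; the class, names, and tuple encoding of atoms are illustrative choices, not the paper's notation.

```python
from enum import Enum

class TV(Enum):
    """Three truth values for atoms over an incomplete log."""
    TRUE = 1      # the log records that the atom holds
    FALSE = 0     # the log records that the atom does not hold
    UNKNOWN = 2   # the log carries no information about the atom
                  # (future, subjective, or spatially remote facts)

class PartialLog:
    """An incomplete audit log modeled as a 3-valued structure.

    known_true / known_false are sets of ground atoms encoded as tuples,
    e.g. ("send", "UPMC", "allegheny-police", "M2").  Every other atom
    is UNKNOWN.  (Representation is illustrative, not the paper's.)
    """
    def __init__(self, known_true, known_false=frozenset()):
        self.known_true = set(known_true)
        self.known_false = set(known_false)

    def eval_atom(self, atom):
        if atom in self.known_true:
            return TV.TRUE
        if atom in self.known_false:
            return TV.FALSE
        return TV.UNKNOWN
```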

  10. An Iterative Algorithm Check as much of the policy as possible on the current log and output a residual policy. Iterate when the log is extended with more information. Grey concepts are checked by a human auditor at any time.

  11. Reduce: The Iterative Algorithm reduce(L, φ) = φ'. [diagram] As the log grows over time, successive calls to reduce residuate the policy: the initial policy φ0 is checked against the first log to yield φ1, which is checked against the extended log to yield φ2, and so on.
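A minimal sketch of the residuation step over the PartialLog sketch above, covering only atoms, conjunction and disjunction; the actual reduce algorithm also residuates quantifiers by enumerating the satisfying substitutions of their guards, which this sketch omits.

```python
def reduce(log, phi):
    """Residuate formula phi against an incomplete log.

    Formulas are tuples: ("atom", ground_atom), ("and", ...), ("or", ...).
    Returns ("true",) or ("false",) when the log decides phi; otherwise
    returns a smaller residual formula to re-check when the log grows
    (or to hand to a human auditor for grey atoms).
    """
    kind = phi[0]
    if kind == "atom":
        tv = log.eval_atom(phi[1])
        if tv is TV.TRUE:
            return ("true",)
        if tv is TV.FALSE:
            return ("false",)
        return phi                                   # unknown: keep as residue
    if kind == "and":
        parts = [reduce(log, p) for p in phi[1:]]
        if any(p == ("false",) for p in parts):
            return ("false",)
        parts = [p for p in parts if p != ("true",)]
        return ("true",) if not parts else ("and", *parts)
    if kind == "or":
        parts = [reduce(log, p) for p in phi[1:]]
        if any(p == ("true",) for p in parts):
            return ("true",)
        parts = [p for p in parts if p != ("false",)]
        return ("false",) if not parts else ("or", *parts)
    raise ValueError(f"unsupported connective: {kind}")
```

Iterating then amounts to re-running the residual against each extended log: residual = reduce(extended_log, residual).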

  12. Example
φ = ∀ p1, p2, m, u, q, t. (send(p1, p2, m) ∧ tagged(m, q, t, u) ∧ attr_in(t, phi)) ⊃ inrole(p2, law-enforcement) ∧ purp_in(u, id-criminal) ∧ ∃ m'. (state(q, m') ∧ is-admission-of-crime(m') ∧ believes-crime-caused-serious-harm(p1, m'))
Log:
Jan 1, 2011: state(Bob, M1)
Jan 5, 2011: send(UPMC, allegheny-police, M2), tagged(M2, Bob, date-of-treatment, id-bank-robber)
Substitution for the universal variables: { p1 ↦ UPMC, p2 ↦ allegheny-police, m ↦ M2, q ↦ Bob, u ↦ id-bank-robber, t ↦ date-of-treatment }; witness for the existential: { m' ↦ M1 }
φ' = ⊤ ∧ purp_in(id-bank-robber, id-criminal) ∧ is-admission-of-crime(M1) ∧ believes-crime-caused-serious-harm(UPMC, M1)
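Replaying the slide's example with the sketches above; the role fact for the Allegheny police is assumed to be available to the covered entity, and the encoding of atoms is illustrative.

```python
# The log from the slide as ground atoms, plus the assumed role assignment.
log = PartialLog(known_true={
    ("state", "Bob", "M1"),
    ("send", "UPMC", "allegheny-police", "M2"),
    ("tagged", "M2", "Bob", "date-of-treatment", "id-bank-robber"),
    ("inrole", "allegheny-police", "law-enforcement"),
})

# Body of the clause after instantiating the quantifiers with the
# substitution shown on the slide.
phi = ("and",
       ("atom", ("inrole", "allegheny-police", "law-enforcement")),
       ("atom", ("purp_in", "id-bank-robber", "id-criminal")),
       ("atom", ("is-admission-of-crime", "M1")),
       ("atom", ("believes-crime-caused-serious-harm", "UPMC", "M1")))

print(reduce(log, phi))
# ('and', ('atom', ('purp_in', 'id-bank-robber', 'id-criminal')),
#         ('atom', ('is-admission-of-crime', 'M1')),
#         ('atom', ('believes-crime-caused-serious-harm', 'UPMC', 'M1')))
# i.e. the residual φ': the role atom is discharged by the log, the grey
# purpose and belief atoms remain for the human auditor.
```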

  13. Reduce: Formal Definition The initial policy must also pass a one-time, linear check called a mode check. We have verified that the entire HIPAA and GLBA Privacy Rules pass this check. In each quantified subformula of the policy, the guard c is a formula for which the satisfying substitutions of the quantified variables x can be computed from the log.
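A rough sketch of what "satisfying substitutions can be computed" means over the PartialLog above; variables are marked with a leading ? as an illustrative convention, not the paper's notation. The mode check itself is a static, linear-time analysis of the policy formula and is not shown.

```python
def satisfying_substitutions(log, guard_pred, pattern):
    """Enumerate substitutions that make a guard atom true in the log.

    pattern mixes constants and ?-variables, e.g. ("?p1", "?p2", "?m")
    for send(p1, p2, m).  Reduce uses such enumerations to instantiate
    quantifiers finitely; the mode check guarantees every quantified
    variable is bound by some computable guard before it is used.
    """
    for atom in log.known_true:
        if atom[0] != guard_pred or len(atom) - 1 != len(pattern):
            continue
        subst, ok = {}, True
        for v, val in zip(pattern, atom[1:]):
            if isinstance(v, str) and v.startswith("?"):
                subst[v] = val
            elif v != val:
                ok = False
                break
        if ok:
            yield subst

# list(satisfying_substitutions(log, "send", ("?p1", "?p2", "?m")))
# -> [{'?p1': 'UPMC', '?p2': 'allegheny-police', '?m': 'M2'}]
```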

  14. Formal Properties of Reduce: Correctness

  15. Formal Properties of Reduce: Complexity

  16. Formal Properties of Reduce: Minimality of Output

  17. Implementation and Case Study • Implementation and evaluation over simulated audit logs for compliance with all disclosure-related clauses of HIPAA Privacy Rule • Performance: • Average time for checking compliance of each disclosure of protected health information is 0.16s for a 15MB log • Mechanical enforcement: • Reduce can automatically check 80% of all the atomic predicates

  18. Other Applications of Reduce • Runtime monitoring • For policies that do not mention future obligations or grey concepts • Advisory tool: Is an action allowed? • Run reduce on hypothetical log containing the action
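A sketch of the advisory use, reusing the objects defined above: extend the log with the contemplated action, instantiate the clause for it, and inspect the residual. All names for the hypothetical disclosure (some-marketer, M3, marketing) are made up for illustration.

```python
# Would disclosing a new message M3 to a marketer be justified by the
# law-enforcement clause?  Build a hypothetical log containing the action.
advisory_log = PartialLog(
    known_true=log.known_true | {
        ("send", "UPMC", "some-marketer", "M3"),
        ("tagged", "M3", "Bob", "date-of-treatment", "marketing"),
    },
    known_false={("inrole", "some-marketer", "law-enforcement")},
)
phi_m3 = ("and",
          ("atom", ("inrole", "some-marketer", "law-enforcement")),
          ("atom", ("purp_in", "marketing", "id-criminal")),
          ("atom", ("is-admission-of-crime", "M1")),
          ("atom", ("believes-crime-caused-serious-harm", "UPMC", "M1")))
print(reduce(advisory_log, phi_m3))   # ('false',): this clause does not
                                      # permit the proposed disclosure
```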

  19. Closely Related Work • Runtime monitoring in MFOTL • [Basin et al ’10] • Pre-emptive enforcement • Efficient implementation • Assumes past-completeness of logs • Less expressive mode checking

  20. Closely Related Work • Iterative Model Checking • [Thati, Rosu ’05] • Propositional logic • Cannot express privacy legislation

  21. Conclusion • Iterative, interactive algorithm for policy audit • Checks as much policy as possible • Outputs residual policy • Provably correct, efficient, optimal • Works with incomplete logs • Expressive for real privacy laws

  22. Questions?

  23. The Case for Audit • Run-time access control mechanisms are not sufficient to enforce privacy policies • Purposes & beliefs (“grey” concepts on previous slide) • And also future obligations (e.g., notice of data breach should go out within 30 days) • Human input is essential for resolving grey concepts
