
Computer Security Access Control Matrices



  1. Computer Security: Access Control Matrices. Fall 2006. Based on slides provided by Matt Bishop for use with Computer Security: Art and Science. Based on adaptations by Carl Gunter.

  2. Overview • Access control matrices and state transitions on them • Harrison-Ruzzo-Ullman result • Corollaries • Take-Grant Protection Model • SPM and successors

  3. Required • Reading • Chapter 2, as needed • Section 3.1 • Section 3.2 through the proof of Theorem 3-2 • All of Section 3.3 through the paragraph after Definition 3-6 • The example in Section 3.3 after Corollary 3-2 • Exercises: From 3.9 do 1, 4.

  4. Access Control • Controlling access is a fundamental security problem • Access control policy expresses who is authorized to do what • Read files • Modify data • Access services • Change access

  5. Access Control Matrices • Subjects S = { s1, …, sn } • Objects O = { o1, …, om } • Rights R = { r1, …, rk } • Entries A[si, oj] ⊆ R • A[si, oj] = { rx, …, ry } means subject si has rights rx, …, ry over object oj • [Matrix diagram: rows indexed by subjects s1, …, sn; columns indexed by objects (entities) o1, …, om and subjects s1, …, sn]
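To make the notation concrete, here is a minimal Python sketch (illustrative only, not from the slides): the matrix is stored sparsely, keeping only the non-empty entries A[s, o].

# Sparse access control matrix: S, O are sets, A maps (subject, object) -> set of rights.
S = {"s1", "s2"}                       # subjects
O = {"s1", "s2", "o1"}                 # objects (subjects are also objects)
R = {"r", "w", "own"}                  # generic rights
A = {("s1", "o1"): {"r", "own"},       # A[s1, o1] = {r, own}
     ("s2", "o1"): {"r"}}              # A[s2, o1] = {r}

def rights_of(A, s, o):
    """Return A[s, o]; an absent entry means the empty set of rights."""
    return A.get((s, o), set())

print(rights_of(A, "s1", "o1"))        # {'r', 'own'}
print(rights_of(A, "s2", "s1"))        # set()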

  6. Example 1: File System

  7. Example 2: CSL • Unlock - right to unlock a door • Log - request entry logs from a door

  8. State Transitions • Change the protection state of system • |– represents transition • Xi |–τ Xi+1: command τ moves system from state Xi to Xi+1 • Xi |–* Xi+1: a sequence of commands moves system from state Xi to Xi+1 • Commands often called transformation procedures

  9. Primitive Operations • create subject s; create object o • Creates new row and column in ACM; creates new column in ACM • destroy subject s; destroy object o • Deletes row and column from ACM; deletes column from ACM • enter r into A[s, o] • Adds right r for subject s over object o • delete r from A[s, o] • Removes right r from subject s over object o

  10. Create Subject • Precondition: s ∉ S • Primitive command: create subject s

  11. Create Object • Precondition: o ∉ O • Primitive command: create object o

  12. Add Right • Precondition: p ∈ S, y ∈ O • Primitive command: enter r into A[p, y]

  13. Delete Right • Precondition: p ∈ S, y ∈ O • Primitive command: delete r from A[p, y]

  14. Destroy Subject • Precondition: s ∈ S • Primitive command: destroy subject s

  15. Destroy Object • Precondition: o ∈ O • Primitive command: destroy object o
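The six primitive operations can be sketched as small functions over a protection state (S, O, A), with A the sparse mapping used above. Function names, and the choice to add every new subject to O as well, are assumptions for illustration only.

# Sketch of the six primitive operations; each asserts the slide's precondition.
def create_subject(S, O, A, s):
    assert s not in S                        # precondition: s is new
    S.add(s); O.add(s)                       # new row and column (subjects are objects too)

def create_object(S, O, A, o):
    assert o not in O                        # precondition: o is new
    O.add(o)                                 # new column only

def enter_right(S, O, A, r, s, o):
    assert s in S and o in O
    A.setdefault((s, o), set()).add(r)       # enter r into A[s, o]

def delete_right(S, O, A, r, s, o):
    assert s in S and o in O
    A.get((s, o), set()).discard(r)          # delete r from A[s, o]

def destroy_subject(S, O, A, s):
    assert s in S
    S.discard(s); O.discard(s)
    for key in [k for k in A if s in k]:     # drop s's row and column
        del A[key]

def destroy_object(S, O, A, o):
    assert o in O
    O.discard(o)
    for key in [k for k in A if k[1] == o]:  # drop o's column
        del A[key]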

  16. Creating File • Process p creates file f with r and w permission
command create•file(p, f)
    create object f;
    enter own into A[p, f];
    enter r into A[p, f];
    enter w into A[p, f];
end
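A Python sketch of the same composite command, acting directly on the (S, O, A) state used above; the names are illustrative, not the slides' notation.

def create_file(S, O, A, p, f):
    """Process p creates file f and receives own, r, w rights over it."""
    assert p in S and f not in O             # p exists, f is new
    O.add(f)                                 # create object f
    for right in ("own", "r", "w"):          # enter own, r, w into A[p, f]
        A.setdefault((p, f), set()).add(right)

# Usage
S, O, A = {"p"}, {"p"}, {}
create_file(S, O, A, "p", "report.txt")
print(A[("p", "report.txt")])                # {'own', 'r', 'w'} (order may vary)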

  17. Own Right • Usually allows possessor to change entries in ACM column • So owner of object can add, delete rights for others • May depend on what system allows • Can’t give rights to specific (set of) users • Can’t pass copy flag to specific (set of) users

  18. Mono-Operational Commands • Make process p the owner of file g
command make•owner(p, g)
    enter own into A[p, g];
end
• Mono-operational command • Single primitive operation in this command

  19. Conditional Commands • Let p give q r rights over f, if p owns f
command grant•read•file•1(p, f, q)
    if own in A[p, f]
    then enter r into A[q, f];
end
• Mono-conditional command • Single condition in this command

  20. Multiple Conditions • Let p give q r rights over f, if p has rights r and c over f
command grant•read•file•2(p, f, q)
    if r in A[p, f] and c in A[p, f]
    then enter r into A[q, f];
end
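Both conditional commands can be sketched the same way: the if-part simply guards the single enter operation. This is a hedged illustration against the sparse A mapping, not the HRU formalism itself.

def grant_read_file_1(A, p, f, q):
    """p gives q the r right over f, provided p owns f (mono-conditional)."""
    if "own" in A.get((p, f), set()):
        A.setdefault((q, f), set()).add("r")

def grant_read_file_2(A, p, f, q):
    """p gives q the r right over f, provided p holds both r and c over f."""
    entry = A.get((p, f), set())
    if "r" in entry and "c" in entry:
        A.setdefault((q, f), set()).add("r")

# Usage
A = {("p", "f"): {"r", "c"}}
grant_read_file_2(A, "p", "f", "q")
print(A[("q", "f")])                         # {'r'}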

  21. Copy Right • Allows possessor to give rights to another • Often attached to a right, so only applies to that right • r is read right that cannot be copied • rc is read right that can be copied • Is copy flag copied when giving r rights? • Depends on model, instantiation of model
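One possible encoding of the copy flag, purely as a sketch: the flag is kept as a separate right c in the same entry (one of several possible encodings), and whether the flag propagates with the right is left as a parameter, since the slides note this is model-dependent.

def copy_right(A, giver, receiver, obj, right, propagate_flag=False):
    """Grant `right` over obj to receiver only if giver holds it with the copy flag c."""
    entry = A.get((giver, obj), set())
    if right in entry and "c" in entry:          # giver holds the right with copy flag
        target = A.setdefault((receiver, obj), set())
        target.add(right)
        if propagate_flag:                       # policy choice: pass the flag on or not
            target.add("c")
        return True
    return False

# Usage: p holds a copyable read right over f and passes r (without c) to q.
A = {("p", "f"): {"r", "c"}}
print(copy_right(A, "p", "q", "f", "r"))         # True
print(A[("q", "f")])                             # {'r'}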

  22. Attenuation of Privilege • Principle says you can’t give rights you do not possess • Restricts addition of rights within a system • Usually ignored for owner • Why? Owner gives herself rights, gives them to others, deletes her rights.

  23. Key Points • Access control matrix simplest abstraction mechanism for representing protection state • Transitions alter protection state • 6 primitive operations alter matrix • Transitions can be expressed as commands composed of these operations and, possibly, conditions

  24. Proving Safety • Want to prove system “safe” or “secure” • What does that mean? • Subjects should only have “authorized” rights • E.g. no one except for me can write to my home directory • Easy to check in any given protection state • What about the dynamic protection system?

  25. Formalizing Safety • Adding a generic right r where there was not one is “leaking” • If a system S, beginning in initial state s0, cannot leak right r, it is safe with respect to the right r. • General property, can simulate: • Leaking a right r on a specific object o • Leaking r to a subject outside a “trusted” set

  26. Safety Question • Does there exist an algorithm for determining whether a protection system S with initial state s0 is safe with respect to a generic right r? • Here, “safe” = “secure” for an abstract model

  27. General Case • Answer: no • Sketch of proof: reduce the halting problem to the safety problem. Turing Machine review: • Infinite tape in one direction • States K, symbols M; distinguished blank b • Transition function δ(k, m) = (k′, m′, L) means that in state k, the symbol m under the head is replaced by m′, the head moves left one square, and the machine enters state k′ • Halting state is qf; TM halts when it enters this state Harrison, Ruzzo, Ullman 76

  28. Mapping • Tape cells 1 2 3 4 hold symbols A B C D …; the head is on cell 3 and the current state is k • ACM encoding: one subject per cell; A[si, si] holds cell i's symbol, A[si, si+1] holds own, the head's cell also holds the state right, and the rightmost cell holds end:

        s1      s2      s3      s4
  s1    A       own
  s2            B       own
  s3                    C, k    own
  s4                            D, end

  29. Mapping • After δ(k, C) = (k1, X, R), where k is the current state and k1 the next state, the tape cells 1 2 3 4 hold A B X D … and the head is on cell 4:

        s1      s2      s3      s4
  s1    A       own
  s2            B       own
  s3                    X       own
  s4                            D, k1, end

  30. Command Mapping • δ(k, C) = (k1, X, R) at an intermediate cell becomes
command ck,C(s3, s4)
    if own in A[s3, s4] and k in A[s3, s3] and C in A[s3, s3]
    then
        delete k from A[s3, s3];
        delete C from A[s3, s3];
        enter X into A[s3, s3];
        enter k1 into A[s4, s4];
end
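An illustrative sketch of how such a command acts on the matrix encoding of the tape; the function name step_k_C and the string encoding of symbols and states are hypothetical, not part of the HRU construction.

def step_k_C(A, s3, s4, k="k", C="C", k1="k1", X="X"):
    """Apply c_{k,C}: rewrite the head cell's symbol and move the state right to s4."""
    here, right_nbr = A.setdefault((s3, s3), set()), A.setdefault((s3, s4), set())
    if "own" in right_nbr and k in here and C in here:   # conditions of c_{k,C}
        here.discard(k)                                  # delete k from A[s3, s3]
        here.discard(C)                                  # delete C from A[s3, s3]
        here.add(X)                                      # enter X into A[s3, s3]
        A.setdefault((s4, s4), set()).add(k1)            # enter k1 into A[s4, s4]

# Usage: head on cell 3 (symbol C, state k); after the step the head is on cell 4.
A = {("s3", "s3"): {"C", "k"}, ("s3", "s4"): {"own"}, ("s4", "s4"): {"D", "end"}}
step_k_C(A, "s3", "s4")
print(A[("s3", "s3")], A[("s4", "s4")])   # {'X'} and {'D', 'end', 'k1'} (order may vary)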

  31. Mapping • After δ(k1, D) = (k2, Y, R) at the rightmost cell, where k1 is the current state and k2 the next state, a new cell 5 holding the blank b is created (new subject s5) and the head moves to it; the tape cells 1 2 3 4 5 hold A B X Y b:

        s1      s2      s3      s4      s5
  s1    A       own
  s2            B       own
  s3                    X       own
  s4                            Y       own
  s5                                    b, k2, end

  32. Command Mapping • δ(k1, D) = (k2, Y, R) at the end of the tape becomes
command crightmostk1,D(s4, s5)
    if end in A[s4, s4] and k1 in A[s4, s4] and D in A[s4, s4]
    then
        delete end from A[s4, s4];
        create subject s5;
        enter own into A[s4, s5];
        enter end into A[s5, s5];
        delete k1 from A[s4, s4];
        delete D from A[s4, s4];
        enter Y into A[s4, s4];
        enter k2 into A[s5, s5];
end

  33. Rest of Proof • Protection system exactly simulates a TM • Exactly 1 end right in ACM • 1 right in entries corresponds to state • Thus, at most 1 applicable command • If TM enters state qf, then right has leaked • If safety question decidable, then represent TM as above and determine if qf leaks • Implies halting problem decidable • Conclusion: safety question undecidable

  34. Mono-Operational Commands • Answer: yes • Sketch of proof: consider a minimal sequence of commands c1, …, ck that leaks the right • Can omit delete, destroy • Can merge all creates into one • Worst case: insert every right into every entry; with s subjects and o objects initially, and n generic rights, the upper bound is k ≤ n(s+1)(o+1)
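The bound itself is simple arithmetic: after merging the creates there are at most (s+1)(o+1) entries, each of which can receive at most n rights. A tiny worked example with a hypothetical helper name:

def mono_op_bound(s, o, n):
    """Upper bound from the slide on the length of a minimal leaking sequence."""
    return n * (s + 1) * (o + 1)

print(mono_op_bound(s=2, o=3, n=4))   # 48: worst case, every right in every entry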

  35. Take-Grant Protection Model • A specific (not generic) system • Set of rules for state transitions • Safety decidable, and in time linear with the size of the system • Goal: find conditions under which rights can be transferred from one entity to another in the system Jones, Lipton, Snyder 76

  36. System • ○ objects (files, …) • ● subjects (users, processes, …) • ⊗ don't care (either a subject or an object) • G |–x G′: apply a rewriting rule x (witness) to G to get G′ • G |–* G′: apply a sequence of rewriting rules (witness) to G to get G′ • R = { t, g, r, w, … } set of rights

  37. Rules • Take rule: if subject x has an edge to y labeled t, and y has an edge to z labeled α, then an edge from x to z labeled α may be added ("x takes (α to z) from y") • Grant rule: if subject x has an edge to y labeled g, and x has an edge to z labeled α, then an edge from y to z labeled α may be added ("x grants (α to z) to y")

  38. More Rules • Create rule: a subject x may create a new vertex y together with an edge from x to y labeled α ("x creates (α to new vertex) y") • Remove rule: a subject x with an edge to y labeled β may remove rights α ⊆ β from that edge ("x removes (α to) y") • These four rules are called the de jure rules

  39. Example: Shared Buffer • Initially s has grant rights over processes p and q • s sets up a shared buffer b for p and q with the following steps: • s creates ({r, w} to new object) b • s grants ({r, w} to b) to p • s grants ({r, w} to b) to q
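The example can be replayed on a toy protection-graph representation. The edge encoding and function names below are assumptions for illustration, and only the create and grant rules needed here are sketched.

# Protection graph as labeled edges: (from, to) -> set of rights on that edge.
edges = {("s", "p"): {"g"}, ("s", "q"): {"g"}}   # s has grant rights over p and q

def create(edges, creator, new_vertex, rights):
    """creator creates (rights to new vertex)."""
    edges[(creator, new_vertex)] = set(rights)

def grant(edges, granter, grantee, target, rights):
    """granter grants (rights to target) to grantee; needs g over the grantee
    and the rights over the target."""
    if "g" in edges.get((granter, grantee), set()) and \
       set(rights) <= edges.get((granter, target), set()):
        edges.setdefault((grantee, target), set()).update(rights)

# The witness from the slide: s creates buffer b, then grants {r, w} on b to p and q.
create(edges, "s", "b", {"r", "w"})
grant(edges, "s", "p", "b", {"r", "w"})
grant(edges, "s", "q", "b", {"r", "w"})
print(edges[("p", "b")], edges[("q", "b")])      # {'r', 'w'} {'r', 'w'} (order may vary)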

  40. Symmetry • Suppose x and z are subjects, z has an edge to x labeled t, and z has an edge to y labeled α • Then x can acquire α over y, so rights can flow in both directions across a take edge between subjects: • x creates (tg to new vertex) v • z takes (g to v) from x • z grants (α to y) to v • x takes (α to y) from v • Similar result for grant

  41. Islands • tg-path: path of distinct vertices connected by edges labeled t or g • Call them “tg-connected” • island: maximal tg-connected subject-only subgraph • Within an island, any right one vertex has can be shared with any other vertex

  42. Example • [Protection graph with vertices p, q, s, s′, u, v, w, x, y and edges labeled t, g, and r, illustrating islands and tg-paths; diagram not reproducible in text]

  43. can•share Predicate Definition: can•share(r, x, y, G0) if, and only if, there is a sequence of protection graphs G0, …, Gn such that G0 |–* Gn using only de jure rules and in Gn there is an edge from x to y labeled r.

  44. can•share Properties • If x and y are subjects in an island, then can•share(r, x, y, G0) • Proof by induction using the properties of tg-connected subjects • General result: can•share(r, x, y, G0) is decidable using an algorithm of complexity O(|V| + |E|) where V and E are the vertices and edges in the graph • Proof omitted. Sketch given at the end of 3.3.1.
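The island computation behind this result amounts to a breadth-first search that treats t- and g-labeled edges between subjects as undirected, which runs in O(|V| + |E|). The sketch below is illustrative, not the algorithm from the paper; names are hypothetical.

from collections import deque

def island_of(start, subjects, edges):
    """Return the island (maximal tg-connected set of subjects) containing start."""
    # Undirected adjacency list restricted to subjects and edges carrying t or g.
    adj = {s: set() for s in subjects}
    for (u, v), labels in edges.items():
        if u in subjects and v in subjects and labels & {"t", "g"}:
            adj[u].add(v)
            adj[v].add(u)
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for w in adj[u] - seen:
            seen.add(w)
            queue.append(w)
    return seen

# Usage: if x and y land in the same island, can•share(r, x, y, G0) holds for any
# right r that some member of the island already has.
subjects = {"x", "y", "z"}
edges = {("x", "y"): {"t"}, ("y", "z"): {"g", "r"}}
print(island_of("x", subjects, edges))   # {'x', 'y', 'z'} (order may vary)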

  45. Computer Security: Mandatory Access Control. Fall 2006. Based on slides provided by Matt Bishop for use with Computer Security: Art and Science.

  46. Confidentiality Policy • Goal: prevent the unauthorized disclosure of information • Deals with information flow • Integrity incidental • Multi-level security models are best-known examples • Bell-LaPadula Model basis for many, or most, of these

  47. Bell-LaPadula Model, Step 1 • Security levels arranged in linear ordering • Top Secret: highest • Secret • Confidential • Unclassified: lowest • Subjects have security clearance L(s) • Objects have security classification L(o)

  48. Example • Tamara can read all files • Claire cannot read Personnel or E-Mail Files • Ulaley can only read Telephone Lists

  49. Reading Information • Information flows up, not down • “Reads up” disallowed, “reads down” allowed • Simple Security Condition (Step 1) • Subject s can read object o iff L(o) ≤ L(s) and s has permission to read o • Note: combines mandatory control (relationship of security levels) and discretionary control (the required permission) • Sometimes called “no reads up” rule

  50. Writing Information • Information flows up, not down • “Writes up” allowed, “writes down” disallowed • *-Property (Step 1) • Subject s can write object o iff L(s) ≤ L(o) and s has permission to write o • Note: combines mandatory control (relationship of security levels) and discretionary control (the required permission) • Sometimes called “no writes down” rule
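A small sketch of both step-1 rules together, combining the mandatory level comparison with a discretionary permission set. The level encoding, function names, and permission strings are illustrative assumptions.

LEVELS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def can_read(subj_level, obj_level, perms):
    """Simple security condition: L(o) <= L(s) and read permission ("no reads up")."""
    return LEVELS[obj_level] <= LEVELS[subj_level] and "r" in perms

def can_write(subj_level, obj_level, perms):
    """*-property, step 1: L(s) <= L(o) and write permission ("no writes down")."""
    return LEVELS[subj_level] <= LEVELS[obj_level] and "w" in perms

# Usage: a Secret subject may read a Confidential object but not write to it.
print(can_read("Secret", "Confidential", {"r", "w"}))    # True
print(can_write("Secret", "Confidential", {"r", "w"}))   # False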
