Security Policy
CSCI283/172, Fall 2008, GWU
Draws extensively from: Memon's notes, Brooklyn Poly; Pfleeger text, Chapter 5; Bishop's text, Chapter 4; Bishop's slides, Chapter 4
Security Services in an OS
• A general-purpose OS provides the following security mechanisms:
  • Memory protection
  • File protection
  • General object protection
  • Access authentication
• How do we go about designing a "trusted" OS (one that we believe provides the above)?
• We prefer the term "trust" as opposed to "secure".
Need
• Policy: description of requirements
• Model: policy representation
• Design: implementation of policy
• Trust: based on features and assurance
Trust
• Policies, mechanisms and procedures make assumptions, and one trusts that these assumptions hold.
• A system administrator (SA) receives a security patch and installs it. Has she increased the security of the system?
• Aspirin from a drugstore is considered trustworthy. On what basis?
Trust: Example 1
Administrator installs patch:
• Trusts patch came from vendor, not tampered with in transit
• Trusts vendor tested patch thoroughly
• Trusts vendor's test environment corresponds to local environment
• Trusts patch is installed correctly
Trust: Example 2
Aspirin from a drugstore is considered trustworthy. The basis of this trust is:
• Testing and certification by the FDA
• Manufacturing standards of the company, and the regulatory mechanisms that ensure them
• The safety seal on the bottle
Trust: Example 3
Formal verification:
• Gives a formal mathematical proof that, given input i, program P produces output o as specified
• Suppose a security-related program S has been formally verified to work with operating system O
• What are the assumptions?
Trust in Formal Methods
The assumptions are:
• The proof has no errors
  • Bugs in automated theorem provers
• Preconditions hold in the environment in which S is to be used
• S is transformed into an executable S′ whose actions follow the source code
  • Compiler bugs, linker/loader/library problems
• Hardware executes S′ as intended
  • Hardware bugs
Qualities of Security and Trustedness
Security Policy
Consider a computer system to be a finite-state automaton with state transitions. Then:
• A security policy is a statement that partitions the states of a system into authorized and unauthorized states.
• A secure system is a system that starts in an authorized state and cannot enter an unauthorized state.
• A breach of security occurs when a system enters an unauthorized state.
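This state-machine view can be made concrete with a small sketch. The example below is illustrative only (the state names, transition relation, and authorized set are invented, not from the slides): it marks one state as unauthorized and uses a breadth-first search to decide whether the system, started in an authorized state, can ever reach an unauthorized one.

```python
# Illustrative sketch: a system as a finite-state automaton with a policy that
# partitions states into authorized and unauthorized. The system is secure iff
# no unauthorized state is reachable from the start state.
from collections import deque

transitions = {          # hypothetical transition relation
    "s1": ["s2", "s3"],
    "s2": ["s4"],
    "s3": ["s3"],
    "s4": [],
}
authorized = {"s1", "s2", "s3"}   # policy: s4 is an unauthorized state

def is_secure(start, transitions, authorized):
    """True iff no unauthorized state is reachable from start."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state not in authorized:
            return False          # breach: an unauthorized state is reachable
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

print(is_secure("s1", transitions, authorized))   # False: s4 is reachable via s2
```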
Example: Secure System?
[State-transition diagram: states S1–S5 connected by transitions t1–t7; which states must be authorized for this system to be secure?]
Mechanism and Policy
• Example: University policy disallows cheating, i.e., copying another student's homework assignment. Student A has her homework file world-readable. Student B copies it. Who has violated policy?
• Mechanism should not be confused with policy: a security mechanism is an entity or procedure that enforces some part of a security policy.
• A trusted system is expected to enforce the required security policies.
Confidentiality
Confidentiality: Let X be a set of entities and I be some information. Then I has the property of confidentiality with respect to X if no member of X can obtain information about I.
Integrity
Integrity: Let X be a set of entities and I some information or a resource. Then I has the property of integrity with respect to X if all members of X trust I. I can be:
• data (data integrity)
• the source of data (data authentication)
• a resource (assurance)
Availability
Availability: Let X be a set of entities and I a resource. Then I has the property of availability with respect to X if all members of X can access I.
Example
• Let X be the set of students in CS283.
• Let I be:
  • the set of student grades on the test
  • the set of class notes on the website
  • the US's military policy in Iraq
• For each of the above, under the usual policy, does I enjoy confidentiality, integrity and availability with respect to X?
• What if X is a single student in CS283, and I is this student's grade on the last exam?
Elements of a Security Policy
• A security policy considers all relevant aspects of confidentiality, integrity and availability.
• Confidentiality policy: identifies information leakage and controls information flow.
• Integrity policy: identifies authorized ways in which information may be altered. Enforces separation of duties among individuals to ensure robustness against single corrupt individuals.
• Availability policy: describes what services must be provided. Example: a browser may download pages but no Java applets.
Security Model
A security model is a model that represents a particular policy or set of policies. It is important to abstract out specific characteristics, and to restrict the policies covered, as nothing non-trivial can be said about the class of all policies.
Types of Security Policies
• A military security policy (also called a government security policy) is a security policy developed primarily to provide confidentiality.
  • Not worrying about trusting the object as much as disclosing the object
• A commercial security policy is a security policy developed primarily to provide a combination of confidentiality and integrity.
  • Focus on how much the object can be trusted
• Also: confidentiality policy and integrity policy.
Types of Access Control
• Discretionary Access Control (DAC, IBAC)
  • individual user sets the access control mechanism to allow or deny access to an object
• Mandatory Access Control (MAC)
  • system mechanism controls access to the object, and the individual cannot alter that access
• Originator Controlled Access Control (ORCON)
  • originator (creator) of information controls who can access the information
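The sketch below contrasts the first two. It is illustrative only: the users, file, labels, and rules are all invented for the example, and the MAC check uses a simple clearance-versus-label comparison rather than any specific model from these slides.

```python
# Illustrative sketch: discretionary vs. mandatory access control.

# DAC: the object's owner maintains the access control list and may change it.
acl = {"homework.txt": {"owner": "alice", "readers": {"alice"}}}

def dac_allows(user, obj):
    return user in acl[obj]["readers"]

def dac_grant(requester, obj, user):
    if requester == acl[obj]["owner"]:     # only the owner may alter access
        acl[obj]["readers"].add(user)

# MAC: the system assigns labels and clearances; individual users cannot alter them.
clearance = {"alice": 2, "bob": 1}         # hypothetical levels: 2 is more sensitive
label = {"homework.txt": 2}

def mac_allows(user, obj):
    return clearance[user] >= label[obj]   # fixed system rule

dac_grant("alice", "homework.txt", "bob")  # Alice, as owner, grants Bob read access
print(dac_allows("bob", "homework.txt"))   # True: the owner's choice governs
print(mac_allows("bob", "homework.txt"))   # False: the system rule still denies Bob
```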
Secure, Precise Mechanisms
Can one devise a procedure for developing a mechanism that is both secure and precise?
• We consider only confidentiality policies here; integrity policies produce the same result.
Secure, Precise Mechanisms: Math. Definition
A program is a function with multiple inputs and one output. Let p be a function p: I1 × … × In → R. Then p is a program with n inputs ik ∈ Ik, 1 ≤ k ≤ n, and one output r ∈ R.
The Function: Explanation of math definition
• The function is what needs to be provided to the user
• p is what the user needs to obtain
Programs and Postulates
• Observability Postulate: the output of a function encodes all available information about its inputs
  • Covert channels are considered part of the output
• Example: an authentication function
  • Inputs: name, password; output: Good or Bad
  • If the name is invalid, immediately print Bad; else access the database
  • Problem: from the time it takes to output Bad, one can determine whether the name is valid
  • This means timing is part of the output
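The timing leak can be seen in a small sketch. Everything here is invented for illustration (the user names, password, and the sleep standing in for the database lookup):

```python
# Illustrative sketch of the timing leak: both failures return "Bad", but an
# observer who measures elapsed time can tell whether the name was valid.
import time

password_db = {"alice": "hunter2"}        # hypothetical password database

def authenticate(name, password):
    if name not in password_db:
        return "Bad"                      # invalid name: returns immediately
    time.sleep(0.05)                      # stand-in for the database access cost
    return "Good" if password_db[name] == password else "Bad"

def timed(name, password):
    start = time.perf_counter()
    result = authenticate(name, password)
    return result, round(time.perf_counter() - start, 3)

print(timed("mallory", "x"))   # ('Bad', ~0.0)  -> fast: name invalid
print(timed("alice", "x"))     # ('Bad', ~0.05) -> slow: name valid
```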
Protection Mechanism: Math. Definition
Let p: I1 × … × In → R. A protection mechanism m for p is a function m: I1 × … × In → R ∪ E such that, when ik ∈ Ik, 1 ≤ k ≤ n, either
• m(i1, ..., in) = p(i1, ..., in), or
• m(i1, ..., in) ∈ E.
E is the set of error outputs; m defines what errors are allowed.
Protection Mechanism: Explanation of math definition
The protection mechanism m for p provides either:
• the value of p, or
• an error.
It can provide nothing else: no other function, for example. The error may be security-related, parameter-related (e.g. divide by zero) or reliability-related.
Protection Mechanism: Explanation of math definition
[Diagram: the inputs i1, i2, …, in flow into the mechanism m, which either invokes p and returns p(i1, i2, …, in), or returns an error.]
General Security Policy
Example: reveal a value only if it is smaller than 5.
This says nothing about the function's input parameters, only about its output.
Example 1
p(user, password, D) = Good, if (user, password) ∈ D
                     = Bad, otherwise

m(user, password, D) = Good, if (user, password) ∈ D
                     = Bad, if (user, password) ∉ D
                     = e, otherwise, where e ∈ E
E = { "Password Database Missing", "Password Database Locked" }
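A sketch of this example, under one possible modeling choice that is not in the slides: the database argument is a set of (user, password) pairs, None if the database is missing, or the sentinel string "locked" if it is locked.

```python
# Illustrative encoding of Example 1: m either agrees with p or returns an error from E.
E = {"Password Database Missing", "Password Database Locked"}

def p(user, password, D):
    return "Good" if (user, password) in D else "Bad"

def m(user, password, D):
    if D is None:                         # modeling choice: None means missing
        return "Password Database Missing"
    if D == "locked":                     # modeling choice: sentinel for a locked DB
        return "Password Database Locked"
    return p(user, password, D)           # otherwise m(i) = p(i)

D = {("alice", "hunter2")}
print(m("alice", "hunter2", D))       # Good
print(m("alice", "wrong", D))         # Bad
print(m("alice", "hunter2", None))    # Password Database Missing
```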
Example 2
p(n, Mm, Mf, nm, nf, s1, g1, s2, g2, …, sn, gn) = (nm Mm + nf Mf) / (nm + nf), when nm + nf ≠ 0
E = { "nm + nf = 0" }
where:
n = total number of records
Mm = average salary of the men
Mf = average salary of the women
nm = number of men
nf = number of women
si = salary of the ith record
gi = gender of the ith record
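A sketch of this example (the record values are invented): p uses only the aggregate inputs, and the mechanism m guards the single undefined case with the one error output in E.

```python
# Illustrative encoding of Example 2.
E = {"nm + nf = 0"}

def p(n, Mm, Mf, nm, nf, records):
    # records holds the (si, gi) pairs; the formula only needs the aggregates
    return (nm * Mm + nf * Mf) / (nm + nf)

def m(n, Mm, Mf, nm, nf, records):
    if nm + nf == 0:
        return "nm + nf = 0"              # the single error output in E
    return p(n, Mm, Mf, nm, nf, records)  # otherwise m agrees with p

records = [(60000, "m"), (80000, "m"), (70000, "f")]
print(m(3, 70000, 70000, 2, 1, records))  # 70000.0
print(m(0, 0, 0, 0, 0, []))               # nm + nf = 0
```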
Confidentiality Policy: Math. Definition
A confidentiality policy for program p says which inputs can be revealed:
c: I1 × … × In → A,
where A ⊆ I1 × … × In is the set of inputs available to the observer.
Confidentiality Policy: Explanation of math definition
The confidentiality policy for program p says which inputs can be revealed to a mechanism that determines the function; i.e., the inputs are not directly revealed to the user.
[Diagram: c passes along only some of the values i1, i2, …, in.]
Our Example
Suppose c(i1, i2, i3) = allow(i1, i2, i3) = (i1, i2),
i.e. allow(user, password, D) = (user, password).
Example 2
c(n, Mm, Mf, nm, nf, s1, g1, s2, g2, …, sn, gn) = ?
Security Mechanism: Math. Definition
Let m: I1 × … × In → R ∪ E. Then m is secure if and only if there exists an m′: A → R ∪ E such that, for all ik ∈ Ik, 1 ≤ k ≤ n,
m(i1, ..., in) = m′(c(i1, ..., in)).
That is, m returns values consistent with c.
Security Mechanism: Explanation of math definition
Is there some m′ which can operate only on the values allowed by the confidentiality policy c and still produce the same outputs as m?
Security Mechanism: Explanation of math definition
[Diagram: m computes p(i1, i2, …, in) or an error from all of the inputs; the question is whether an m′ that sees only c(i1, i2, …, in) can produce the same outputs.]
Our Example
It is not possible to find such an m′; hence m is not secure as an enforcement mechanism of p for the policy allow.
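An equivalent way to read the definition: m factors through some m′ on c's outputs only if any two inputs that c maps to the same value get the same output from m. The brute-force check below, over a tiny invented input space, shows that the password-checking mechanism fails this under allow: the same (user, password) with different databases D gives different answers.

```python
# Illustrative check: m is secure under c only if c(x) == c(y) implies m(x) == m(y).
from itertools import product

def p(user, password, D):
    return "Good" if (user, password) in D else "Bad"

m = p                                   # here m simply computes p (no extra errors)

def c(user, password, D):
    return (user, password)             # the policy "allow" reveals only these

users, passwords = ["alice"], ["hunter2"]
databases = [frozenset(), frozenset({("alice", "hunter2")})]   # invented D values
inputs = list(product(users, passwords, databases))

secure = all(m(*x) == m(*y)
             for x, y in product(inputs, inputs) if c(*x) == c(*y))
print(secure)   # False: same (user, password), different D, different m output
```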
Example 2
m′( ) = ?
• What if m had been additionally restrictive? Suppose m also gives errors when either nm or nf is zero?
• Suppose it is unnecessarily restrictive: suppose m gives errors when the mean value is too small? Or when n is too small?
Precision: Math. Definition
Let m1, m2 be distinct protection mechanisms for program p under policy c.
• m1 is as precise as m2 (m1 ≈ m2) if, for all inputs i1, …, in,
  m2(i1, …, in) = p(i1, …, in) ⇒ m1(i1, …, in) = p(i1, …, in).
• m1 is more precise than m2 (m1 ~ m2) if there is an input (i1′, …, in′) such that m1(i1′, …, in′) = p(i1′, …, in′) and m2(i1′, …, in′) ≠ p(i1′, …, in′).
Precision: Explanation of math definition
• m1 as precise as m2 (m1 ≈ m2) means that if m2 does not give an error, then m1 does not. Think of m1 as at least as precise as m2.
  • If m2 does give an error, this says nothing about what m1 does: it may or may not give an error.
• m1 more precise than m2 (m1 ~ m2) means that there are some places where m2 gives an error but m1 does not, i.e. at least once, m2 gives an error and m1 does not.
More Examples
• c(i1, ..., in) = C, a constant
  • Denies the observer any information (output does not vary with inputs)
• c(i1, ..., in) = (i1, ..., in), and m′ = m
  • Allows the observer full access to the information
• c(i1, ..., in) = i1
  • Allows the observer information about the first input but no information about the other inputs
Example
Let m1 be the mean of the salaries, giving an error message only when n = 0.
Let m2 be the mean of the salaries, giving an error message when n is small.
Let m3 be the mean of the salaries, giving an error message when the mean is small.
Then: m1 ≈ m2, m1 ≈ m3, m1 ~ m2, m1 ~ m3, and m2 and m3 cannot be compared.
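A sketch of this comparison, with simplified signatures (each mechanism takes just the list of salaries) and invented thresholds for "small", checked over a few sample inputs:

```python
# Illustrative comparison of the three mean-salary mechanisms.
ERR = "error"

def p(salaries):
    return sum(salaries) / len(salaries)

def m1(salaries):                       # error only when n == 0
    return ERR if len(salaries) == 0 else p(salaries)

def m2(salaries):                       # error when n is small (here: n <= 2)
    return ERR if len(salaries) <= 2 else p(salaries)

def m3(salaries):                       # error when the mean is small (here: <= 40000)
    return ERR if p(salaries) <= 40000 else p(salaries)

def as_precise_as(ma, mb, samples):     # ma ≈ mb: wherever mb matches p, so does ma
    return all(ma(s) == p(s) for s in samples if mb(s) == p(s))

def more_precise(ma, mb, samples):      # ma ~ mb: somewhere ma matches p and mb does not
    return any(ma(s) == p(s) and mb(s) != p(s) for s in samples)

samples = [[30000], [30000, 90000], [20000, 25000, 30000], [60000, 70000, 80000]]
print(as_precise_as(m1, m2, samples), more_precise(m1, m2, samples))  # True True
print(as_precise_as(m2, m3, samples), more_precise(m2, m3, samples))  # False True
print(as_precise_as(m3, m2, samples), more_precise(m3, m2, samples))  # False True: incomparable
```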
Combining Mechanisms
• m1, m2 protection mechanisms
• m3 = m1 ∪ m2
  • For inputs on which either m1 or m2 returns the same value as p, m3 does also; otherwise, m3 returns the same value as m1
• Theorem: if m1, m2 are secure, then m3 is secure
  • Also, m3 ≈ m1 and m3 ≈ m2
  • Follows from the definitions of secure, precise, and m3
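A minimal sketch of this union construction, using mean-salary style component mechanisms with invented thresholds (none of the names or numbers come from the slides):

```python
# Illustrative union of two protection mechanisms: the combination returns p's
# value whenever either component does, and otherwise falls back to the first.
ERR = "error"

def p(salaries):
    return sum(salaries) / len(salaries)

def m_small_n(salaries):                 # errors when n <= 2
    return ERR if len(salaries) <= 2 else p(salaries)

def m_small_mean(salaries):              # errors when the mean <= 40000
    return ERR if p(salaries) <= 40000 else p(salaries)

def union(ma, mb, prog):
    def combined(*inputs):
        value = prog(*inputs)
        if ma(*inputs) == value or mb(*inputs) == value:
            return value                 # at least one component already matches prog
        return ma(*inputs)               # otherwise return ma's output
    return combined

m_union = union(m_small_n, m_small_mean, p)
print(m_union([30000, 90000]))           # 60000.0: the mean mechanism covers small n
print(m_union([20000, 25000, 30000]))    # 25000.0: the count mechanism covers small mean
print(m_union([20000, 30000]))           # error: neither component matches p
```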
Example: m2 ∪ m3
m2 gives an error when n ≤ 1.
m3 gives an error when the average salary ≤ 50,000.
What is m2 ∪ m3? (Start with an example: s1 = 300,000 and s2 = 30,000.)
Example
p(n, Mm, Mf, nm, nf, s1, g1, s2, g2, …, sn, gn) = (nm Mm + nf Mf) / (nm + nf)
m2 gives an error when nm ≤ Nm; m3 gives an error when nf ≤ Nf.
m2 ∪ m3 = ? How does it change with the values of Nm and Nf?
Existence Theorem
For any program p and security policy c, there exists a precise, secure mechanism m* such that, for all secure mechanisms m associated with p and c, m* ≈ m.
• Maximally precise mechanism
• Ensures security
• Minimizes the number of denials of legitimate actions
Example
The previous example, as the values of Nm and Nf change.
Lack of Effective Procedure
There is no effective procedure that determines a maximally precise, secure mechanism for any policy and program.