Systematic Testing and Verification of Security Policies
Tao Xie, Department of Computer Science, North Carolina State University
https://sites.google.com/site/asergrp/projects/policy
Joint work with Vincent Hu, Rick Kuhn, and the ACTS group (NIST); JeeHyun Hwang, Evan Martin (NCSU); Alex Liu (MSU)
Motivation
• Digital information is easy to access and easy to search
• Sensitive information requires access control mechanisms
• Security policies are widely used for access control
  • Access control policies for applications
  • Firewall policies for networks
Motivation (cont.)
• How can we ensure the correct specification of security policies?
  • What you specify is what you get, but not necessarily what you want
• Solution: systematic testing and verification of security policies
Example Access Control Policy
• Subjects: Student, Faculty
• Actions: Assign, Receive
• Resources: Grades
Rule 1: IF (faculty AND assign AND grades) → Permit
Rule 2: IF (student AND receive AND grades) → Permit
Rule 3: OTHERWISE → Deny
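To make the rule semantics concrete, here is a minimal Java sketch of the example policy with first-applicable semantics over boolean attributes; the class and attribute names are illustrative, and this is not XACML syntax.

```java
// A minimal sketch (not XACML) of the example policy's first-applicable
// semantics over boolean attributes; all names here are illustrative.
import java.util.Map;

public class ExamplePolicy {

    // Returns "Permit" or "Deny" for a request given as attribute flags.
    static String evaluate(Map<String, Boolean> req) {
        boolean faculty = req.getOrDefault("faculty", false);
        boolean student = req.getOrDefault("student", false);
        boolean assign  = req.getOrDefault("assign", false);
        boolean receive = req.getOrDefault("receive", false);
        boolean grades  = req.getOrDefault("grades", false);

        if (faculty && assign && grades) return "Permit";   // Rule 1
        if (student && receive && grades) return "Permit";  // Rule 2
        return "Deny";                                       // Rule 3 (otherwise)
    }

    public static void main(String[] args) {
        System.out.println(evaluate(Map.of("faculty", true, "assign", true, "grades", true)));  // Permit
        System.out.println(evaluate(Map.of("student", true, "assign", true, "grades", true)));  // Deny
    }
}
```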
Policy Verification
• Verify the policy against specified properties
• What properties can you come up with for this policy?
Rule 1: IF (faculty AND assign AND grades) → Permit
Rule 2: IF (student AND receive AND grades) → Permit
Rule 3: OTHERWISE → Deny
Policy Verification
Property: a student can never assign grades
Rule 1: IF (faculty AND assign AND grades) → Permit
Rule 2: IF (student AND receive AND grades) → Permit
Rule 3: OTHERWISE → Deny
Violated, with a counterexample request: a subject holding both the faculty and student roles assigning grades (Rule 1 permits it)
Policy Verification
"When the specification language is sufficiently declarative, users have great difficulty providing a duplicate statement of behavior." --- Shriram Krishnamurthi [RiseandRise 08]
Rule 1: IF (faculty AND assign AND grades) → Permit
Rule 2: IF (student AND receive AND grades) → Permit
Rule 3: OTHERWISE → Deny
Our Approaches
• Systematic policy verification
  • Property inference [POLICY 06, SSIRI 09, DBSec 10]
  • Property-quality assessment [ACASC 08]
  • Properties derived from access control models [POLICY 10DE]
• Systematic policy testing
  • Structural coverage criteria [ICICS 06]
  • Fault models/mutation testing [WWW 07]
  • Test generation [SESS 07]
  • Policy engine performance [SIGMETRICS 08, TC]
  • Policy engine correctness [TAV-WEB 08]
  • Firewall policy testing/fixing [SRDS 08/09, LISA 10]
• Targets: XACML policies, XACML engines, firewall policies
XACML
• A standard access control policy language used to express access control policies: who can do what when
• A request/response language used to express
  • queries about whether access should be allowed (requests), and
  • answers to those queries (responses)
• http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=xacml
XACML Policy Structure
• A policy set holds other policies or policy sets.
• A policy is expressed as a set of rules.
• Rules have targets and a set of conditions that determine whether the rule applies to a given request.
• Both rule- and policy-combining algorithms exist to reconcile conflicts.
(Diagram: a policy set with a target contains a policy with a target, which contains rules rule1 and rule2, each with a target and a condition cond1/cond2)
• http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=xacml
A Simple Scenario
• A Subject who wishes to perform an Action on a Resource must do so through a Policy Enforcement Point (PEP).
• The PEP forms the XACML request and sends it to the Policy Decision Point (PDP).
• The PDP checks the request against the Policy and returns an XACML response.
• The PEP either Permits or Denies access to the Resource.
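The PEP/PDP split above can be sketched with two small Java types; these interfaces are hypothetical stand-ins for illustration, not the Sun XACML API.

```java
// Illustrative sketch of the PEP/PDP interaction described above; the
// interfaces and names are hypothetical, not the Sun XACML API.
public class PepPdpSketch {

    record Request(String subject, String action, String resource) {}

    interface Pdp {                       // Policy Decision Point
        String decide(Request request);   // returns "Permit" or "Deny"
    }

    static class Pep {                    // Policy Enforcement Point
        private final Pdp pdp;
        Pep(Pdp pdp) { this.pdp = pdp; }

        boolean enforce(String subject, String action, String resource) {
            // 1. Form the request and send it to the PDP.
            String decision = pdp.decide(new Request(subject, action, resource));
            // 2. Enforce the PDP's decision: only "Permit" grants access.
            return "Permit".equals(decision);
        }
    }

    public static void main(String[] args) {
        Pdp pdp = r -> ("faculty".equals(r.subject()) && "assign".equals(r.action())
                        && "grades".equals(r.resource())) ? "Permit" : "Deny";
        Pep pep = new Pep(pdp);
        System.out.println(pep.enforce("faculty", "assign", "grades")); // true
        System.out.println(pep.enforce("student", "assign", "grades")); // false
    }
}
```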
Software Testing vs. Policy Testing
• Software testing: test inputs are run against a program; test outputs are compared with expected outputs.
• Policy testing: requests are evaluated against a policy; responses are compared with expected responses.
Research Problems and Solutions
• Test generation (request generation)
  • Policy coverage criteria
  • Random request generation
  • Request generation based on change-impact analysis
  • Mutation testing to assess fault-detection capability
• Test-result inspection (response inspection)
  • Request selection and minimization based on structural coverage
Structural Policy Coverage Criteria
• A policy is covered if its target matches a request.
• A rule is covered if its target matches a request.
• A condition must evaluate to both True and False (across the request set) to be covered entirely.
(Diagram: policy target, rule1/rule2 targets, and conditions cond1/cond2 annotated with these criteria)
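A rough sketch, reusing the running example policy, of how rule and condition coverage could be recorded while requests are evaluated; the bookkeeping shown is an illustrative assumption, not the papers' tooling.

```java
// A rough sketch, assuming the example policy above, of recording which
// rules were covered and which condition outcomes (true/false) were seen.
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class CoverageSketch {

    record Request(boolean faculty, boolean student, boolean assign,
                   boolean receive, boolean grades) {}

    static final Set<String> coveredRules = new HashSet<>();
    // A condition is fully covered once it has evaluated to both true and false.
    static final Set<String> conditionOutcomes = new HashSet<>();

    static String evaluate(Request r) {
        boolean c1 = r.faculty() && r.assign() && r.grades();
        conditionOutcomes.add("cond1=" + c1);
        if (c1) { coveredRules.add("rule1"); return "Permit"; }

        boolean c2 = r.student() && r.receive() && r.grades();
        conditionOutcomes.add("cond2=" + c2);
        if (c2) { coveredRules.add("rule2"); return "Permit"; }

        coveredRules.add("rule3");
        return "Deny";
    }

    public static void main(String[] args) {
        List<Request> requests = List.of(
            new Request(true, false, true, false, true),     // hits rule1
            new Request(false, true, false, true, true),     // hits rule2
            new Request(false, false, false, false, false)); // falls through to rule3
        requests.forEach(CoverageSketch::evaluate);
        System.out.println("Rules covered: " + coveredRules);
        System.out.println("Condition outcomes seen: " + conditionOutcomes);
    }
}
```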
Random Request Generation
• The example policy:
  • Subjects: Student, Faculty
  • Actions: Assign, Receive
  • Resources: Grades
• Model the set of attribute values as a vector of bits and randomize the bits
(Diagram: example randomized bit vectors over the attributes Student, Faculty, Assign, Receive, Grades)
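A minimal sketch of this idea: represent a request as a bit vector over the attribute values and randomize the bits; the attribute ordering and fixed seed are illustrative choices.

```java
// A minimal sketch of random request generation: model the attribute values
// (Student, Faculty, Assign, Receive, Grades) as a bit vector and randomize it.
import java.util.Random;

public class RandomRequestGeneration {

    static final String[] ATTRIBUTES = {"Student", "Faculty", "Assign", "Receive", "Grades"};

    // Generate one random request as a boolean vector over the attributes.
    static boolean[] randomRequest(Random rng) {
        boolean[] bits = new boolean[ATTRIBUTES.length];
        for (int i = 0; i < bits.length; i++) {
            bits[i] = rng.nextBoolean();   // each attribute value present or absent
        }
        return bits;
    }

    public static void main(String[] args) {
        Random rng = new Random(42);       // fixed seed for a repeatable example
        for (int n = 0; n < 3; n++) {
            boolean[] req = randomRequest(rng);
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < req.length; i++) {
                sb.append(ATTRIBUTES[i]).append('=').append(req[i] ? 1 : 0).append(' ');
            }
            System.out.println(sb.toString().trim());
        }
    }
}
```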
Cirg: Change-Impact Request Generation
(Pipeline: the policy goes through version synthesis to produce policy versions; change-impact analysis on the versions produces counterexamples; request generation turns the counterexamples into requests)
Cirg Example
IF (faculty AND assign AND grades) → Permit
ELSE IF (student AND receive AND grades) → Permit
ELSE → Deny
• Change-impact analysis between the original policy and a synthesized version yields the counterexample request (faculty, assign, grades), whose decision differs between the two versions (Permit vs. Deny); evaluating this request covers Rule 1.
Synthesized Versions
Rationale: synthesize two versions whose differences are coverage targets
• All-to-Empty
• One-to-Empty
• One-Increment
• All-to-Minus-One
• All-to-Change-One-Effect
Margrave: Change-Impact Analysis Tool [Fisler et al. ICSE 05]
• Represents policies as Multi-Terminal Decision Diagrams
• Example policy: Faculty (f) can assign (a) grades (g); Students (s) can receive (r) grades (g)
Margrave Sample Output
Attribute legend:
1: /Subject, role, Faculty/
2: /Subject, role, Student/
3: /Resource, resource-class, ExternalGrades/
4: /Resource, resource-class, InternalGrades/
5: /Action, command, Assign/
6: /Action, command, View/
7: /Action, command, Receive/
8: /Subject, role, TA/
Changed requests (bits 1-8; N->P means the decision changes from NotApplicable to Permit):
{ 00010101 N->P, 00011001 N->P, 00100101 N->P, 00101001 N->P, 01010101 N->P, 01011001 N->P, 01100101 N->P, 01101001 N->P }
Software Mutation Testing
(Workflow: mutation operators drive a mutator that derives mutant programs from the original program; the test inputs are run against both; if the test outputs and the mutant outputs differ, the mutant is killed)
Policy Mutation Testing
(Workflow: mutation operators drive a mutator that derives mutant policies from the original policy; the requests are evaluated against both; if the responses and the mutant responses differ, the mutant is killed)
Components of the Mutation Testing Framework
(The same workflow as above: requests, policy, mutation operators, mutator, mutant policies, responses vs. mutant responses, and the kill check; the following slides walk through these components)
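The kill check at the center of the framework can be sketched as follows; the hard-coded original policy and mutant (a Change Rule Effect mutant of Rule 1) are illustrative stand-ins for the generated artifacts.

```java
// A minimal sketch of the kill check: a mutant policy is killed when some
// request gets a different response from the mutant than from the original.
import java.util.List;
import java.util.function.Function;

public class MutantKillCheck {

    record Request(boolean faculty, boolean student, boolean assign,
                   boolean receive, boolean grades) {}

    // Original example policy (first-applicable).
    static String original(Request r) {
        if (r.faculty() && r.assign() && r.grades()) return "Permit";
        if (r.student() && r.receive() && r.grades()) return "Permit";
        return "Deny";
    }

    // A mutant whose first rule's effect was changed from Permit to Deny.
    static String mutant(Request r) {
        if (r.faculty() && r.assign() && r.grades()) return "Deny";
        if (r.student() && r.receive() && r.grades()) return "Permit";
        return "Deny";
    }

    // The mutant is killed if any request yields differing responses.
    static boolean killed(Function<Request, String> orig,
                          Function<Request, String> mut,
                          List<Request> requests) {
        return requests.stream().anyMatch(r -> !orig.apply(r).equals(mut.apply(r)));
    }

    public static void main(String[] args) {
        List<Request> requests = List.of(
            new Request(true, false, true, false, true),   // exercises rule 1
            new Request(false, true, false, true, true));  // exercises rule 2
        System.out.println(killed(MutantKillCheck::original,
                                  MutantKillCheck::mutant, requests)); // true
    }
}
```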
Research Questions
• Does test selection based on structural coverage criteria produce request sets with high fault-detection capability?
• What are the individual characteristics of each mutation operator?
  • Are some more difficult to kill than others?
  • Are some easily killed by request sets selected based on structural coverage criteria?
Sample Policies • continue: 51 policies, 56 rules
# of Requests Generated and Selected • continue: 373 (cirg), 500 (random), 32 (reduction)
Coverage Results • continue: 32% RuleCov (random) vs. 98% RuleCov (cirg)
Mutation Operators, Mutation, and Equivalent Mutant Detection
(Framework workflow repeated, highlighting the mutation operators, the mutator, and equivalent-mutant detection)
Mutation Operators • Each operator mutates a different policy element: policy set, policy, rule, condition, and/or their associated targets and effects.
Change Rule Effect (CRE) Example
IF (faculty AND assign AND grades) → Permit
ELSE IF (student AND receive AND grades) → Permit
ELSE → Deny
• The CRE mutation operator is applied to each rule in turn and flips its decision effect (Permit ↔ Deny); here, Rule 1's effect changes from Permit to Deny.
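A rough sketch of a CRE-style operator over a simple rule-list representation: one mutant per rule, with only that rule's effect flipped. The Rule record is an illustrative assumption, not the XACML mutator used in the papers.

```java
// A rough sketch of the Change Rule Effect (CRE) operator: for each rule of
// a policy, emit one mutant whose effect is flipped (Permit <-> Deny).
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

public class CreOperator {

    record Request(boolean faculty, boolean student, boolean assign,
                   boolean receive, boolean grades) {}

    record Rule(Predicate<Request> condition, String effect) {}

    // One CRE mutant per rule: only that rule's effect is flipped.
    static List<List<Rule>> creMutants(List<Rule> policy) {
        List<List<Rule>> mutants = new ArrayList<>();
        for (int i = 0; i < policy.size(); i++) {
            List<Rule> mutant = new ArrayList<>(policy);
            Rule r = policy.get(i);
            String flipped = r.effect().equals("Permit") ? "Deny" : "Permit";
            mutant.set(i, new Rule(r.condition(), flipped));
            mutants.add(mutant);
        }
        return mutants;
    }

    public static void main(String[] args) {
        List<Rule> policy = List.of(
            new Rule(r -> r.faculty() && r.assign() && r.grades(), "Permit"),   // Rule 1
            new Rule(r -> r.student() && r.receive() && r.grades(), "Permit"),  // Rule 2
            new Rule(r -> true, "Deny"));                                        // Rule 3
        System.out.println(creMutants(policy).size() + " CRE mutants generated"); // 3
    }
}
```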
Equivalent Mutant Detection
• An equivalent mutant is semantically equivalent to, although syntactically different from, the original policy.
• Equivalent mutants provide no value and waste resources.
• We use change-impact analysis to detect equivalent mutants and remove them.
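The approach uses change-impact analysis (Margrave) for this step; the sketch below is only a brute-force stand-in that exhausts the small boolean request space of the running example, and the "reordered rules" mutant is hand-crafted just to show an equivalent mutant being detected.

```java
// Brute-force stand-in for change-impact analysis: exhaust all 2^5 boolean
// requests of the running example and compare decisions of two policies.
import java.util.function.Function;

public class EquivalentMutantCheck {

    record Request(boolean faculty, boolean student, boolean assign,
                   boolean receive, boolean grades) {}

    static String original(Request r) {
        if (r.faculty() && r.assign() && r.grades()) return "Permit";
        if (r.student() && r.receive() && r.grades()) return "Permit";
        return "Deny";
    }

    // A hand-crafted mutant with rules 1 and 2 swapped; since both effects are
    // Permit, the swap cannot change any decision, so the mutant is equivalent.
    static String mutantReordered(Request r) {
        if (r.student() && r.receive() && r.grades()) return "Permit";
        if (r.faculty() && r.assign() && r.grades()) return "Permit";
        return "Deny";
    }

    // Exhaustively compare decisions over all 2^5 boolean requests.
    static boolean equivalent(Function<Request, String> a, Function<Request, String> b) {
        for (int bits = 0; bits < 32; bits++) {
            Request r = new Request((bits & 1) != 0, (bits & 2) != 0, (bits & 4) != 0,
                                    (bits & 8) != 0, (bits & 16) != 0);
            if (!a.apply(r).equals(b.apply(r))) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(equivalent(EquivalentMutantCheck::original,
                                      EquivalentMutantCheck::mutantReordered)); // true
    }
}
```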
Request Evaluation and Mutant Detection
(Framework workflow repeated, highlighting request evaluation against the original and mutant policies and the comparison of their responses)
Sun’s XACML implementation • An open source implementation of the XACML standard in Java • Developed by Sun as part of an ongoing project on Internet Authorization in the Internet Security Research Group • http://sunxacml.sourceforge.net/
Our Approaches
• Systematic policy verification
  • Property inference [POLICY 06, SSIRI 09, DBSec 10]
  • Property-quality assessment [ACASC 08]
  • Properties derived from access control models [POLICY 10DE]
• Systematic policy testing
  • Structural coverage criteria [ICICS 06]
  • Fault models/mutation testing [WWW 07]
  • Test generation [SESS 07]
  • Policy engine performance [SIGMETRICS 08, TC]
  • Policy engine correctness [TAV-WEB 08]
  • Firewall policy testing/fixing [SRDS 08/09, LISA 10]
• Targets: XACML policies, XACML engines, firewall policies
Firewall Policy Structure
• A policy is expressed as a set of rules.
• A rule is represented as <predicate> → <decision>
• A <predicate> is a set of <clauses>
(Example figure: in a firewall policy, the range in each field of rule r1 is a <clause>; the clauses together form r1's <predicate>, and r1's action is its <decision>)
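A minimal sketch of this structure, assuming integer-encoded packet fields and first-match semantics; the field set and example rules are illustrative.

```java
// A minimal sketch of the structure above: a rule is <predicate> -> <decision>,
// the predicate being one clause (a range) per packet field.
import java.util.List;

public class FirewallPolicySketch {

    // One clause: an inclusive range over a single (integer-encoded) field.
    record Clause(long low, long high) {
        boolean matches(long value) { return low <= value && value <= high; }
    }

    // <predicate> (one clause per field) -> <decision>
    record Rule(List<Clause> predicate, String decision) {
        boolean matches(long[] packet) {
            for (int i = 0; i < predicate.size(); i++) {
                if (!predicate.get(i).matches(packet[i])) return false;
            }
            return true;
        }
    }

    // First-match semantics over the rule list.
    static String evaluate(List<Rule> policy, long[] packet) {
        for (Rule r : policy) {
            if (r.matches(packet)) return r.decision();
        }
        return "discard";  // conservative default if no rule matches
    }

    public static void main(String[] args) {
        // Fields (illustrative): source IP, destination IP, destination port, protocol.
        List<Rule> policy = List.of(
            new Rule(List.of(new Clause(0, Long.MAX_VALUE), new Clause(0, Long.MAX_VALUE),
                             new Clause(80, 80), new Clause(6, 6)), "accept"),      // r1: allow TCP/80
            new Rule(List.of(new Clause(0, Long.MAX_VALUE), new Clause(0, Long.MAX_VALUE),
                             new Clause(0, 65535), new Clause(0, 255)), "discard")); // r2: default deny
        System.out.println(evaluate(policy, new long[] {10, 20, 80, 6}));   // accept
        System.out.println(evaluate(policy, new long[] {10, 20, 22, 6}));   // discard
    }
}
```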
Structural Coverage Definitions
Rationale: when the policy part containing a fault is not evaluated (i.e., not "covered"), the fault is often not exposed.
• Rule coverage of a policy P by packet set T = (# rules evaluated by at least one packet in T) / (# rules in P)
• Predicate coverage of a policy P by packet set T = (# predicates evaluated to true or false by T at least once) / (2 × # predicates in P)
• Clause coverage of a policy P by packet set T = (# clauses evaluated to true or false by T at least once) / (2 × # clauses in P)
Test Packet Generation
Objective: generate packets that achieve high structural coverage
• Random packet generation: randomly select values for each packet field
• Packet generation based on local constraint solving: consider individual rules in the policy
• Packet generation based on global constraint solving: consider multiple rules in the policy
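A rough sketch of the first two strategies under a rule model like the one above: purely random packets versus packets that satisfy each clause of a single rule in isolation (one reading of "local constraint solving"); the field domains and bounds are illustrative assumptions.

```java
// Sketch of two generation strategies: random packets vs. "local constraint
// solving" in the sense of picking, for one rule in isolation, a value inside
// every clause of that rule so the generated packet matches it.
import java.util.List;
import java.util.Random;

public class PacketGenerationSketch {

    record Clause(long low, long high) {}
    record Rule(List<Clause> predicate, String decision) {}

    // Random generation: draw each field uniformly from its (toy) domain.
    static long[] randomPacket(long[] fieldMax, Random rng) {
        long[] p = new long[fieldMax.length];
        for (int i = 0; i < p.length; i++) {
            p[i] = Math.floorMod(rng.nextLong(), fieldMax[i] + 1);
        }
        return p;
    }

    // Local constraint solving: satisfy each clause of the rule independently
    // by picking a random value inside its range, so the packet matches the rule.
    static long[] packetForRule(Rule rule, Random rng) {
        long[] p = new long[rule.predicate().size()];
        for (int i = 0; i < p.length; i++) {
            Clause c = rule.predicate().get(i);
            p[i] = c.low() + Math.floorMod(rng.nextLong(), c.high() - c.low() + 1);
        }
        return p;
    }

    public static void main(String[] args) {
        Random rng = new Random(1);
        long[] fieldMax = {255, 255, 65535, 255};   // toy field domains
        Rule r1 = new Rule(List.of(new Clause(0, 255), new Clause(0, 255),
                                   new Clause(80, 80), new Clause(6, 6)), "accept");
        System.out.println(java.util.Arrays.toString(randomPacket(fieldMax, rng)));
        System.out.println(java.util.Arrays.toString(packetForRule(r1, rng)));
    }
}
```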
Experiments (measuring coverage)
• Test 14 firewall policies
• Generate packets with our three proposed techniques
• Measure structural coverage
Experiments (measuring fault-detection capability)
• We also used reduced packet sets, maintaining the same level of structural coverage as the corresponding original packet sets
NCSU/NIST ACPT Architecture
• GUI: lets an administrator specify users, groups, attributes, roles, rules, policies, and resources
• Data acquisition: API/mechanism to consume/acquire external data related to policies (user, attribute, resource, role, etc.)
• Policy generator: uses AC model templates to generate enforceable XACML policies (.xml)
• Static verification: verifies access control policies against properties
• Dynamic verification: generates and evaluates test inputs (based on structural or combinatorial coverage), producing test inputs with their evaluated decisions
• http://www.nist.gov/itl/csd/set/acpt.cfm
Property Specification in ACPT
Static Verification
• Verifying the property against Policy A returns false, with a counterexample.
Static Verification (cont.)
• Verifying the property against Policy B returns true.
Conclusion
• Systematic policy verification
  • Property inference [POLICY 06, SSIRI 09, DBSec 10]
  • Property-quality assessment [ACASC 08]
  • Properties derived from access control models [POLICY 10DE]
• Systematic policy testing
  • Structural coverage criteria [ICICS 06]
  • Fault models/mutation testing [WWW 07]
  • Test generation [SESS 07]
  • Policy engine performance [SIGMETRICS 08, TC]
  • Policy engine correctness [TAV-WEB 08]
  • Firewall policy testing/fixing [SRDS 08/09, LISA 10]
• Targets: XACML policies, XACML engines, firewall policies
Questions?
https://sites.google.com/site/asergrp/projects/policy