A Type System for Expressive Security Policies
David Walker, Cornell University
Extensible Systems
[Figure: untrusted code is downloaded, linked, and executed behind a system interface]
• Extensible systems are everywhere: web browsers, extensible operating systems, servers and databases
• Critical problem: security
Certified Code
[Figure: untrusted code plus annotations is downloaded and verified, then linked and executed as secure code behind the system interface]
• Attach annotations (types, proofs, ...) to untrusted code
• Annotations make verification of security properties feasible
Certifying Compilation
• An advantage: increased trustworthiness
  • verification occurs after compilation, so compiler bugs will not result in security holes
• A disadvantage: certificates may be difficult to produce
Producing Certified Code
[Figure: a high-level program with user annotations is compiled into an annotated program, then optimized and transmitted]
• Certificate production must be automated
• Necessary components:
  1) a source-level programming language
  2) a compiler to compile, annotate, and optimize source programs
  3) a transmission language that certifies security properties
So Far ...
[Figure: a type-safe high-level program is compiled into a typed program, then optimized and transmitted]
1) a strongly typed source-level programming language
2) a type-preserving compiler to compile, annotate, and optimize source programs
3) a transmission language that certifies type-safety properties
Examples
• Proof-Carrying Code [Necula & Lee]: compilers produce type safety proofs
• Typed Assembly Language [Morrisett, Walker, et al.]: guarantees type safety properties
• Efficient Code Certification [Kozen]: uses typing information to guarantee control-flow and memory safety properties
• Proof-Carrying Code [Appel & Felty]: constructs types from low-level primitives
Conventional Type Safety
• Conventional types ensure basic safety:
  • basic operations performed correctly
  • abstraction/interfaces hide data representations and system code
• Conventional types don't describe complex policies
  • e.g., policies that depend upon history: the Melissa virus reads the Outlook contacts list and then sends 50 emails
Security in Practice
• Security via code instrumentation
  • insert security state and check dynamically
  • use static analysis to minimize run-time overhead
• SFI [Wahbe et al.], SASI [Erlingsson & Schneider], Naccio [Evans & Twyman], [Colcombet & Fradet], ...
This Paper
• Combines two ideas:
  • certifying compilation
  • security via code instrumentation
• The result: a system for secure certified code
  • high-level security policy specifications
  • an automatic translation into low-level code
  • security enforced by static & dynamic checking
Strategy
• Security automata specify security properties [Erlingsson & Schneider]
• Compilation inserts typing annotations & dynamic checks where necessary
• A dependently-typed target language provides a framework for verification
  • can express & enforce any security automaton policy
  • provably sound
Security Architecture
[Pipeline figure: a high-level program and a security automaton are compiled and annotated, against the system interface, into a secure typed program with a secure typed interface; the program is optimized, transmitted, and type checked, yielding a secure executable]
Security Automata
• A general mechanism for specifying security policies
• Enforce any safety property:
  • access control policies: "cannot access file foo"
  • resource bound policies: "allocate no more than 1M of memory"
  • the Melissa policy: "no network send after file read"
Example
[Automaton diagram: start --read(f)--> has read --send--> bad, with send looping on start and read(f) looping on has read]
• Policy: no send operation after a read operation
• States: start, has read, bad
• Inputs (program operations): send, read
• Transitions (state x input -> state):
  • start x read(f) -> has read
Example Cont'd
[Automaton diagram as on the previous slide]
• S.A.s monitor program execution
• Entering the bad state = a security violation

% untrusted program
% s.a.: start state
send();   % ok -> start
read(f);  % ok -> has read
send();   % bad, security violation
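To make the monitoring concrete, here is a minimal sketch (mine, not the paper's) of this security automaton as an explicit transition function. State and input names follow the slide; READ abstracts read(f) for any file f.

```python
# The no-send-after-read security automaton as a transition function.
# Entering BAD corresponds to a security violation.
START, HAS_READ, BAD = "start", "has read", "bad"
SEND, READ = "send", "read"

TRANSITIONS = {
    (START, SEND): START,        # ok -> start
    (START, READ): HAS_READ,     # ok -> has read
    (HAS_READ, READ): HAS_READ,
    (HAS_READ, SEND): BAD,       # send after read: violation
}

def step(state, op):
    # Missing transitions (including everything out of BAD) go to BAD:
    # once the policy is violated, it stays violated.
    return TRANSITIONS.get((state, op), BAD)

def monitor(trace):
    """Run the automaton over a trace of security-relevant operations."""
    state = START
    for op in trace:
        state = step(state, op)
    return state
```

Running `monitor([SEND, READ, SEND])` reproduces the slide's trace: the first send is fine, the read moves to "has read", and the second send enters the bad state.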
Enforcing S.A. Specs
• Every security-relevant operation op has an associated function check_op
  • trusted, provided by the policy writer
  • check_op implements the s.a. transition function

check_send(state) =
  if state = start then start else bad
Enforcing S.A. Specs
• Rewrite programs: send() becomes

let next_state = check_send(current_state) in
if next_state = bad
then halt
else % next state is ok
     send()
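The rewriting above can be sketched as a wrapper that consults check_send before performing the real operation. This is an illustrative Python rendering of the slide's let-expression, not the paper's code; halt is modeled as an exception, and the real send is passed in as a callback.

```python
START, HAS_READ, BAD = "start", "has read", "bad"

def check_send(state):
    # Trusted check function from the previous slide: it implements
    # the automaton's transition on a send operation.
    return START if state == START else BAD

class Halt(Exception):
    """Raised when a dynamic security check fails."""

def guarded_send(current_state, send):
    """Instrumented send: run the check first, halt on violation."""
    next_state = check_send(current_state)
    if next_state == BAD:
        raise Halt("security violation: send after read")
    send()                 # next state is ok; perform the real send
    return next_state      # thread the security state onward
```

Because the check runs before the operation, the real send is never reached from a state the policy forbids.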
Questions
• How do we verify instrumented code? Is this safe?

let next_state = check_send(other_state) in
if next_state = bad
then halt
else % next state is ok
     send()

• Can we optimize certified code?
Verification
• Basic types ensure standard type safety
  • functions and data used as intended and cannot be confused
  • security checks can't be circumvented
• Introduce a logic into the type system to express complex invariants
  • use the logic to encode the s.a. policy
  • use the logic to prove checks unnecessary
Target Language Types
• Predicates:
  • describe security states
  • describe automaton transitions
  • describe dependencies between values
• Function types include predicates so they can specify preconditions:
  • foo: ∀[α1, α2, P1(α1,α2), P2(α1)]. α1 -> α2
Secure Functions
• Each security-relevant function has a type specifying 3 additional preconditions
• e.g., the send function:
  P1: in_state(current_state)
  P2: transition_send(current_state, next_state)
  P3: next_state ≠ bad
  Pre: P1 & P2 & P3
  Post: in_state(next_state)
• The precondition ensures calling send won't result in a security violation
Run-time Security Checks
• Dynamic checks propagate information into the type system
• e.g., check_send(state)
  Post: ∃next_state. transition_send(state, next_state) & result = next_state
• conditional tests:

if state = bad
then % assume state = bad
     ...
else % assume state ≠ bad
     ...
Example

% P1: in_state(current_state)
let next_state = check_send(current_state) in
% P2: transition_send(current_state, next_state)
if next_state = bad
then halt
else % P3: next_state ≠ bad
     send()   % P1 & P2 & P3 imply send is ok
Optimization
[Automaton diagram as on the earlier example slide]
• Analysis of s.a. structure makes redundant check elimination possible
• e.g., supply the type checker with the fact transition_send(start, start) and verify:

if current = start then
  send(); send(); send(); …
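The check elimination can be shown operationally: because send maps start to start, one dynamic test of the current state licenses a whole run of unchecked sends. A small sketch under that assumption (illustrative names; the send callback stands in for the real operation):

```python
START, HAS_READ, BAD = "start", "has read", "bad"

def send_run(current, n, send):
    """Perform n sends guarded by a single dynamic check.

    Because transition_send(start, start) holds, every send in the run
    stays in the start state, so the per-send checks are redundant and
    can be proven unnecessary by the type checker.
    """
    if current == START:           # the one remaining dynamic check
        for _ in range(n):
            send()                 # unchecked: start --send--> start
        return START
    return current                 # not provably safe: skip the sends
```

The instrumented version would have run check_send before each of the n sends; here the automaton fact collapses them into one test.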
Related Work
• Program verification: abstract interpretation, data flow & control flow analysis, model checking, soft typing, verification condition generation & theorem proving, ...
• Dependent types in compiler ILs: Xi & Pfenning, Crary & Weirich, ...
• Security properties of typed languages: Leroy & Rouaix, ...
Summary
• A recipe for secure certified code:
  • types
    • ensure basic safety
    • prevent dynamic checks from being circumvented
    • provide a framework for reasoning about programs
  • security automata
    • specify expressive policies
    • dynamic checking when policies can't be proven statically