Symbolic and Computational Analysis of Network Protocol Security
John Mitchell, Stanford University
Asian 2006

Outline
• Protocols
  • Some examples, some intuition
• Symbolic analysis of protocol security
  • Models, results, tools
• Computational analysis
  • Communicating Turing machines, composability
• Combining symbolic and computational analysis
  • Some alternate approaches
  • Protocol Composition Logic (PCL)
  • Symbolic and computational semantics

Many Protocols
• Authentication
  • Kerberos
• Key exchange
  • SSL/TLS handshake, IKE, JFK, IKEv2, …
• Wireless and mobile computing
  • Mobile IP, WEP, 802.11i
• Electronic commerce
  • Contract signing, SET, electronic cash, …
See http://www.lsv.ens-cachan.fr/spore/, http://www.avispa-project.org/library

Mobile IPv6 Architecture
• Authentication is a requirement
• Early proposals were weak
[Figure: over IPv6, the Mobile Node (MN) reaches the Corresponding Node (CN) through its Home Agent (HA), or directly via a binding update]

802.11i Wireless Authentication
[Figure: supplicant state progression from UnAuth/UnAssoc (802.1X port blocked, no key), through 802.11 association, EAP/802.1X/RADIUS authentication (establishing the MSK), and the 4-way and group key handshakes (establishing PTK/GTK), to Auth/Assoc (802.1X port unblocked), followed by data communication]

IKE subprotocol from IPsec
m1.  A → B: A, g^a mod p
m2.  B → A: B, g^b mod p, sig_B(m1, m2)
m3.  A → B: sig_A(m1, m2)
Result: A and B share the secret g^ab mod p
Analysis involves probability, modular exponentiation, complexity, digital signatures, communication networks

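The message flow above can be sketched in a few lines of Python. The prime, generator, and the keyed-hash toy_sign helper below are illustrative stand-ins chosen only to make the flow concrete; they are not part of IKE and are not secure choices.

```python
# Minimal sketch of the signed Diffie-Hellman exchange above (toy parameters).
import hashlib
import secrets

P = 2**127 - 1   # a Mersenne prime; toy-sized, not a real IKE group
G = 5

def toy_sign(key: bytes, *msgs: bytes) -> bytes:
    """Toy 'signature' via keyed hash; a real protocol uses public-key signatures."""
    h = hashlib.sha256(key)
    for m in msgs:
        h.update(m)
    return h.digest()

def encode(n: int) -> bytes:
    return n.to_bytes((n.bit_length() + 7) // 8, "big")

# m1 (A -> B): A, g^a mod p
a = secrets.randbelow(P - 2) + 1
m1 = b"A" + encode(pow(G, a, P))

# m2 (B -> A): B, g^b mod p, together with sig_B(m1, m2)
b = secrets.randbelow(P - 2) + 1
m2 = b"B" + encode(pow(G, b, P))
sig_b = toy_sign(b"B-signing-key", m1, m2)

# m3 (A -> B): sig_A(m1, m2)  (A would first verify sig_b; check omitted here)
sig_a = toy_sign(b"A-signing-key", m1, m2)

# Both sides now compute the same shared secret g^(ab) mod p.
g_b = int.from_bytes(m2[1:], "big")
g_a = int.from_bytes(m1[1:], "big")
assert pow(g_b, a, P) == pow(g_a, b, P)
```

The symbolic and computational analyses discussed next differ precisely in whether such bitstring-level details are abstracted away or modeled.
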
Run of a protocol
[Figure: principals A, B, C, D initiate and respond in protocol sessions while an attacker controls the network]
Correct if no security violation in any run

Protocol analysis methods
• Cryptographic reductions
  • Bellare-Rogaway, Shoup, many others
  • UC [Canetti et al], simulatability [BPW]
  • Probabilistic poly-time process calculus [LMRST…]
• Symbolic methods (see also http://www.avispa-project.org/)
  • Model checking
    • FDR [Lowe, Roscoe, …], Murphi [M, Shmatikov, …]
  • Symbolic search
    • NRL protocol analyzer [Meadows]
  • Theorem proving
    • Isabelle [Paulson, …], specialized logics [BAN, …]

“The” Symbolic Model
• Messages are algebraic expressions
  • Nonce, Encrypt(K,M), Sign(K,M), …
• Adversary
  • Nondeterministic
  • Observe, store, direct all communication
  • Break messages into parts
  • Encrypt, decrypt, sign only if it has the key
    • Example: K1, Encrypt(K1, “hi”)  →  K1, Encrypt(K1, “hi”), “hi”
  • Send messages derivable from stored parts

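As a concrete illustration (not from the talk), the "derivable from stored parts" rule can be computed as a closure over symbolic terms. The tuple encoding of terms below is an assumption made just for this sketch, and only the decomposition rules are shown; building new pairs and encryptions is left out to keep the set finite.

```python
# Sketch of Dolev-Yao derivation: close the adversary's knowledge under
# splitting pairs and decrypting under known keys.
# Terms are atoms (strings) or tuples: ("pair", a, b) or ("enc", key, msg).

def derivable(knowledge):
    """Return the closure of `knowledge` under the symbolic decomposition rules."""
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            if isinstance(t, tuple) and t[0] == "pair":
                new = {t[1], t[2]}                       # split a pair
            elif isinstance(t, tuple) and t[0] == "enc" and t[1] in known:
                new = {t[2]}                             # decrypt with a known key
            else:
                new = set()
            if not new <= known:
                known |= new
                changed = True
    return known

# The slide's example: from K1 and Encrypt(K1, "hi"), the adversary gets "hi".
assert "hi" in derivable({"K1", ("enc", "K1", "hi")})
assert "hi" not in derivable({("enc", "K1", "hi")})      # no key, no plaintext
```
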
Many formulations
• Word problems [Dolev-Yao, Dolev-Even-Karp, …]
  • Each protocol step is a symbolic function from input message to output message; cancellation law d_k(e_k(x)) = x
• Rewrite systems [CDLMS, …]
  • Each protocol step is a symbolic function from state and input message to state and output message
• Logic programming [Meadows NRL Analyzer]
  • Each protocol step can be defined by logical clauses
  • Resolution used to perform reachability search
• Constraint solving [Amadio-Lugiez, …]
  • Write set constraints defining the messages known at step i
• Strand space model [MITRE]
  • Partial order (Lamport causality), reasoning methods
• Process calculus [CSP, spi-calculus, applied π-calculus, …]
  • Each protocol step is a process that reads and writes on a channel
  • Spi-calculus: ν (“new”) for fresh values, private channels, simulated crypto

Complexity results (see [Cortier et al.])
• Additional results for variants of the basic model (AC, XOR, modular exponentiation, …)

Many protocol case studies
• Murphi [Shmatikov, He, …]
  • SSL, contract signing, 802.11i, …
• Meadows NRL tool
  • Participation in IETF, IEEE standards
  • Many important examples
• Paulson inductive method; Scedrov et al
  • Kerberos, SSL, SET, many more
• Protocol logic
  • BAN logic and successors (GNY, SvO, …)
  • DDMP …
Automated tools based on the symbolic model detect important, nontrivial bugs in practical, deployed, and standardized protocols

Computational model I [Bellare-Rogaway, Shoup, …]
[Figure: the adversary, with input and work tapes, interacts with “Alice” and “Bob” through oracle tapes]

Computational model II [Canetti, …]
[Figure: the adversary is a Turing machine communicating with protocol participants that are themselves Turing machines]

Computational model III [Micciancio-Warinschi, …]
[Figure: the adversary interacts with protocol programs written in a process notation, e.g. In(c, x).Send(…) | In(d, y).new z. Send(… y z …) | In(c, encrypt(k, …)). …]

Computational security: encryption
• Several standard conditions on encryption
  • Passive adversary
    • Semantic security
  • Chosen ciphertext attacks (CCA1)
    • Adversary can ask for decryption before receiving a challenge ciphertext
  • Chosen ciphertext attacks (CCA2)
    • Adversary can ask for decryption before and after receiving a challenge ciphertext
• Computational model offers more choices than the symbolic model

Passive Adversary
• Attacker sends m0, m1 to the challenger
• Challenger returns E(m_i) for a hidden i
• Attacker guesses 0 or 1

Chosen ciphertext (CCA1)
• Before the challenge, the attacker may submit ciphertexts c and receive decryptions D(c)
• Attacker sends m0, m1; challenger returns E(m_i)
• Attacker guesses 0 or 1

Chosen ciphertext (CCA2)
• Before the challenge, the attacker may submit ciphertexts c and receive decryptions D(c)
• Attacker sends m0, m1; challenger returns E(m_i)
• After the challenge, the attacker may request D(c) for any c ≠ E(m_i)
• Attacker guesses 0 or 1

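To make the game format concrete, here is a small Python harness for the left-or-right version of the passive game. The harness and both toy "schemes" are illustrations invented here, not constructions from the talk; the leaky scheme exists only so the example adversary has something to win against.

```python
# Sketch of the left-or-right game from the "passive adversary" slide.
import secrets

def ind_game(keygen, encrypt, adversary, trials=2000):
    """Adversary picks (m0, m1) of equal length, sees E(k, m_b), guesses b."""
    wins = 0
    for _ in range(trials):
        k, b = keygen(), secrets.randbits(1)
        m0, m1 = adversary.choose()
        wins += adversary.guess(encrypt(k, (m0, m1)[b])) == b
    return wins / trials

def otp_encrypt(k, m):
    """One-time pad with a fresh 16-byte key: information-theoretically hiding."""
    return bytes(x ^ y for x, y in zip(k, m))

def leaky_encrypt(k, m):
    """Broken toy scheme: appends the plaintext's first byte in the clear."""
    return otp_encrypt(k, m) + m[:1]

class FirstByteAdversary:
    """Distinguishes by looking at the trailing byte, if the scheme leaks it."""
    def choose(self):
        return b"\x00" * 16, b"\xff" * 16
    def guess(self, c):
        return 1 if c[-1] == 0xff else 0

keygen = lambda: secrets.token_bytes(16)
adv = FirstByteAdversary()
print("one-time pad :", ind_game(keygen, otp_encrypt, adv))    # ~0.5
print("leaky scheme :", ind_game(keygen, leaky_encrypt, adv))  # ~1.0
```

The CCA1/CCA2 variants extend the same harness with a decryption oracle before (and, for CCA2, also after) the challenge.
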
Equivalence-based methods: UC, RSIM
[Figure, after a slide by R. Canetti: a real protocol execution (parties P1, …, P4 with attacker A) is compared with an ideal functionality F attached to a simulator S; the environment Z supplies inputs and observes outputs in both worlds]

Some relevant approaches
• Simulation framework
  • Backes, Pfitzmann, Waidner
• Correspondence theorems
  • Micciancio, Warinschi
• Kapron-Impagliazzo logics
• Abadi-Rogaway passive equivalence (□ marks a ciphertext the observer cannot decrypt)
  (K2, {01}K3), {({101}K2, K5)}K2, {{K6}K4}K5
  ≈ (K2, □), {({101}K2, K5)}K2, {□}K5
  ≈ (K1, □), {({101}K1, K5)}K1, {□}K5
  ≈ (K1, {K1}K7), {({101}K1, K5)}K1, {{K6}K7}K5
  • Proposed as the start of a larger plan for computational soundness
[Abadi-Rogaway00, …, Adao-Bana-Scedrov05]

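The pattern computation underlying this equivalence is easy to sketch. The code below is an illustration, not the algorithm from the paper: it collects the keys recoverable from an expression and replaces every ciphertext under an unrecoverable key with □ (the full equivalence additionally allows renaming of keys, which is not shown).

```python
# Sketch of Abadi-Rogaway-style patterns. Expressions are nested tuples:
# ("key", name), ("enc", key_name, body), ("bits", s), or ("pair", left, right).

def recoverable_keys(expr):
    """Keys obtainable from expr, decrypting only under already-obtained keys."""
    keys = set()
    while True:
        found = set(keys)
        def walk(e):
            if e[0] == "key":
                found.add(e[1])
            elif e[0] == "enc" and e[1] in keys:   # look inside only if key is known
                walk(e[2])
            elif e[0] == "pair":
                walk(e[1])
                walk(e[2])
        walk(expr)
        if found == keys:
            return keys
        keys = found

def pattern(expr, known):
    """Replace ciphertexts under unknown keys with the box symbol."""
    if expr[0] == "enc":
        return ("enc", expr[1], pattern(expr[2], known)) if expr[1] in known else "□"
    if expr[0] == "pair":
        return ("pair", pattern(expr[1], known), pattern(expr[2], known))
    return expr

# First expression from the slide: (K2, {01}_K3), {({101}_K2, K5)}_K2, {{K6}_K4}_K5
e = ("pair",
     ("pair",
      ("pair", ("key", "K2"), ("enc", "K3", ("bits", "01"))),
      ("enc", "K2", ("pair", ("enc", "K2", ("bits", "101")), ("key", "K5")))),
     ("enc", "K5", ("enc", "K4", ("key", "K6"))))
print(pattern(e, recoverable_keys(e)))   # matches the second line of the chain
```
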
Symbolic methods → computational results
• Pereira and Quisquater, CSFW 2001, 2004
  • Studied authenticated group Diffie-Hellman protocols
  • Found a symbolic attack on the Cliques SA-GDH.2 protocol
  • Proved that no protocol of a certain type is secure for > 3 participants
• Micciancio and Panjwani, EUROCRYPT 2004
  • Lower bound for a class of group key establishment protocols, using purely Dolev-Yao reasoning
  • Pseudo-random generators and encryption are modeled symbolically
  • The lower bound is tight; it matches a known protocol

Rest of talk: Protocol Composition Logic
• Alice’s information
  • Protocol
  • Private data
  • Sends and receives
[Figure: honest principals and an attacker; each principal runs the protocol with its private data, sending and receiving messages]
Logic has symbolic and computational semantics

Example
A → B: { A, Nonce_a }_Kb
B → A: { Nonce_a, … }_Ka
• Alice assumes that only Bob has Kb^-1
• Alice generated Nonce_a and knows that some X decrypted the first message
• Since only Bob knows Kb^-1, Alice knows X = Bob

More subtle example: Bob’s view
A → B: { A, Nonce_a }_Kb
B → A: { Nonce_a, B, Nonce_b }_Ka
A → B: { Nonce_b }_Kb
• Bob assumes that Alice follows the protocol
• Since Alice responds to the second message, Alice must have sent the first message

Execution model
• Protocol
  • A “program” for each protocol role
• Initial configuration
  • Set of principals and keys
  • Assignment of ≥ 1 role to each principal
• Run
  [Figure: a run interleaves instantiated role programs, e.g. A: new x; send {x}_B; …  B: receive {x}_B; decrypt; new z; send {z}_B; …  C: receive {z}_B; …, with a marked position in the run]

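One way to picture this (an illustration with invented action names, not the formal definition) is to treat each role as a list of symbolic actions and a run as one interleaving of the instantiated roles, with formulas later evaluated at a position in such a run:

```python
# Sketch: roles as symbolic action sequences; a run is one interleaving of them.
init_role = ["new x", "send {x}_B", "receive {y}_A"]
resp_role = ["receive {x}_B", "decrypt {x}_B", "new y", "send {y}_A"]

def one_run(schedule, roles):
    """Interleave the role programs according to `schedule` (a list of role names)."""
    programs = {name: iter(actions) for name, actions in roles.items()}
    return [(name, next(programs[name])) for name in schedule]

run = one_run(["A", "A", "B", "B", "B", "B", "A"], {"A": init_role, "B": resp_role})
for position, (principal, action) in enumerate(run, start=1):
    print(position, principal, action)
```
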
Formulas true at a position in a run
• Action formulas
  a ::= Send(P, m) | Receive(P, m) | New(P, t) | Decrypt(P, t) | Verify(P, t)
• Formulas
  φ ::= a | Has(P, t) | Fresh(P, t) | Honest(N) | Contains(t1, t2) | ¬φ | φ1 ∧ φ2 | ∃x. φ | ◇φ | ○φ
• Example
  a < b  ≜  ◇(b ∧ ◇a)
Notation in papers varies slightly …

Modal Formulas
• After actions, condition:  [ actions ]_P φ    where P = (principal, role id)
• Before/after assertions:  φ [ actions ]_P ψ
• Composition rule:  from φ [ S ]_P ψ and ψ [ T ]_P θ, conclude φ [ ST ]_P θ
Logic formulated in [DMP, DDMP]
Related to: BAN, Floyd-Hoare logic, CSP/CCS, temporal logic, NPATRL

Example: Bob’s view of NSL
• Bob knows he’s talking to Alice:
  [ receive encrypt( Key(B), A, m );
    new n;
    send encrypt( Key(A), m, B, n );
    receive encrypt( Key(B), n ) ]_B
      Honest(A) ⊃ Csent(A, msg1) ∧ Csent(A, msg3)
  where Csent(A, …) ≜ Created(A, …) ∧ Sent(A, …)

Proof System
• Sample axioms
  • Reasoning about possession:
    • [ receive m ]_A Has(A, m)
    • Has(A, {m, n}) ⊃ Has(A, m) ∧ Has(A, n)
  • Reasoning about crypto primitives:
    • Honest(X) ∧ Decrypt(Y, enc(X, {m})) ⊃ X = Y
    • Honest(X) ∧ Verify(Y, sig(X, {m})) ⊃ ∃m’. (Send(X, m’) ∧ Contains(m’, sig(X, {m})))
• Soundness theorem:
  • Every provable formula is valid in the symbolic model

Composition example, part 1: Diffie-Hellman
A → B: g^a
B → A: g^b
• Shared secret (with someone): A deduces Knows(Y, g^ab) ⊃ (Y = A) ∨ Knows(Y, b)
• Authenticated: no

Composition example, part 2: Challenge-Response
A → B: m, A
B → A: n, sig_B {m, n, A}
A → B: sig_A {m, n, B}
• Authenticated: A deduces Received(B, msg1) ∧ Sent(B, msg2)
• Shared secret: no

Composition, part 3: ISO-9798-3
Substitute m := g^a and n := g^b in the challenge-response messages:
A → B: g^a, A
B → A: g^b, sig_B {g^a, g^b, A}
A → B: sig_A {g^a, g^b, B}
• Shared secret: g^ab
• Authenticated

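The composition step is essentially a substitution of the Diffie-Hellman exponentials for the fresh values of the challenge-response template. A tiny sketch, with message formats invented for illustration:

```python
# Sketch: ISO-9798-3 as the challenge-response template with m := g^a, n := g^b.
challenge_response = [
    "A -> B: {m}, A",
    "B -> A: {n}, sig_B({m}, {n}, A)",
    "A -> B: sig_A({m}, {n}, B)",
]

def compose_with_dh(template):
    """Instantiate the challenge-response template with the DH exponentials."""
    return [step.format(m="g^a", n="g^b") for step in template]

for step in compose_with_dh(challenge_response):
    print(step)
# A -> B: g^a, A
# B -> A: g^b, sig_B(g^a, g^b, A)
# A -> B: sig_A(g^a, g^b, B)
```
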
Additional issues
• Reasoning about honest principals
  • Invariance rule, called the “honesty rule”
• Preserving invariants under composition
  • If we prove Honest(X) ⊃ … for protocol 1 and compose with protocol 2, is the formula still true?

More about composing protocols
• Invariants: Γ_DH contains Honest(X) ⊃ …, Γ_CR contains Honest(X) ⊃ …
• Γ_CR ⊢ Authentication, Γ_DH ⊢ Secrecy
• Γ_DH ∪ Γ_CR ⊢ Secrecy, Γ_DH ∪ Γ_CR ⊢ Authentication
• Γ_DH ∪ Γ_CR ⊢ Secrecy ∧ Authentication  [additive combination]
• DH ∘ CR’ respects Γ_DH ∪ Γ_CR  [nondestructive combination]
• Result: ISO-9798-3 satisfies Secrecy ∧ Authentication

PCL → Computational PCL
• PCL: syntax and proof system, with semantics in the symbolic model
• Computational PCL: syntax (±), proof system (±), with semantics in the complexity-theoretic model

Some general issues
• Computational PCL
  • Symbolic logic for proving security properties of network protocols that use public-key encryption
• Soundness theorem
  • If a property is provable in CPCL, then the property holds in the computational model with overwhelming asymptotic probability
• Benefits
  • Retain compositionality
  • Symbolic proofs about the computational model
  • Computational reasoning appears in the soundness proof (only!)
  • Different axioms rely on different crypto assumptions
    • symbolic → computational soundness generally uses strong crypto assumptions

PCL → Computational PCL
• Syntax and proof rules are mostly the same
  • Retain the compositional approach
  • But there are some issues with propositional connectives…
• Significant differences
  • Symbolic “knowledge”
    • Has(X, t): X can produce t from observed messages by a symbolic algorithm
  • Computational “knowledge”
    • Possess(X, t): X can produce t by a ppt algorithm
    • Indist(X, t): t cannot be distinguished from a random value in ppt, given X’s view
  • More subtle system
    • Some axioms rely on CCA2 security, some are information-theoretically sound, etc.

Computational Traces
• A computational trace contains
  • Symbolic actions of honest parties
  • A mapping of symbolic variables to bitstrings
  • Send-receive actions (only) of the adversary
• Runs of the protocol
  • Set of all possible traces
  • Each tagged with the random bits used to generate the trace
  • Tagging yields a set of equi-probable traces

Complexity-theoretic semantics
• Given protocol Q, adversary A, and security parameter n, define
  • T = T(Q, A, n), the set of all possible traces
  • [[φ]](T), the subset of T that respects φ in a specific way
• Intuition: φ is valid when [[φ]](T) is an asymptotically overwhelming subset of T

Semantics of trace properties
• Defined in a straightforward way
  • [[Send(X, m)]](T) = all traces t such that
    • t contains a Send(msg) action by X, and
    • the bitstring value of msg is the bitstring value of m

Inductive Semantics
• [[φ1 ∧ φ2]](T) = [[φ1]](T) ∩ [[φ2]](T)
• [[φ1 ∨ φ2]](T) = [[φ1]](T) ∪ [[φ2]](T)
• [[¬φ]](T) = T − [[φ]](T)
• Implication uses a form of conditional probability
  • [[φ1 ⇒ φ2]](T) = [[¬φ1]](T) ∪ [[φ2]](T’) where T’ = [[φ1]](T)

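A toy evaluator (the trace and formula encodings below are invented for this sketch) makes the conditional reading of implication concrete: the consequent is evaluated only on the traces that satisfy the antecedent.

```python
# Toy evaluator for the inductive semantics above. Traces are tuples of events
# and atomic formulas are predicates on traces.

def evaluate(phi, traces):
    """Return [[phi]](traces) as a subset of `traces`."""
    op = phi[0]
    if op == "atom":
        return {t for t in traces if phi[1](t)}
    if op == "and":
        return evaluate(phi[1], traces) & evaluate(phi[2], traces)
    if op == "or":
        return evaluate(phi[1], traces) | evaluate(phi[2], traces)
    if op == "not":
        return traces - evaluate(phi[1], traces)
    if op == "implies":
        antecedent = evaluate(phi[1], traces)
        # conditional reading: consequent judged only on traces satisfying phi1
        return (traces - antecedent) | evaluate(phi[2], antecedent)
    raise ValueError(f"unknown connective {op!r}")

T = {("send A m1", "recv B m1"), ("send A m1",), ("recv B m1",)}
sent = ("atom", lambda t: "send A m1" in t)
received = ("atom", lambda t: "recv B m1" in t)
phi = ("implies", sent, received)
print(len(evaluate(phi, T)) / len(T))   # fraction of traces on which phi "holds"
```

The final ratio is the fraction of traces on which the formula holds, which is exactly the quantity bounded in the validity definition below.
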
Semantics of Indistinguishable
• Not a trace property
• Intuition: Indist(X, m) holds if no algorithm can distinguish m from a random value, given X’s view of the run
  [Figure: the protocol and attacker produce a value m and View(X); a bit b selects either m or a random value, which is handed to a distinguisher D that outputs a guess b’]
• [[Indist(X, m)]](T, D, ε) = T if | #{t : b = b’} − |T|/2 | < ε

Validity of a formula
Q ⊨ φ if ∀ adversary A, ∀ distinguisher D, ∃ negligible function f, ∃ n0 s.t. ∀ n > n0:
  |[[φ]](T, D, f(n))| / |T| > 1 − f(n)
i.e., the fraction of traces where “φ is true” is overwhelming
• Fix protocol Q and a PPT adversary A
• Choose a value of the security parameter n
• Vary the random bits used by all programs
• Obtain the set T = T(Q, A, n) of equi-probable traces
[Figure: [[φ]](T, D, f) pictured as an overwhelming subset of T(Q, A, n)]

Advantages of Computational PCL
• High-level reasoning, sound for “real crypto”
  • Prove properties of protocols without explicit reasoning about probability or asymptotic complexity
• Composability
  • PCL is designed for protocol composition
  • Composition of individual steps, not just the coarser composition available with UC/RSIM
• Can identify the crypto assumptions needed
  • ISO-9798-3 [DDMW2006]
  • Kerberos V5 [unpublished]
Thesis: existing deployed protocols have weak security properties, assuming weak security properties of the primitives they use; UC/RSIM may be too strong

CPCL analysis of Kerberos V5
• Kerberos has a staged architecture
  • The first stage generates a nonce and sends it encrypted
  • The second stage uses this nonce as a key to encrypt another nonce
  • The third stage uses the second-stage nonce to encrypt other messages
• Secrecy
  • The logic proves the “GoodKey” property of both nonces
• Authentication
  • Proved assuming encryption provides ciphertext integrity
• Modular proofs using the composition theorems

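The staged structure can be pictured as a chain of encryptions in which each stage's nonce keys the next stage. The toy_encrypt helper below is an insecure stand-in invented for this sketch and is not Kerberos's actual cryptography.

```python
# Sketch of the staged key structure described above (toy encryption only).
import hashlib
import secrets

def toy_encrypt(key: bytes, msg: bytes) -> bytes:
    """Toy 'encryption': XOR with a hash-derived keystream (not secure)."""
    stream = hashlib.sha256(key).digest()
    return bytes(m ^ s for m, s in zip(msg, stream))

long_term_key = secrets.token_bytes(16)

# Stage 1: generate a nonce and send it encrypted under the long-term key.
nonce1 = secrets.token_bytes(16)
msg1 = toy_encrypt(long_term_key, nonce1)

# Stage 2: use nonce1 as a key to encrypt a second nonce.
nonce2 = secrets.token_bytes(16)
msg2 = toy_encrypt(nonce1, nonce2)

# Stage 3: use nonce2 to encrypt further messages.
msg3 = toy_encrypt(nonce2, b"application data")

# A holder of the long-term key can unwind the chain (XOR is its own inverse).
assert toy_encrypt(long_term_key, msg1) == nonce1
```
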
Challenges for computational reasoning
• More complicated adversary
  • Actions of a computational adversary do not have a simple inductive characterization
• More complicated messages
  • Computational messages are arbitrary sequences of bits, without an inductively defined syntactic structure
• Different scheduler
  • Simpler “non-preemptive” scheduling is typically used in computational models (change the symbolic model for equivalence)
• Power of induction?
  • Indistinguishability and other non-trace-based properties appear unsuitable as inductive hypotheses
  • Solution: prove a trace property inductively and derive secrecy

Current and Future Work
• Investigate the nature of the propositional fragment
  • Non-classical implication related to conditional probability
  • Complexity-theoretic reductions
  • Connections with probabilistic logics (e.g. Nilsson86)
• Generalize reasoning about secrecy
  • Work in progress, thanks to Arnab
  • Need to incorporate the insight of “Rackoff’s attack”
• Extend the logic
  • More primitives: signatures, hash functions, …
• Complete case studies
  • Produce correctness proofs for all widely deployed standards
• Collaborate on
  • Foundational work – please join us!
  • Implementation and case studies – please help us!