
2. Noninterference 4 Sept 2002


Presentation Transcript


  1. 2. Noninterference 4 Sept 2002

  2. Information-Flow
  • Security properties based on information flow describe end-to-end behavior of a system
  • Access control: “This file is readable only by processes I have granted authority to.”
  • Information-flow control: “The information in this file may be released only to appropriate output channels, no matter how much intervening computation manipulates it.”

  3. Noninterference
  • [Goguen &amp; Meseguer 1982, 1984]
  • Low output of a system is unaffected by high input
  [Diagram: two runs with different high inputs H1, H2 produce the same low output L]

  4. Security properties
  • Confidentiality: is information secret? L = public, H = confidential
  • Integrity: is information trustworthy? L = trusted, H = untrusted
  • Partial order: L ⊑ H, H ⋢ L; information can only flow upward in the order
  • Channels: ways for inputs to influence outputs
  [Diagram: inputs L, H1 connected by channels to outputs L, H1]

  5. Formalization
  • No agreement on how to formalize in general
  • GM84 (simplified): system is defined by a transition function do : S×E → S and a low output function out : S → O (what the low user can see)
  • S is the set of system states
  • E is the set of events (inputs): either high or low
  • Trace t is a sequence of state-event pairs ((s0,e0), (s1,e1), …) where si+1 = do(si, ei)
  • Noninterference: for all event histories (e0, …, en) that differ only in high events, out(sn) is the same, where sn is the final state of the corresponding trace
  • Alternatively: out(sn) equals the result obtained from a purged event history
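The GM84 formalization above can be sketched directly in code. This is an illustrative transcription, not an official artifact: `run`, `purge`, and `noninterferent` are names chosen here for the transition closure, the high-event eraser, and the purge-based noninterference check.

```python
# Sketch of the (simplified) GM84 model: do : S x E -> S, out : S -> O.
from typing import Callable, Iterable

def run(do: Callable, s0, events: Iterable):
    """Fold the transition function do over an event history."""
    s = s0
    for e in events:
        s = do(s, e)
    return s

def purge(events, is_high):
    """Erase high events from an event history."""
    return [e for e in events if not is_high(e)]

def noninterferent(do, out, s0, histories, is_high):
    """Purge-based definition: the low output of the final state must equal
    the low output obtained by running the purged history."""
    return all(
        out(run(do, s0, h)) == out(run(do, s0, purge(h, is_high)))
        for h in histories
    )
```

A system whose low output counts only low events satisfies the check; one whose low output depends on high events fails it.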

  6. Example
  [Diagram: state machine with states 2 and 3; in state 2 the high events h1, h2 loop and l moves to state 3; in state 3 the events l and hx loop]
  • Visible output from input sequences (l), (h1,l), (h2,l) is 3
  • Visible output from input sequences (), (h1), (h2) is 2
  • Low part of input determines visible results
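The two-state example can be replayed concretely. The reconstruction of the transition function below is an assumption inferred from the slide's bullets (high events loop, the low event l moves the visible state from 2 to 3):

```python
def run(do, s0, events):
    # fold the transition function over an input sequence
    s = s0
    for e in events:
        s = do(s, e)
    return s

def do(state, event):
    # the low event l drives the system from state 2 to state 3;
    # high events (h1, h2, hx, ...) leave the state unchanged
    return 3 if event == 'l' else state

def out(state):
    # the low user sees the state number directly
    return state
```

Running the slide's six input sequences shows that the visible output depends only on the low part of the input, as claimed.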

  7. Limitations
  • Doesn’t deal with all transition functions
    • partial (e.g., nontermination)
    • nondeterministic (e.g., concurrency)
  • sequential input, output assumption

  8. A generalization
  • Key idea: behaviors of the system C should not reveal more information than the low inputs
  • Consider applying C to inputs s
  • Define: C⟦s⟧ is the result of C applied to s (“do”)
  • s1 =L s2 means inputs s1 and s2 are indistinguishable to the low user (same “purge”)
  • C⟦s1⟧ ≈L C⟦s2⟧ means results are indistinguishable; ≈L is the low view relation (same “out”)
  • Noninterference for C: s1 =L s2 ⟹ C⟦s1⟧ ≈L C⟦s2⟧
  • “Low observer doesn’t learn anything new”

  9. Unwinding condition
  • Induction hypothesis for proving noninterference
  • Assume C, ≈L defined using traces
  [Diagram: from =L-equivalent states s1, s1′, a high step h leads to a state still =L-equivalent to s1, and matching low steps l lead to =L-equivalent successors s2, s2′]
  • By induction: traces differing only in high steps, starting from equivalent states, preserve equivalence
  • =L must be an equivalence relation (need transitivity)
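For a finite system the unwinding condition can be checked exhaustively. The checker below is a sketch under assumed names: `step(s, e)` is the transition function, `low(s)` the low view of a state, and two states are =L-equivalent when their low views agree. It tests the two usual proof obligations: high steps are invisible to the low observer, and low steps preserve low equivalence.

```python
def unwinding_holds(states, step, events, is_high, low):
    # Obligation 1 (local respect): a high step does not change the low view.
    for s in states:
        for e in events:
            if is_high(e) and low(step(s, e)) != low(s):
                return False
    # Obligation 2 (step consistency): low steps map =L-equivalent states
    # to =L-equivalent states.
    for s in states:
        for t in states:
            if low(s) == low(t):
                for e in events:
                    if not is_high(e) and low(step(s, e)) != low(step(t, e)):
                        return False
    return True
```

A system whose low step copies a high value into the low part fails obligation 2, which is exactly the induction step that breaks.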

  10. Example
  • “System” is a program with a memory
    if h1 then h2 := 0 else h2 := 1; l := 1
  • State s = ⟨c, m⟩: a command and a memory
  • ⟨c1, m1⟩ =L ⟨c2, m2⟩ if identical after:
    • erasing high terms from ci
    • erasing high memory locations from mi
  • Choice of =L controls what the low observer can see at a moment in time
  • Current command c is included in the state to allow proof by induction

  11. Example
  ⟨if h1 then h2 := 0 else h2 := 1; l := 1, {h1↦0, h2↦1, l↦0}⟩ =L ⟨if h1 then h2 := 0 else h2 := 1; l := 1, {h1↦1, h2↦1, l↦0}⟩
  → ⟨h2 := 1; l := 1, {h1↦0, h2↦1, l↦0}⟩ =L ⟨h2 := 0; l := 1, {h1↦1, h2↦1, l↦0}⟩
  → ⟨l := 1, {h1↦0, h2↦1, l↦0}⟩ =L ⟨l := 1, {h1↦1, h2↦0, l↦0}⟩
  → {h1↦0, h2↦1, l↦1} =L {h1↦1, h2↦0, l↦1}
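The two runs above can be replayed with a few lines of code. This is a sketch: memories are dictionaries, `run_example` hard-codes the slide's program, and `low` erases the high locations h1 and h2, matching the definition of =L on the previous slide.

```python
def run_example(m):
    """Execute: if h1 then h2 := 0 else h2 := 1; l := 1 on memory m."""
    m = dict(m)
    m['h2'] = 0 if m['h1'] else 1   # branch on the high variable h1
    m['l'] = 1                      # low assignment, independent of h1
    return m

def low(m):
    # the low view: erase the high locations h1 and h2
    return {'l': m['l']}
```

Starting from the two =L-equivalent memories of the slide, the final memories differ in h2 but agree on l, so the runs are indistinguishable to the low observer.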

  12. Nontermination
  • Is this program secure?
    while h > 0 do h := h+1; l := 1
  • ⟨…, {h↦0, l↦0}⟩ →* {h↦0, l↦1}
  • ⟨…, {h↦1, l↦0}⟩ →* ⟨…, {h↦i, l↦0}⟩ (i > 0)
  • Low observer learns the value of h by observing nontermination, change to l
  • But… might want to ignore this channel to make analysis feasible
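The termination channel can be demonstrated with a step-bounded interpreter, a sketch in which divergence is made observable as a `None` result (the `fuel` budget is an artifact of the sketch, not of the slide's semantics):

```python
def run_bounded(h, fuel=1000):
    """Run `while h > 0 do h := h+1; l := 1` for at most `fuel` loop steps."""
    l = 0
    while h > 0:
        h = h + 1
        fuel -= 1
        if fuel == 0:
            return None          # stands in for nontermination
    l = 1
    return {'h': h, 'l': l}
```

With h = 0 the program terminates and sets l to 1; with any h > 0 it diverges, so the low observer distinguishes the two cases exactly as the slide describes.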

  13. Equivalence classes
  • Equivalence relation =L generates equivalence classes of states indistinguishable to the attacker: [s]L = { s′ | s′ =L s }
  • Noninterference ⟺ transitions act uniformly on each equivalence class
  • Given trace t = (s1, s2, …), the low observer sees at most ([s1]L, [s2]L, …)
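Both bullets can be made concrete for a finite state set. In this sketch, states are low-equivalent when a `low` view function agrees on them, so each equivalence class is the set of states sharing a low view; `transitions_uniform` then checks the "act uniformly" characterization for a deterministic step function (both function names are chosen here):

```python
from collections import defaultdict

def low_classes(states, low):
    """Partition states into =L-classes, keyed by their common low view."""
    classes = defaultdict(set)
    for s in states:
        classes[low(s)].add(s)
    return classes

def transitions_uniform(states, step, low):
    """Noninterference <=> step maps each =L-class into a single =L-class."""
    for s in states:
        for t in states:
            if low(s) == low(t) and low(step(s)) != low(step(t)):
                return False
    return True
```

A step function that copies high data into the low part sends members of one class into different classes, and the check fails.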

  14. Low views
  • Low view relation ≈L on traces modulo =L determines ability of attacker to observe system execution
  • Termination-sensitive but no ability to see intermediate states: (s1, s2, …, sm) ≈L (s1′, s2′, …, sn′) if sm =L sn′; all infinite traces are related by ≈L
  • Termination-insensitive: (s1, s2, …, sm) ≈L (s1′, s2′, …, sn′) if sm =L sn′; infinite traces are related by ≈L to all traces
  • Timing-sensitive: (s1, s2, …, sn) ≈L (s1′, s2′, …, sn′) if sn =L sn′ (same length n); all infinite traces are related by ≈L
  • Not always an equivalence relation!
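The three low-view relations differ only in how they treat divergence and trace length. The sketch below encodes a finite trace as a list of states and uses `None` to stand in for an infinite trace (an encoding chosen here, not from the slide); `low` gives the low view of a state.

```python
def term_sensitive(t1, t2, low):
    # infinite traces are related only to each other
    if t1 is None or t2 is None:
        return t1 is None and t2 is None
    return low(t1[-1]) == low(t2[-1])

def term_insensitive(t1, t2, low):
    # an infinite trace is related to every trace
    if t1 is None or t2 is None:
        return True
    return low(t1[-1]) == low(t2[-1])

def timing_sensitive(t1, t2, low):
    # finite traces must additionally have the same length
    if t1 is None or t2 is None:
        return t1 is None and t2 is None
    return len(t1) == len(t2) and low(t1[-1]) == low(t2[-1])
```

Note that `term_insensitive` illustrates the final bullet: it relates a diverging trace to two terminating traces with different low results, so it is not transitive and hence not an equivalence relation.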

  15. Nondeterminism
  • Two sources of nondeterminism:
    • Input nondeterminism
    • Internal nondeterminism
  • GM assume no internal nondeterminism
  • Concurrent systems are nondeterministic
  [Diagram: the parallel composition s1 | s2 can step by either component, s1 or s2]
  • Noninterference for nondeterministic systems? ∀s1, s2. s1 =L s2 ⟹ C⟦s1⟧ ≈L C⟦s2⟧

  16. Possibilistic security
  • [Sutherland 1986, McCullough 1987]
  • Result of a system C⟦s⟧ is the set of possible outcomes (traces)
  • Low view relation on traces is lifted to sets of traces: C⟦s1⟧ ≈L C⟦s2⟧ if ∀t1 ∈ C⟦s1⟧. ∃t2 ∈ C⟦s2⟧. t1 ≈L t2 and ∀t2 ∈ C⟦s2⟧. ∃t1 ∈ C⟦s1⟧. t2 ≈L t1
  • “For any trace produced by C⟦s1⟧ there is an indistinguishable one produced by C⟦s2⟧ (and vice versa)”
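The lifting of ≈L from traces to sets of traces is a direct mutual-simulation check and transcribes almost verbatim into code. `sets_low_equiv` is a name chosen for this sketch; `trace_equiv` is whatever low-view relation on traces is in play.

```python
def sets_low_equiv(T1, T2, trace_equiv):
    """Possibilistic lifting: every trace in T1 has an indistinguishable
    counterpart in T2, and vice versa."""
    return (all(any(trace_equiv(t1, t2) for t2 in T2) for t1 in T1)
        and all(any(trace_equiv(t2, t1) for t1 in T1) for t2 in T2))
```

With a termination-insensitive trace relation that compares the low part of final states, two outcome sets are related exactly when they offer the same set of possible low views.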

  17. Proving possibilistic security
  • Almost the same induction hypothesis:
  [Diagram: from =L-equivalent states s1, s1′, each low or high step can be matched by some step leading to =L-equivalent successors s2, s2′]
  • Show that there is a transition that preserves state equivalence (for termination-insensitive security)

  18. Example
    l := true | l := false | l := h
  • h = true: possible results are {h↦true, l↦false}, {h↦true, l↦true}
  • h = false: possible results are {h↦false, l↦false}, {h↦false, l↦true}
  • The two result sets are related by ≈L, so the program is possibilistically secure
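The possible outcomes can be enumerated mechanically. This sketch models the three parallel assignments as atomic writes to l and explores every interleaving; since each assignment writes l exactly once, the final value of l is whatever the last-scheduled thread wrote.

```python
from itertools import permutations

def outcomes(h):
    """All possible final values of l for `l := true | l := false | l := h`."""
    writes = {'a': True, 'b': False, 'c': h}   # the three parallel assignments
    return {writes[order[-1]] for order in permutations('abc')}
```

For both h = True and h = False the set of possible low results is {True, False}, so every low view possible under one high input is possible under the other: the program is possibilistically secure, matching the slide.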

  19. What is wrong?
    l := true | l := false | l := h
  • Round-robin scheduler: program equivalent to l := h
  • Random scheduler: h is the most probable value of l
  • System has a refinement with an information leak
  [Diagram: the nondeterministic choice among l := true, l := false, l := h refined to l := h]

  20. Refinement attacks
  • Implementations of an abstraction generally refine (at least probabilistically) the transitions allowed by the abstraction
  • Attacker may exploit knowledge of the implementation to learn confidential info.
    l := true | l := false
  • Is this program secure?
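The round-robin refinement mentioned on the previous slide makes the attack concrete. In this sketch the scheduler runs the three threads of `l := true | l := false | l := h` in a fixed textual order, so the write of h always lands last and the whole program collapses to l := h:

```python
def round_robin(h):
    """Fixed-order schedule of `l := true | l := false | l := h`."""
    l = None
    for thread in (lambda: True, lambda: False, lambda: h):
        l = thread()          # each thread performs its single write to l
    return l
```

The refinement removes exactly the nondeterminism that made the program possibilistically secure: under this scheduler the low result equals h, a direct information leak.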

  21. Determinism-based security
  • Require that the system is deterministic from the low viewpoint [Roscoe95]
  • High information cannot affect low output – no nondeterminism to refine
  • Another way to generalize noninterference to nondeterministic systems: don’t change the definition! ∀s1, s2. s1 =L s2 ⟹ C⟦s1⟧ ≈L C⟦s2⟧
  • Nondeterminism may be present, but not observable
  • More restrictive than possibilistic security
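Low-view determinism has a very small operational reading: across all possible outcomes of a run, the low observer must see a single view. The check below is a sketch (the names `low_deterministic` and `low` are chosen here); it requires the low view to be unique, whereas possibilistic security only requires the sets of low views to match across high inputs.

```python
def low_deterministic(outcomes, low):
    """True when every possible outcome presents the same low view."""
    views = {low(o) for o in outcomes}
    return len(views) <= 1
```

The slide-18 program fails this check (its outcomes show both l = true and l = false), even though it is possibilistically secure, illustrating why determinism-based security is the more restrictive condition.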
