Foundational Certified Code in a Metalogical Framework Karl Crary and Susmit Sarkar Carnegie Mellon University
Motivation: Grid Computing • Make use of idle computing cycles over the network [e.g. SETI] • Computer owners download and execute code from developers • A key issue: Unknown developers, so consumers are concerned about safety
Certified Code • Package the code with a certificate [PCC, TAL] • Certificate: a machine-verifiable proof of safety • Typically, a proof that the code is well-typed in a safe type system • [Diagram: the developer ships Code plus Certificate, claiming "Code is safe!"; the consumer, asking "Is code safe?", checks the certificate against its own knowledge]
Type System? Is that safe? • Old answer: Fix a type system, trust peer review • New answer: Give developers the flexibility to use their own type systems • Need to check this is safe • Known as Foundational Certified Code • [Diagram: the developer now also ships its Type System; the consumer checks the certificate against that type system and the machine details]
Roadmap • Our system • Metalogics • Safety Policy • A Safety Proof • Related and future work
Our System • [Diagram: the developer ships Code, a Safety Condition, and a Safety Proof as the Certificate, claiming "Code satisfies my Safety Condition; I can prove it to you!"; the consumer asks "Does the Code satisfy the Safety Policy?" and "Why is your safety condition any good?"]
Metalogic: metatheorems • We use LF to express logics • e.g., operational semantics • producer’s safety conditions • We care about metatheorems: • If some input derivation exists, then an output derivation exists • e.g., the Safety Theorem
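As a flavor of the LF style, here is a minimal sketch of an operational semantics encoded as a type family over machine states. The machine and all names below are invented for illustration (a toy countdown machine), not the actual formalization:

  %% Natural numbers.
  nat : type.
  z   : nat.
  s   : nat -> nat.

  %% Toy machine states: either halted, or running with a counter.
  state : type.
  halt  : state.
  run   : nat -> state.

  %% Single-step transition judgment, encoded as an LF type family.
  step     : state -> state -> type.
  step/dec : step (run (s N)) (run N).
  step/fin : step (run z) halt.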
How to check metatheorems? • Choice 1: reflect metalogical reasoning in the framework • Choice 2: use a logic designed for metalogical reasoning • e.g., Twelf [Schürmann]
Programming in Metalogics • We write logic programs relating derivations • limited to ∀∃ reasoning ("if an input derivation exists, an output derivation exists"); the authors plan a stronger system • Need to do induction on the structure of derivations • The system can check that these logic programs are total (user annotations required)
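Continuing the toy sketch above, a ∀∃ metatheorem ("every running state can take a step") becomes a logic program relating derivations; Twelf's %mode, %worlds, and %total annotations ask it to check that the program covers all inputs and terminates, i.e. that it really is a proof. Again, purely illustrative:

  %% Metatheorem: for every N there are an S and a derivation of
  %% step (run N) S.  The clauses are the proof, by cases on N.
  prog   : {N:nat} {S:state} step (run N) S -> type.
  %mode prog +N -S -D.

  prog/z : prog z halt step/fin.
  prog/s : prog (s N) (run N) step/dec.

  %worlds () (prog _ _ _).
  %total N (prog N _ _).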
Roadmap • Our system • Metalogics • Safety policy • A safety proof • Related and future work
Safety policy - Preliminaries • Formalize operational semantics of the IA32 architecture • Formalize machine states: memory, register files, stack, instruction pointer • Formalize transitions from state to state • Remove transitions deemed unsafe
Example: transition for addition • addl $5,(%eax) • load 4 bytes from (%eax), • load immediate operand 5, • add them, • store result back in (%eax), • update EFLAGS and advance EIP • This can go wrong, e.g. if eax points to protected memory • Solution: The formal load and store relations do not apply in such cases
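A sketch of how unsafe cases are excluded: the load and store relations of the formal semantics are simply partial, so a transition whose premises include a forbidden access has no derivation at all. All names below are invented for illustration (and the toy memory ignores shadowing of updated addresses); the real IA32 formalization is far more detailed:

  %% Abstract types for addresses, words, and memories.
  addr : type.
  word : type.
  mem  : type.

  %% A toy memory as a finite association of addresses to words.
  mem/emp : mem.
  mem/upd : mem -> addr -> word -> mem.

  %% Partial load relation: derivable only for addresses that are present.
  %% A load from protected or unmapped memory has no derivation, so any
  %% transition with a loads premise simply does not exist in that case.
  loads       : mem -> addr -> word -> type.
  loads/here  : loads (mem/upd M A W) A W.
  loads/there : loads (mem/upd M A' W') A W
                 <- loads M A W.

  %% Schematically, a transition such as the one for addl $5,(%eax)
  %% would take loads/stores derivations as premises, e.g.
  %%   trans/addl : trans (st M R) (st M' R')
  %%                 <- loads M A W
  %%                 <- stores M A W' M'.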
Safety Policy • Define initial state on loading program P • We never get to a state where the (formal) machine does not have a transition • Another way of stating: the formal machine is never stuck • Halt state treated specially
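In the same toy notation, the safety policy ("the formal machine is never stuck") can be phrased as a ∀∃ metatheorem over reachable states. Only the reachability and not-stuck judgments are spelled out here; the statement itself is left as a schematic comment, since its clauses are the actual safety proof:

  %% Multi-step reachability.
  step*       : state -> state -> type.
  step*/refl  : step* S S.
  step*/trans : step* S1 S3
                 <- step S1 S2
                 <- step* S2 S3.

  %% A state is not stuck if it is the halt state (treated specially)
  %% or can take a step.
  notstuck      : state -> type.
  notstuck/halt : notstuck halt.
  notstuck/step : notstuck S
                   <- step S S'.

  %% Schematic shape of the safety policy, quantifying over programs P
  %% and the state init(P) obtained by loading P:
  %%   safety : step* (init P) S -> notstuck S -> type.
  %%   %mode safety +D -E.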
Why is this safe? • When the real machine transitions according to the formal machine’s transitions, it is performing safe operations • To perform an unsafe operation, the real machine would have to take a transition the formal machine does not have • That would mean the formal machine is stuck, which the safety policy rules out
Roadmap • Our system • Metalogics • Safety policy • A safety proof • Related and future work
Example Safety Proof • A particular safety proof • Our safety proof is for TALT [Crary] • Type system for an assembly language • Fairly low-level, but still abstract • Our foundational safety proof is syntactic [Hamid et al.]
Safety • Our conditions will isolate a set of safe states • Safe states cannot transition to stuck states • [Diagram: a safe state M1 taking a transition to a state M2]
Key Lemmas • Progress: a safe state M1 can take a transition to some state M2 • Preservation: if a safe state M1 transitions to M2, then M2 is safe • [Diagrams: a safe M1 stepping to some M2; a safe M1 stepping to a safe M2]
Putting it together – Safety Theorem • Transitions from a safe state cannot go to a stuck state • [Diagram: a safe state M1 stepping to a state M2, which is again safe and hence not stuck]
Idea of proof • Safe machine: a concrete state M is safe when some well-typed abstract machine M’ implements it • Three parts of the proof • Abstract type safety (previous work) • Simulation • Determinism • [Diagram: a typed abstract M’ implementing a safe concrete state M]
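The "safe machine" idea can be written down directly: a concrete state is safe when some well-typed abstract (TALT) machine implements it. The declarations below are stand-ins invented for this sketch, not the actual TALT signature:

  %% Stand-ins: concrete (formal IA32) states, abstract TALT machines,
  %% TALT's typing judgment, and the implementation relation.
  cstate : type.
  amach  : type.
  ok     : amach -> type.            %% "abstract machine is well typed"
  impl   : amach -> cstate -> type.  %% "abstract M implements concrete S"

  %% A concrete state is safe if some well-typed abstract machine
  %% implements it.
  safe   : cstate -> type.
  safe/i : safe S
            <- ok M
            <- impl M S.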
TALT safety proof [Crary] • This has two top-level lemmas: • Progress: A well-typed abstract machine makes a transition • Preservation: If a well-typed abstract machine makes a transition, the resulting (abstract) machine is well typed
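Reusing the stand-ins from the previous sketch, the two TALT lemmas have roughly these shapes as Twelf relations (declarations only; the actual proofs are the clauses, and the halt state is handled specially):

  %% Abstract (TALT) machine step.
  astep : amach -> amach -> type.

  %% Progress: a well-typed abstract machine makes a transition.
  tal-progress : ok M -> astep M M' -> type.
  %mode tal-progress +D -E.

  %% Preservation: the abstract machine it steps to is well typed again.
  tal-preserv : ok M -> astep M M' -> ok M' -> type.
  %mode tal-preserv +D +E -F.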
Concrete Machine Lemmas • Simulation: if abstract M1 steps to abstract M2 and concrete M1’ implements M1, then M1’ steps to some concrete M2’ implementing M2 • Determinism: if concrete M1 steps to both M2 and M2’, then M2 and M2’ are the same • [Diagrams: the simulation square Abstract M1 → Abstract M2 over Concrete M1’ → Concrete M2’; determinism of the concrete step]
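The two concrete-machine lemmas, again as schematic relation declarations over the same invented stand-ins (cstep is the formal IA32 transition; eq is equality of concrete states):

  %% Concrete (formal IA32) step, and equality of concrete states.
  cstep   : cstate -> cstate -> type.
  eq      : cstate -> cstate -> type.
  eq/refl : eq S S.

  %% Simulation: when the abstract machine steps, the concrete state it
  %% implements can take a matching step, implemented by the new machine.
  sim : impl M1 S1 -> astep M1 M2 -> impl M2 S2 -> cstep S1 S2 -> type.
  %mode sim +D +E -F -G.

  %% Determinism: the concrete machine has at most one successor state.
  det : cstep S S1 -> cstep S S2 -> eq S1 S2 -> type.
  %mode det +D +E -F.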
Progress • [Diagram: the safe concrete state M1 is implemented by an abstract, typed M1’; abstract progress gives an abstract M2’; simulation then gives a concrete step from M1 to some state M2 implemented by M2’]
Preservation • [Diagram: the safe concrete state M1 is implemented by a typed abstract M1’; abstract progress and preservation give a typed abstract M2’, which by simulation implements some concrete M2+ reached from M1; by determinism M2+ coincides with the state M2 actually reached, so M2 is implemented by the typed M2’ and is therefore safe]
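Putting the pieces together, the concrete-level progress and preservation lemmas of the earlier Key Lemmas slide have these shapes; their (elided) clauses compose TALT progress/preservation with simulation and determinism, exactly as in the two diagrams above. Still the illustrative stand-ins, not the real proof:

  %% Concrete progress: a safe state can take a concrete step.
  %% Proof sketch: safe S1 gives a typed abstract M1 implementing S1;
  %% TALT progress gives astep M1 M2; simulation gives cstep S1 S2.
  cprogress : safe S1 -> cstep S1 S2 -> type.
  %mode cprogress +D -E.

  %% Concrete preservation: the state actually reached is safe again.
  %% Proof sketch: simulation yields some S2+ implemented by a typed M2;
  %% determinism identifies S2+ with the given S2, so S2 is safe.
  cpreserv : safe S1 -> cstep S1 S2 -> safe S2 -> type.
  %mode cpreserv +D +E -F.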
Implementation Statistics • Safety Policy: 2,081 lines of code • Safety Proof: 44,827 lines of code • Time to check: 75 sec • Number of lemmas: 1,466 • Man-years: 1.5
Related work • Foundational PCC - Appel et al • FTAL - Hamid et al • Temporal Logic PCC - Bernard and Lee
Future Work • Develop a compiler from Standard ML to TALT • Expand the target language to include many more IA32 instructions • Specify and prove other properties, e.g., running-time bounds
Indeterminism • The data may be indeterminate, e.g., due to input • Safety demands that every instance be safe • We have an oracle that the semantics consults to determine what to do • The oracle is quantified in the safety theorem
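One way to picture the oracle (schematic; the real development may thread it differently): the transition relation takes the oracle as an extra index, and the safety statement quantifies over all oracles, so safety holds for every possible input.

  %% An abstract oracle supplying the indeterminate data (e.g. input).
  oracle : type.

  %% Schematically, the step relation consults the oracle:
  %%   cstep : oracle -> cstate -> cstate -> type.
  %% and the safety statement is made for all oracles O:
  %%   safety : step* O (init P) S -> notstuck O S -> type.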