Deriving formal specifications (almost) automatically

Presentation Transcript


  1. Deriving formal specifications (almost) automatically
     Glenn Ammons, Ras Bodik, and James R. Larus

  2. Three pillars of formal verification
     • Model checkers and other verifiers: well automated (SLAM, Spin, type checkers, Vault)
     • Program abstractors: getting there (SLAM, Engler’s metacompiler)
     • Formal specifications: written by hand
     • Our goal: bring automation to writing formal specifications

  3. Deriving specs is feasible
     • Well-debugged software exists: good code obeys the rules, but doesn’t state them clearly
     • Common behavior is good behavior, because testing exposes common behavior
     • Programmers exist, but they don’t want to write specs!

  4. Rules describe good behavior
     A rule is a nondeterministic finite automaton:
     [Figure: NFA with start state S and accepting state F; transitions labeled "T = XNextEvent", "XSetSelectionOwner(T)", and "XGetSelectionOwner".]
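
A minimal sketch of how such a rule could be represented and simulated over a run of abstracted call symbols. The states, symbols, and transitions below are assumptions reconstructed from this slide, not the paper's exact automaton.

    # Illustrative NFA for the selection rule sketched above; states, symbols,
    # and transitions are assumptions reconstructed from the slide.
    TRANSITIONS = {
        ("S", "T = XNextEvent"): {"S"},          # obtain a timestamp T from an event
        ("S", "XSetSelectionOwner(T)"): {"F"},   # claim the selection with that T
        ("F", "XGetSelectionOwner"): {"F"},      # observe/verify ownership
    }
    START, ACCEPTING = "S", {"F"}

    def accepts(symbols):
        """Simulate the NFA on a sequence of abstracted call symbols."""
        states = {START}
        for sym in symbols:
            states = set().union(*(TRANSITIONS.get((s, sym), set()) for s in states))
            if not states:
                return False
        return bool(states & ACCEPTING)

    print(accepts(["T = XNextEvent", "XSetSelectionOwner(T)", "XGetSelectionOwner"]))  # True
    print(accepts(["XSetSelectionOwner(now)"]))  # False: timestamp not from an event

The symbols here stand for abstracted trace events, where concrete values such as timestamps have been renamed to placeholders like T (see the next slide for the concrete form).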

  5. Rules are derived from traces, with user guidance
     XtAppNextEvent() = event(type = 5, window = 22, time = 3:15)
     XtDispatchEvent(type = 5, window = 22, time = 3:15)
     XtFree(NULL)
     XtFree(NULL)
     XtMalloc(size = 8) = 0x10
     XmuInternStrings(names = 0x20, count = 2, atoms_return = 0x10)
     XtOwnSelection(widget = 0x30, selection = 1, time = 3:15)
     And so on: the more traces, the better.
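
As a rough illustration of the learner's raw material, here is one way such trace events might be represented, and how the timestamp returned by XtAppNextEvent can be seen flowing into XtOwnSelection. The Event layout below is hypothetical, not the tool's actual trace format.

    from dataclasses import dataclass, field

    @dataclass
    class Event:
        fn: str                                   # called function, e.g. "XtOwnSelection"
        args: dict = field(default_factory=dict)  # argument name -> concrete value
        ret: dict = field(default_factory=dict)   # attributes of the returned value

    trace = [
        Event("XtAppNextEvent", ret={"type": 5, "window": 22, "time": "3:15"}),
        Event("XtDispatchEvent", {"type": 5, "window": 22, "time": "3:15"}),
        Event("XtOwnSelection", {"widget": 0x30, "selection": 1, "time": "3:15"}),
    ]

    # The concrete timestamp "3:15" returned by XtAppNextEvent reappears as the
    # `time` argument of XtOwnSelection; this value flow is what a learned rule
    # abstracts into a placeholder such as T.
    sources = {e.ret["time"]: e.fn for e in trace if "time" in e.ret}
    for e in trace:
        if e.fn == "XtOwnSelection":
            print("time came from:", sources.get(e.args.get("time")))  # XtAppNextEvent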

  6. Overview
     [Figure: the tool chain as a dataflow diagram.]
     • Traces + seeds → rule learner → rules
     • Programs or traces (buggy?) + abstraction prescription → program abstractor → abstract programs or traces
     • Rules + abstract programs or traces → matcher → bugs!
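
A runnable toy of the pipeline above. Every function name, signature, and body here is an assumption made for illustration; the real rule learner, program abstractor, and matcher are far more involved.

    def learn_rules(traces, seeds):
        # Rule learner: here, just record the ordered seed calls seen in each
        # training trace (the real learner generalizes scenarios into an NFA).
        return {tuple(call for call in t if call in seeds) for t in traces}

    def abstract_runs(runs, prescription):
        # Program abstractor: keep only the events the prescription mentions.
        return [[call for call in run if call in prescription] for run in runs]

    def match(rules, runs):
        # Matcher: flag abstracted runs that follow none of the learned rules.
        return [run for run in runs if tuple(run) not in rules]

    training = [["XNextEvent", "XtFree", "XSetSelectionOwner", "XGetSelectionOwner"]]
    prescription = {"XNextEvent", "XSetSelectionOwner", "XGetSelectionOwner"}
    rules = learn_rules(training, prescription)
    suspect = [["XtFree", "XSetSelectionOwner", "XGetSelectionOwner"]]  # no prior XNextEvent
    print(match(rules, abstract_runs(suspect, prescription)))  # flags the suspect run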

  7. Case study: selections in X11
     • The rule: SetSelectionOwner must be passed a timestamp from an XEvent
     • 25 programs from the X11 distribution and the contrib directories (all used selections)
     • Verification done over traces (not statically)
     • Found two bugs in 29 static uses
     • Found three benign violations
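
A minimal sketch of what the trace-based check for that rule could look like: remember every timestamp delivered by an event, and flag calls to XSetSelectionOwner whose timestamp never came from one. The trace encoding and the check_selection_rule helper are hypothetical; only the rule itself comes from the slide.

    # Assumed trace encoding: (function, arguments, attributes of the return value).
    trace = [
        ("XNextEvent", {}, {"time": 4101}),          # event delivers timestamp 4101
        ("XSetSelectionOwner", {"time": 4101}, {}),  # OK: timestamp came from an event
        ("XSetSelectionOwner", {"time": 0}, {}),     # suspect: constant timestamp
    ]

    def check_selection_rule(trace):
        seen = set()        # timestamps delivered by events so far
        violations = []
        for fn, args, ret in trace:
            if fn == "XNextEvent" and "time" in ret:
                seen.add(ret["time"])
            elif fn == "XSetSelectionOwner" and args.get("time") not in seen:
                violations.append(args)
        return violations

    print(check_selection_rule(trace))  # -> [{'time': 0}]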

  8. To do
     • Static checking: typestates
     • Better simplifier
     • Better user interaction
     • What else can we learn?
       • Protocols like socket/bind/accept/close
       • Operations on data structures

  9. Power
     What else can we do with this stuff? Compare with Ernst.

  10. Detailed figure?

  11. Running example

  12. Testing vs. verification
     • Testing examines the complete program, but only some inputs; for better coverage, write more test cases.
     • Verification examines only some aspects of the program, but all inputs; for better coverage, write more specs.
     In practice, writing test cases is seen as easier than writing formal models and specifications, so testing dominates.
