
Security and Refinement

Explore the concepts of security and refinement in computer systems, including topics such as the Bell and La Padula model, covert channels, non-interference, and formal methods. Learn how models can be broken and refined to enhance security.


Presentation Transcript


  1. Security and Refinement John A. Clark Susan Stepney Howard Chivers Dept of Computer Science, University of York 7 September 2004

  2. What is Security?????

  3. What is security? • Stopping people reading what they shouldn’t. • You could be forgiven for believing this. • It’s a start.

  4. Bell and La Padula [diagram] • Levels, highest to lowest: Top Secret, Secret, Confidential, Restricted, Unclassified • Maximum clearance: Top Secret; current clearance: Confidential • May write: the levels above the current clearance (Secret, Top Secret) • May read and write: the current clearance (Confidential) • May read: the levels below (Restricted, Unclassified)
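
A minimal sketch (not from the slides) of the two Bell and La Padula rules the diagram illustrates, assuming the linear ordering of levels shown above; the function names are invented for illustration:

```python
# Hedged sketch of Bell-LaPadula "no read up" / "no write down" checks,
# using the linear ladder of levels from the slide. Illustrative only.
LEVELS = ["Unclassified", "Restricted", "Confidential", "Secret", "Top Secret"]
RANK = {name: i for i, name in enumerate(LEVELS)}

def may_read(current_clearance: str, object_level: str) -> bool:
    # Simple security property: read only at or below the current clearance.
    return RANK[object_level] <= RANK[current_clearance]

def may_write(current_clearance: str, object_level: str) -> bool:
    # *-property: write only at or above the current clearance (no write down).
    return RANK[object_level] >= RANK[current_clearance]

# A subject whose current clearance is Confidential, as on the slide:
print(may_read("Confidential", "Restricted"))    # True  (read down)
print(may_read("Confidential", "Secret"))        # False (no read up)
print(may_write("Confidential", "Top Secret"))   # True  (write up)
print(may_write("Confidential", "Unclassified")) # False (no write down)
```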

  5. Bell and La Padula • Various foundational papers in the early seventies, and the ‘Secure Computer System: Unified Exposition and Multics Interpretation’ report in 1976. • But it’s not without its problems....

  6. Bell and La Padula • In practice some objects lie outside the model; access to them is not policed. • file locks • names • This allows these channels to act as conduits for information transfer (actually this is their intended function)

  8. Covert Channels • Computer systems provide channels for inter-subject communication, e.g. FILES. • These are intended to be used for communication, and access to them can be policed by the system. • It is generally possible to identify unusual means of communicating via elements of the system, so that the intent of the mandatory policy is broken. These means are known as covert channels.

  9. Covert Channels • Covert channels arise because subjects share resources. The shared resources allow one subject (the transmitter) to modulate some aspect of how another subject (the receiver) perceives the system. • Covert channels are generally grouped for working purposes into: • storage channels (where the values returned to the receiver by operations are affected). • timing channels (where the times at which events are perceived by the receiver are modulated).
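
As a hedged illustration of a storage channel (not from the slides), the sketch below lets a transmitter signal one bit per time slot through the mere existence of a lock file that both subjects can name; the file path and slot length are arbitrary choices for the example, and the two functions would run in separate processes:

```python
# Toy storage covert channel: the transmitter modulates a shared, observable
# resource (existence of a file); the receiver polls it. Illustrative sketch.
import os
import tempfile
import time

LOCK = os.path.join(tempfile.gettempdir(), "covert_demo.lock")  # invented name
SLOT = 0.2  # seconds per bit, chosen arbitrarily

def transmit(bits):
    for b in bits:
        if b:
            open(LOCK, "w").close()       # file present encodes 1
        elif os.path.exists(LOCK):
            os.remove(LOCK)               # file absent encodes 0
        time.sleep(SLOT)

def receive(n_bits):
    seen = []
    for _ in range(n_bits):
        seen.append(1 if os.path.exists(LOCK) else 0)
        time.sleep(SLOT)
    return seen
```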

  10. System Z – the trouble continues • Start in a secure state “no one is able to read a document above their clearance” • Make only transitions that preserve that property • Security is maintained????? • Not exactly. • A system could downgrade everything to the lowest classification and allow anyone access
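
A small sketch (invented names, not from the slides) of the System Z objection: the state-based property "no one can read a document above their clearance" holds before and after the transition, yet the transition downgrades everything and so defeats the intent:

```python
# System Z sketch: the state property is preserved by the transition, but only
# because the transition relabels every object as Unclassified.
RANK = {"Unclassified": 0, "Confidential": 1, "Secret": 2}

def secure_state(clearance, level, can_read):
    # "No one is able to read a document above their clearance."
    return all(RANK[level[o]] <= RANK[clearance[s]] for (s, o) in can_read)

def downgrade_all(level):
    # A legal-looking transition that trivially preserves the property.
    return {obj: "Unclassified" for obj in level}

clearance = {"alice": "Unclassified"}
level = {"war_plan": "Secret"}
can_read = set()                                   # initially nothing readable
print(secure_state(clearance, level, can_read))    # True: a secure start state

level = downgrade_all(level)
can_read = {("alice", "war_plan")}                 # now permitted...
print(secure_state(clearance, level, can_read))    # ...and still "secure": True
```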

  11. Non-interference • A more modern notion (Goguen and Meseguer, 1982, refined 1984) is non-interference. • Actions by high-level subjects (e.g. Top Secret) should not be visible to lower-level subjects • Formulated in terms of trace validity [diagram: High and Low domains]

  12. Non-interference • This is generally considered too strong a policy to implement effectively. • In practice, ruling out every bit of information flow is not the aim. • We may wish to allow communication but only via specific channels (conditional non-interference)
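
A hedged sketch of the trace formulation: in the Goguen and Meseguer style, a system is non-interfering if purging all High actions from a trace leaves Low's observations unchanged. The toy system below is invented for illustration and plainly fails the check:

```python
# Toy non-interference check: Low's view of a trace should equal Low's view of
# the same trace with every High action purged. Illustrative sketch only.
def run(trace):
    # Tiny system: High can set a flag, Low observes it; return Low's view.
    flag, low_view = 0, []
    for level, action in trace:
        if level == "High" and action == "set":
            flag = 1
        if level == "Low" and action == "observe":
            low_view.append(flag)
    return low_view

def purge_high(trace):
    return [(lvl, act) for (lvl, act) in trace if lvl != "High"]

trace = [("High", "set"), ("Low", "observe")]
print(run(trace))              # [1]
print(run(purge_high(trace)))  # [0] -- differs, so High interferes with Low
```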

  13. Formal Methods Empowerment • Model based specifications: operations are “atomic” yet consume no power. • Mathematics: • f(x) just is (cf. “Be man” from the 1960s) • Computation: • You need to calculate f(x) on real hardware. This consumes power.

  14. Power is a drug – Just say no. • 2*5 = ?? • 25*25 = ?? • 16554*54237 = ?? The amount and pattern of intellectual work you have to do varies according to the data you are multiplying. So if you don’t want anyone to know (gain information on) what you are multiplying, don’t freely accept offers of power (power is a drug, and it causes you to spill the beans)
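
A toy sketch of the slide's point (not from the slides): the number of non-trivial digit multiplications in schoolbook long multiplication depends on the operands, so an observer who can measure "effort" learns something about the data. The counting function is invented for illustration:

```python
# Count the non-trivial single-digit multiplications in schoolbook long
# multiplication: the amount of work is data-dependent.
def work(a: int, b: int) -> int:
    count = 0
    for da in str(a):
        for db in str(b):
            if da != "0" and db != "0":   # multiplying by a zero digit is "free"
                count += 1
    return count

print(work(2, 5))           # 1
print(work(25, 25))         # 4
print(work(16554, 54237))   # 25 -- the effort grows and varies with the data
```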

  15. Time to think, I think • 2^16 = ?? • 2^317 = ?? The time a computer takes to carry out an operation such as exponentiation typically depends on the operands. If the exponent is a private key (e.g. as in RSA), the time taken can therefore leak information about that key.
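
As a hedged sketch of why the time depends on the operands, the classic square-and-multiply loop below performs one extra modular multiplication for every 1 bit of the exponent, so with a secret RSA exponent the running time leaks information about the key (the basis of Kocher-style timing attacks). The modulus and operands are invented:

```python
# Square-and-multiply modular exponentiation: the data-dependent branch does
# extra work for each 1 bit of the exponent, so timing depends on the secret.
def modexp(base: int, exponent: int, modulus: int) -> int:
    result = 1
    acc = base % modulus
    while exponent:
        if exponent & 1:                  # data-dependent branch: the leak
            result = (result * acc) % modulus
        acc = (acc * acc) % modulus
        exponent >>= 1
    return result

print(modexp(2, 16, 1_000_003))   # exponent with a single 1 bit: little extra work
print(modexp(2, 317, 1_000_003))  # exponent with several 1 bits: a different pattern
```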

  16. And is this what security is about? • No. • Confidentiality is one part of security. • Integrity (if we know what this means)??? • Availability??? • Anonymity (again unlikely to be an all or nothing affair). • Accountability/non-repudiation

  17. Breaking the Model (slides by Susan)

  18. Relational model of refinement • [diagram: abstract and concrete levels linked by initialisation, operation, and finalisation steps, related by simulation]

  19. what is finalisation? • process of moving from (abstract or concrete) world of programming variables, to “real” (but still modelled) world • how to interpret programming variables • “finalisation glasses” • both outputs and state components • abstract model -- usually identity function

  20. example: set and sequence • global “real world” model : a set • abstract model : a set • s : P X ; FA(s) = s -- identity finalisation • concrete model : a sequence • t : seq X ; FC(t) = ran t • look at the sequence through finalisation glasses that see only the underlying set • successive refinements • one step’s concrete model is the next step’s abstract model
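
A hedged rendering of the two finalisations on this slide in Python rather than Z (names invented): the abstract finalisation is the identity on a set, the concrete one maps a sequence to its range, and two differently ordered sequences become indistinguishable once viewed through those glasses, which is exactly where the ordering covert channel of slide 22 hides:

```python
# Finalisation "glasses" for the set/sequence refinement example: Z's ran t
# rendered as the set of elements of a Python tuple. Illustrative sketch only.
def FA(s: frozenset) -> frozenset:
    return s                    # abstract finalisation: the identity on a set

def FC(t: tuple) -> frozenset:
    return frozenset(t)         # concrete finalisation: ran t, just the elements

t1 = ("a", "b", "c")
t2 = ("c", "a", "b")
print(FC(t1) == FC(t2))   # True: through the glasses, the same abstract set
print(t1 == t2)           # False: without them, the ordering is visible (a channel)
```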

  21. example : a clock paradox (A) clock that is 5 minutes slow -- never right (B) stopped clock -- right twice a day (A) a simple finalisation to get the right time -- add 5 minutes (B) no possible finalisation glasses

  22. unwanted finalisations • finalisations “throw away” information • example: order of the sequence • what if you don’t wear the glasses? • instead, use the maximal identity finalisation, to see “more” • covert channels • ordering, timing, delays, …

  23. a solution? [diagram with labels G, f, id, G', C, x, R] • restrict to identity finalisation • essentially, don’t allow output refinement • but, real devices output concrete things, like bit-streams, or voltages, or … • “extra-logical” finalisations

  24. so, not a solution • forbidding the problem merely moves it somewhere even more covert • instead, use the concept of finalisation to categorise and expose various covert channels, various security attacks

  25. vary the “glasses” • intended -- use the glasses • unintended -- remove (or de-mist) glasses • view page faults, interrupts, power, timing, … • direct -- observe a single output • enhanced -- post-process • view sequences of outputs • single viewpoint • multiple viewpoints (multiple different glasses) • passive -- just observe • invasive -- break into the device

  26. vary the system • standard -- observe the system working as intended • perturbed -- observe a perturbed system • under load, irradiated, flexed, … • passive perturbation -- worn out device • invasive perturbation -- attack with ion gun • single system • multiple systems -- differential analyses • homogeneous multiples -- lots of the “same” system • heterogeneous -- “sameness” depends on the observation • engineered, natural

  27. vary the environment • standard -- observe within specification (can still vary) • perturbed -- observe outside specification • heated, cooled, … • single environmental attribute • multiple attribute • passive perturbation -- naturally very cold • smart cards in Sweden • cold can lower noise, defeat blinding defences • invasive perturbation -- deliberately modulate power supply

  28. higher order glasses • non-standard observations of the analysis techniques themselves • often attacks involve searches • view the trajectory of a search algorithm • view the search on a slightly different problem • …

  29. illustrations • intended finalisation, single system, perturbed • fault injection, bit flip, breaking RSA • unintended finalisation, enhanced, single system • differential power analysis of smart cards • unintended finalisation, single system, perturbed environment • static RAM persists for minutes at –20°C

  30. prevention • enforce use of the glasses • by use of a protective screen • implement “magic eye” glasses • noise without them -- crypto • detect unwanted finalisations • mainly invasive ones • and then destroy the secrets!

  31. unusual, wanted, finalisations • multiple finalisations for different roles • user, administrator, audit, … • breaking atomicity usefully • progressive gifs • finalise the user • observe dynamics of hand signatures • textual analyses for authorship • anonymity v. authentication tradeoffs • destructive quantum finalisation • essential in various security schemes
