5038/2009: The Electronic Society Systems Thinking
Systems Thinking (slide from Frank Guerin’s intro lecture) • Systems, which perform functions and provide services, are complex assemblies and combinations of technological, human/social, economic, and policy components. • How can we organize our understanding? • How can we model systems so that we can explore and reason about all of the interacting and conflicting components and requirements? • How do systems fail? Systemic failure, component failure, individual culpability? • Security examples.
Systems Thinking 1 • What do we mean by systems? • What are the different parts? • People, Process, and Technology (PPT): Socio-economic-technical systems. • Multi-layered systems. • Why do systems fail? • Component failure. • Integration failure. • Unforeseen circumstances.
Distributed Systems • Definition of a distributed system: • A collection of autonomous information-processing devices connected by a network supporting data transmission between devices • Managed by software that is designed to support an integrated computing facility that delivers services to users and other systems • Examples: the Internet; your home network; a bank’s account management systems; the Met Office’s network of sensors • So, different levels of abstraction matter
More abstractly … • The system has a boundary between itself and its environment • The system itself consists of • A collection of locations • A collection of resources at each location • A collection of processes that execute at locations using the available resources • The environment is represented stochastically: events are incident upon the system according to a probability distribution.
A System Model • [Diagram: processes at locations L1, L2, … manipulate resources R1, R2, …; events pass between the environment and the system across the boundary.]
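Read as a data structure, the model is small. A minimal Python sketch (all names are invented for illustration, not taken from the lecture); processes and the event distribution are made concrete in the harbour example that follows:

```python
from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str

@dataclass
class Location:
    name: str
    resources: list[Resource] = field(default_factory=list)

@dataclass
class System:
    # The boundary is implicit: everything outside these locations is
    # the environment, which delivers events to the system according
    # to some probability distribution. Processes execute at locations,
    # using the resources available there.
    locations: list[Location] = field(default_factory=list)
```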
Example • Boats entering a harbour: • They arrive from the sea (the environment) according to an exponential distribution (this simply gives an arrival rate) • Locations: holding area; jetties • Resources: tugs, cranes, stevedores • Process: the boat itself: it arrives from the sea, collects tugs, docks at a jetty, uses a crane, collects tugs again, and returns to sea • It even has its own iPhone app (Harbour Master)
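A model like this runs naturally as a discrete-event simulation. Here is a sketch using the SimPy library (not the Core Gnosis tool mentioned later); the arrival rate and capacities are invented for illustration, and stevedores are omitted for brevity:

```python
import random
import simpy

ARRIVAL_RATE = 0.5  # hypothetical: boats per hour

def boat(env, name, tugs, jetties, cranes):
    # Lifecycle from the slide: collect tugs, dock at a jetty,
    # use a crane, collect tugs again, return to sea.
    with tugs.request() as req:
        yield req
        yield env.timeout(1)            # towed in from the holding area
    with jetties.request() as req:
        yield req
        with cranes.request() as crane:
            yield crane
            yield env.timeout(4)        # unloading
        with tugs.request() as tug:
            yield tug
            yield env.timeout(1)        # towed back out to sea
    print(f"{name} departs at t={env.now:.1f}h")

def arrivals(env, tugs, jetties, cranes):
    n = 0
    while True:
        # Exponential inter-arrival times: the environment reduces
        # to a single arrival rate, as the slide says.
        yield env.timeout(random.expovariate(ARRIVAL_RATE))
        n += 1
        env.process(boat(env, f"boat-{n}", tugs, jetties, cranes))

env = simpy.Environment()
tugs = simpy.Resource(env, capacity=2)      # hypothetical capacities
jetties = simpy.Resource(env, capacity=3)
cranes = simpy.Resource(env, capacity=2)
env.process(arrivals(env, tugs, jetties, cranes))
env.run(until=48)  # simulate two days
```

Boats queue automatically when tugs, jetties, or cranes are busy, so varying the capacities shows how different design choices affect throughput.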
A Security Example • The use of USB sticks by the employees of a major bank. • USB sticks used for good reasons. • But usage leads to a range of information security vulnerabilities. • How to protect?
USB locations • [Diagram: locations Office, Client’s Office, Home, Transport. Each location has different vulnerabilities, threats, and protections.]
The USB Model • Process: the lifecycle of a stick (cf. a boat) • The stick accesses resources at the various locations; e.g., a port on a computer (cf. a tug) • As the stick moves around the locations, it is subject to different threats. Examples? • Thieves, for example, might be part of the environment. So, model the arrival of a thief in the same train carriage as the stick using a probability distribution • The likelihood of data loss depends on things like the probability that the stick’s owner switched on its encryption …
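A back-of-envelope sketch of how those two probabilities combine (every number here is invented for illustration, not taken from the study):

```python
P_THEFT_PER_JOURNEY = 0.01  # hypothetical: thief in the carriage steals the stick
P_ENCRYPTED = 0.6           # hypothetical: owner actually switched encryption on

def p_data_loss(p_theft: float, p_encrypted: float, journeys: int) -> float:
    # Data is lost if the stick is stolen on at least one journey
    # AND it was left unencrypted.
    p_stolen = 1 - (1 - p_theft) ** journeys
    return p_stolen * (1 - p_encrypted)

print(p_data_loss(P_THEFT_PER_JOURNEY, P_ENCRYPTED, journeys=200))  # ~0.35
```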
How to use the model? • Run simulations to understand the consequences of different design choices: a simulation modelling tool that captures this is Core Gnosis, available from http://www.hpl.hp.com/research/systems_security/gnosis.html • Use logical methods to reason about properties of the system. Don’t worry: this is beyond the scope of this course, as it involves heavy mathematical logic …
Example • How can data be lost from USB sticks? • Stick lost on a train • Stick corrupted by malware on a home computer • Stick connected to a client’s computer, and other clients’ files accidentally copied • … • Solutions?
Encryption? • Is this a good solution?
Yes, because if sticks are always encrypted, then there is very little risk of data being lost
No, because encryption significantly impedes productivity: • Typing passwords takes time • You have to find the right stick • Passwords tend to be forgotten • At a client’s premises, a forgotten password is very embarrassing, particularly in the City of London culture.
Trade-offs • In fact, there is a trade-off between security (confidentiality) and productivity • The nature of this trade-off can be analysed using methods from economics • The key idea is that of a utility function
Utility (again; cf. the Security lectures) • In economics, utility theory is used to understand how agents use (expected) valuations of (expected) outcomes to make decisions/choices • To use utility theory, it’s necessary to understand the problem in a fair degree of detail, but also to remember to stick with the level of abstraction that’s appropriate for what you’re trying to achieve • Einstein: a scientific theory should be as simple as possible, but no simpler. This can be abused by the lazy, but it applies well to modelling.
So, identify which resources you care about • Identify what else in the model affects their values • Typically, there will be a trade-off between some of the things you care about, such as confidentiality and productivity • BUT, you might not care about all things to the same extent: e.g., different weightings for confidentiality and productivity
Shape of Utility • Each of confidentiality and productivity, and indeed cost/investment, might have an associated target level • Targets can be missed both above and below
As manager, you might also care more about some of confidentiality, C, productivity, P, and investment, K, than the others. So the utility function gives them different weightings • Overall, U(C, P, K) = w₁f₁(C) + w₂f₂(P) + w₃f₃(K) • Each of C, P, K depends on the system itself • Compare with the Security notes
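A minimal sketch of such a utility function. The weights, targets, and the shape of the fᵢ are all illustrative assumptions; the slides only say that targets can be missed both above and below, which the absolute-distance penalty captures:

```python
def target_penalty(x: float, target: float, scale: float = 1.0) -> float:
    # Targets can be missed both above and below, so score the
    # distance from the target on either side.
    return -scale * abs(x - target)

def utility(C: float, P: float, K: float,
            w=(0.5, 0.3, 0.2), targets=(0.9, 0.8, 0.5)) -> float:
    # U(C, P, K) = w1*f1(C) + w2*f2(P) + w3*f3(K)
    w1, w2, w3 = w
    tC, tP, tK = targets
    return (w1 * target_penalty(C, tC) +
            w2 * target_penalty(P, tP) +
            w3 * target_penalty(K, tK))
```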
We can then explore how the utility function changes as the system is reconfigured • This approach was used to explore the value of applying encryption to the USB sticks used by the bank’s employees
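Using the sketch above, the comparison amounts to evaluating U under each configuration. The C, P, K figures below are purely invented to show the mechanics, not results from the study:

```python
# Hypothetical outcomes of the two configurations:
u_plain     = utility(C=0.70, P=0.85, K=0.40)  # no mandatory encryption
u_encrypted = utility(C=0.95, P=0.65, K=0.55)  # mandatory encryption
print(u_plain, u_encrypted)  # prefer the configuration with higher utility
```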
Conclusion of USB study • Encryption is only justified (in terms of the trade-off between confidentiality, productivity, and cost) if the bank’s staff includes traitors who are deliberately trying to undermine its security • In which case, they’ll find other ways anyway … • Of course, different preferences, such as a strong preference for C over P, might produce different answers.
Modelling the Human and Technological Costs and Benefits of USB Memory Stick Security. Proc. WEIS 2008. In Managing Information Risk and the Economics of Security. M. Eric Johnson (editor), Springer, 2009: 141-163. • Available from http://www.abdn.ac.uk/~csc335/pym-weis-2008.pdf