5038/2009: The Electronic Society Systems Thinking, Systems Sciences & Systems Modelling
Systems Thinking • Systems, which perform functions and provide services, are complex assemblies and combinations of technological, human/social, economic, and policy components. • How can we organize our understanding? • How can we model systems so that we can explore and reason about all of the interacting and conflicting components and requirements? • How do systems fail? Systemic failure, component failure, individual culpability? • Security examples.
What is a System? • "whole compounded of several parts or members" • "a set of interacting or interdependent components forming an integrated whole"
Types and examples • Natural • Physical/chemical systems: a lot of the early ideas come from thermodynamics • Biological systems: the cell; biological ecosystems • Synthetic • An engine; a single computer; an (inter)network; a battleship; a supply chain; … • Of course, the boundaries between these categories are not sharp (e.g., what about the Gaia principle, or a decentralized economy operated by biological creatures, interconnected by a global communications network architecture with designed protocols?)
What goes into a system? • Structure • Components (building blocks) • Interconnectivity • Structural relationships (e.g., hierarchical subsystems) • [Agents, stakeholders] • Behaviour • Function: input and output of the whole • Information, energy, material • Dynamics: how the system changes
The elephant outside the room • Environment: the larger system within which the system of interest is embedded. We can't think about everything at once: delimit a boundary and allow at most simple interactions across it. • Note that the boundary is conceptual. It can lie physically inside a part of the system.
The elephant sneezes • Physics: an isolated system has negligible interaction with its environment. • "The entropy of a thermally isolated system can only increase" (part of the 2nd law of thermodynamics). • Even in physics, we need models that allow for more interesting interactions with the environment. • For the systems we will be interested in, it is almost never the case that the environment is negligible. • Instead, we have to try to delimit precisely the interaction with the environment. • This can be very difficult with modern systems.
Rear-Admiral Grace Hopper: "Life was simple before World War II. After that, we had systems." • Aside: • Wrote the first compiler, to allow for the execution of a high-level programming language! • One of the key players in the development of COBOL. • Left us with the word "bug" in computing and systems.
Systems Prehistory • Of course, we had systems before WW2. • Lots of thinkers had considered them: • Physicists (e.g., Cournot, Gibbs), mathematicians (Wiener), engineers, biologists (Darwin), economists (Keynes), social thinkers and philosophers, politicians, generals.
So what? • In the last 100 years or so it has rapidly become both possible and necessary to engineer more and more complex systems. • For correct and optimal performance of the systems we use, we need to take more of the environment into account in our 'model' of these systems. • E.g., the designer of an access-control system for a computer network should perhaps think in detail about user behaviour and social patterns.
Apollo Program • Take-off; escape Earth's gravity; slingshot around Earth; various separation phases; follow a precise trajectory at a precise speed to the Moon; separate; land; take off; dock; return to Earth; keep highly trained human occupants alive; only just enough fuel and energy; some, but minimal, compute power; mission support; communications. • Rocket: more than 2 million components on the vehicle alone, from over 20,000 suppliers. • Command and Service Modules: "With over 3 million components, a performance record of 99.9% would still leave 3,000 parts that could fail -- any one of which might result in the deaths of the crew." • Many more components left on the ground.
Systems Engineering • How systems should be planned, designed, implemented, built, and maintained. • Need to identify and manipulate the properties of the system as a whole. • May not be straightforward to do, even when we know the component properties. • We’ll devote the next lecture to systems engineering. • Advanced engineering requires modelling methods.
Systems Modelling • Need ways to explore the consequences of decisions made about the design and operation of systems, and of responses to changes in the environment. • Need models, rigorously defined (mathematical, logical, computational), and grounded in data to the greatest extent possible. • Need to explore scenarios and predict in an honest fashion. • Understand and model multiple stakeholder preferences, and figure out how to combine them. • Analyse, visualize, optimize or satisfice (where possible). • Try to get definite conclusions, but with all the assumptions about the system laid bare. • The opposite of fortune-telling.
The Ideal Systems Modeller is: • A software engineer: requirements, ontologies, modules, classes, objects, interfaces, software engineering methods, UML diagrams, workflows, etc. • A statistician: collection and analysis of numerical data; prediction of the future based on past data and trends… • A mathematician: dynamical systems theory (continuous, discrete), solutions of equations, numerical methods… • A decision-theorist: economic models, game theory, operations research… • A social scientist: ethnography, psychology, criminology, management, law, politics. • A scientist: physics, chemistry, biology, ecology… • An engineer: hardware, protocol knowledge, performance analysis, reliability and safety engineering. • A computer scientist: programs, simulations, protocols, interactions, agents, tools… • …all at once, and able to communicate extremely effectively!
Reductionism • A lot (but not all) of science tends to be reductionist: it focuses on breaking systems down into increasingly small parts to figure out what they do. • Collective phenomena are known: e.g., the Curie point of ferromagnetic materials. • For systems, we need to understand how assemblies of simple parts behave together. • Problem: it is not always easy to understand the behaviour of the whole even when we understand the behaviour of the parts (e.g., a weather system), and with many modern systems we don't understand all the parts. • This does not mean that the whole is more than the sum of the parts: • Our model may have missed something. • The 'sum' might not be as simple as we had thought. • There is no such thing as magic!
Distributed Systems • Definition of a distributed system: • A collection of autonomous information-processing devices connected by a network supporting data transmission between devices • Managed by software that is designed to support an integrated computing facility that delivers services to users and other systems • Examples: the Internet; your home network; a bank's account management systems; the Met Office's network of sensors • So, different levels of abstraction matter
More abstractly… • The system has a boundary between itself and its environment. • The system itself consists of: • a collection of locations; • a collection of resources at each location; • a collection of processes that execute at locations using the available resources. • The environment is represented stochastically: • events become incident upon the system according to a probability distribution.
A System Model • [Diagram: locations L1, L2, …, each holding resources R1, R2, …; processes manipulate the resources; events from the environment are incident upon the system at its boundary.]
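A minimal sketch of this location/resource/process structure in Python. All names here (Resource, Location, SystemModel, next_event_time) are illustrative assumptions for this sketch, not part of the course materials:

```python
import random
from dataclasses import dataclass, field

@dataclass
class Resource:
    """A resource held at a location (e.g., a port, a crane)."""
    name: str
    available: int          # units currently free

@dataclass
class Location:
    """A location inside the system boundary, holding resources."""
    name: str
    resources: dict[str, Resource] = field(default_factory=dict)

@dataclass
class SystemModel:
    """Locations inside the boundary, plus a stochastic environment outside it."""
    locations: dict[str, Location] = field(default_factory=dict)

    def next_event_time(self, rate: float) -> float:
        # The environment is represented stochastically: here, events are
        # incident upon the system with exponential inter-arrival times.
        return random.expovariate(rate)
```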
Example • Boats entering a harbour: • Boats arrive from the sea (the environment) according to an exponential distribution (which simply gives an arrival rate). • Locations: holding area; jetties. • Resources: tugs, cranes, stevedores. • Process: the boat itself. It arrives from the sea, collects tugs, docks at a jetty, uses a crane, collects tugs again, and returns to sea.
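A sketch of this harbour as a discrete-event simulation, using the SimPy library (an assumption; the course's own tool, Core Gnosis, appears later in these notes). The arrival rate, service durations, and resource capacities are illustrative values:

```python
import random
import simpy  # assumed available: pip install simpy

ARRIVAL_RATE = 0.5  # boats per hour (illustrative)

def boat(env, name, tugs, crane):
    """Lifecycle of one boat: collect tugs, dock at a jetty, use a crane,
    collect tugs again, return to sea."""
    with tugs.request() as req:
        yield req
        yield env.timeout(1)           # towed from holding area to a jetty
    with crane.request() as req:
        yield req
        yield env.timeout(4)           # unloading at the jetty
    with tugs.request() as req:
        yield req
        yield env.timeout(1)           # towed back out to sea
    print(f"{name} returned to sea at t={env.now:.1f}")

def arrivals(env, tugs, crane):
    """The environment: boats arrive with exponential inter-arrival times."""
    i = 0
    while True:
        yield env.timeout(random.expovariate(ARRIVAL_RATE))
        i += 1
        env.process(boat(env, f"boat-{i}", tugs, crane))

env = simpy.Environment()
tugs = simpy.Resource(env, capacity=2)
crane = simpy.Resource(env, capacity=1)
env.process(arrivals(env, tugs, crane))
env.run(until=24)  # simulate one day
```

Running the simulation with different capacities for tugs and cranes is exactly the kind of design exploration the model is for.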
A Security Example • The use of USB sticks by the employees of a major bank. • USB sticks used for good reasons. • But usage leads to a range of information security vulnerabilities. • How to protect?
USB Locations • [Diagram: the stick circulates between four locations: Office, Home, Transport, and the Client's Office.] • Each location has different vulnerabilities, threats, and protections.
The USB Model • Process: the lifecycle of a stick (cf. a boat). • The stick accesses resources at the various locations; e.g., a port on a computer (cf. a tug). • As the stick moves around the locations, it is subject to different threats. Examples? • Thieves, for example, might be part of the environment. So, model the arrival of a thief in the same train carriage as the stick using a probability distribution. • The likelihood of data loss depends on things like the probability that the stick's owner used its encryption…
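A toy Monte Carlo sketch of this kind of model. The probabilities below are invented for illustration and are not taken from the USB study:

```python
import random

# Illustrative parameters (assumptions, not from the WEIS 2008 study):
P_THIEF_ON_TRAIN = 0.01   # thief in the same carriage (environment event)
P_ENCRYPTED      = 0.60   # owner actually used the stick's encryption
P_MALWARE_HOME   = 0.05   # home computer corrupts the stick

def one_trip() -> bool:
    """Simulate one office -> home -> client trip; True means data was lost."""
    if random.random() < P_THIEF_ON_TRAIN and random.random() > P_ENCRYPTED:
        return True   # stick stolen and its contents readable
    if random.random() < P_MALWARE_HOME:
        return True   # contents corrupted by malware at home
    return False

trials = 100_000
losses = sum(one_trip() for _ in range(trials))
print(f"Estimated P(data loss per trip) ~ {losses / trials:.4f}")
```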
How to use the model? • Run simulations to understand the consequences of different design choices: a simulation modelling tool that captures this is Core Gnosis, available from http://www.hpl.hp.com/research/systems_security/gnosis.html • Use logical methods to reason about properties of the system. Don't worry, this is beyond the scope of this course: it involves heavy mathematical logic…
Example • How can data be lost from USB sticks? • Stick lost on a train. • Stick corrupted by malware on a home computer. • Stick connected to a client's computer; other clients' files accidentally copied. • … • Solutions?
Encryption? • Is this a good solution?
Yes, because if sticks are always encrypted, then there is very little risk of data being lost.
No, because encryption significantly impedes productivity: • Typing passwords takes time. • Have to find the right stick. • Passwords tend to be forgotten. • At clients' premises, a forgotten password is very embarrassing, particularly in the City of London culture.
Trade-offs • In fact, there is a trade-off between security (confidentiality) and productivity • The nature of this trade-off can be analysed using methods from economics • The key idea is that of a utility function
Utility (again; cf. Security lectures) • In economics, utility theory is used to understand how agents use (expected) valuations of (expected) outcomes to make decisions/choices • To use utility theory, it’s necessary to understand the problem in a fair degree of detail, but also to remember to stick with the level of abstraction that’s appropriate for what you’re trying to achieve • Einstein: A scientific theory should be as simple as possible, but no simpler. Can be abused by the lazy, but applies well to modelling.
So, identify which resources you care about. • Identify what else in the model affects their values. • Typically, there will be a trade-off between some of the things you care about, such as confidentiality and productivity. • BUT, you might not care about all things to the same extent: e.g., different weightings for confidentiality and productivity.
Shape of Utility • Associated with each of confidentiality and productivity, and indeed cost/investment, might be a target level • Targets can be missed both above and below
As manager, you might also care more about some of confidentiality, C, productivity, P, and investment, K, than the others. So the utility function gives different weightings: • Overall, U(C, P, K) = w1·f1(C) + w2·f2(P) + w3·f3(K) • Each of C, P, K depends on the system itself. • Compare with the Security notes.
We can then explore how the utility function changes as the system is reconfigured. • This approach was used to explore the value of applying encryption to the USB sticks used by the bank's employees.
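A sketch of this utility function in Python, using a component shape that peaks at a target and falls off on both sides ("targets can be missed both above and below"). The quadratic shape, weights, and targets are illustrative assumptions, not the study's calibrated values:

```python
def f(x: float, target: float) -> float:
    """Component utility: maximal at the target, penalizing deviation
    in either direction."""
    return -(x - target) ** 2

def utility(C: float, P: float, K: float,
            w=(0.5, 0.3, 0.2),            # manager's weightings (illustrative)
            targets=(0.9, 0.8, 0.5)):     # target levels (illustrative)
    """U(C, P, K) = w1*f1(C) + w2*f2(P) + w3*f3(K)."""
    w1, w2, w3 = w
    tC, tP, tK = targets
    return w1 * f(C, tC) + w2 * f(P, tP) + w3 * f(K, tK)

# Compare two configurations of the system, e.g. with and without encryption:
print(utility(C=0.95, P=0.60, K=0.55))   # encryption on: high C, lower P
print(utility(C=0.70, P=0.85, K=0.45))   # encryption off: lower C, higher P
```

Reconfiguring the system (here, toggling encryption) moves C, P, and K, and the weighted utility tells us whether the trade-off was worth it.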
Conclusion of USB study • Encryption is only justified (in terms of the trade-off between confidentiality, productivity, and cost) if the bank's staff includes traitors who are deliberately trying to undermine its security. • In which case, they'll find other ways anyway… • Of course, different preferences, such as a strong preference for C over P, might produce different answers.
Modelling the Human and Technological Costs and Benefits of USB Memory Stick Security. Proc. WEIS 2008. Reprinted in: M. Eric Johnson (ed.), Managing Information Risk and the Economics of Security, Springer, 2009, pp. 141-163. • Available from http://www.abdn.ac.uk/~csc335/pym-weis-2008.pdf