“It is insufficient to protect ourselves with laws; we need to protect ourselves with mathematics.” --- Bruce Schneier, ‘Applied Cryptography’, pp xx.
Security Planning: A Revision
Components of security planning:
• assessing the threat;
• writing a security policy: a statement of what is allowed and what is not allowed;
• assigning security responsibilities;
• choosing the mechanisms, tools and methodologies to implement the policy.
Types of Attack: A Revision
Most Internet security problems are
• access control or
• authentication ones.
• Denial of service is also popular, but mostly an annoyance.
Types of Attack
• A passive attack can only observe communications or data.
• An active attack can actively modify communications or data. Often difficult to perform, but very powerful:
  – mail forgery/modification;
  – TCP/IP spoofing/session hijacking.
Attackers
• External attackers (through the wired part of the Internet): Class 1
• External attackers (through the wireless part of the Internet): Class 2
• Internal attackers (through the wired segment of the LAN): Class 3
• Internal attackers (through the wireless segment of the LAN): Class 4
5 Stages of an Attack: The First Three
• Reconnaissance: to find out about hosts, services and application versions; high probability of detection.
• Exploitation: to enter the network to use the services (without legitimate authorization) or to subvert the services; medium probability of detection.
• Reinforcement: to retrieve tools to elevate access rights and to hide the intrusion:
  – if the tools are encrypted, this is difficult to detect;
  – it can be detected by keeping a watch on the outbound activity of servers.
5 Stages of an Attack: The Last Two
• Consolidation: to communicate by using a secret channel (back doors); may be detected through traffic profiling.
• Pillage: to steal information or to damage the asset; may be detected through traffic profiling.
Reference for the last three slides: classification by Richard Bejtlich, “The Tao of Network Security Monitoring”, Addison-Wesley, 2005, pp. 19, 45.
Additional Terminology
• Shoulder surfing: a hacker reads over the user’s shoulder while the user is writing on paper or typing on a keyboard.
• Pulsing zombie: a compromised computer (zombie) that is used to attack other targets intermittently.
• Snoop server: a server put into promiscuous mode to access all the data in each network packet; used for surveillance.
Additional Terminology (continued)
• Back Orifice: a Windows application that allows a hacker at one computer to control a remote computer; written by a hackers’ group called “the Cult of the Dead Cow”.
• War driving: unauthorized access to a company’s wireless network, e.g. by parking a car outside the company’s building.
• Smurf attack: a DoS attack mounted through a ping addressed to an IP broadcast address; the resultant echo replies may flood the network.
Additional Terminology (continued 2)
• Hacktivism: intrusion done as a protest; justified as free speech.
• Rootkit: the tools installed on a computer to hide the presence of an intruder.
  – Symantec definition: a rootkit is a component that uses stealth to maintain a persistent and undetectable presence on a computer. “Actions performed by a rootkit, such as installation and any form of code execution, are done without end-user consent or knowledge.” -- Ryan Naraine, “When’s a Rootkit Not a Rootkit? In Search of Definitions,” eWeek, Jan 18, 2006.
  – Pete Allor, director of operations, IT-ISAC (Information Sharing and Analysis Center): working on a definition of rootkit.
Security Theories
Ref: Matt Bishop, “Computer Security: Art & Science,” Addison-Wesley, 2003.
• Given: a computing system (with computers, networks, etc.).
• To find: is it (provably) secure?
• Answers:
  – 1976: Harrison, Ruzzo and Ullman: in the most general abstract case, the security of computer systems is undecidable. Reference: M. Harrison, W. Ruzzo and J. Ullman, “Protection in Operating Systems,” Communications of the ACM 19 (8), pp. 461-471 (Aug. 1976).
Security Theories: Answers (continued)
• Jones, Lipton and Snyder: presented a specific system in which security was decidable, in time that increased linearly with the size of the system. Reference: A. Jones, R. Lipton and L. Snyder, “A Linear-Time Algorithm for Deciding Security,” Proceedings of the 17th Symposium on the Foundations of Computer Science, pp. 33-41 (Oct. 1976).
Security Theories: Answers (continued 2)
• Minsky: presented a model to examine why security is undecidable in the general case but decidable in a specific case. Reference: N. Minsky, “Selective and Locally Controlled Transport of Privileges,” ACM Transactions on Programming Languages and Systems 6 (4), pp. 573-602 (Oct. 1984).
• Sandhu: extended the Minsky model and presented further insights. Reference: R. Sandhu, “The Schematic Protection Model: Its Definition and Analysis for Acyclic Attenuating Schemes,” Journal of the ACM 35 (2), pp. 404-432 (Apr. 1988).
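The decidability results above are stated over the access-control matrix model used by Harrison, Ruzzo and Ullman. A minimal sketch of such a matrix and one guarded “command” (all names and rights here are illustrative, not taken from the paper):

```python
# Minimal access-control matrix in the spirit of the HRU model.
# Cells map a (subject, object) pair to a set of rights.

class AccessMatrix:
    def __init__(self):
        self.rights = {}  # (subject, object) -> set of rights

    def enter(self, subject, obj, right):
        """Primitive operation: enter a right into a cell."""
        self.rights.setdefault((subject, obj), set()).add(right)

    def delete(self, subject, obj, right):
        """Primitive operation: delete a right from a cell."""
        self.rights.get((subject, obj), set()).discard(right)

    def has(self, subject, obj, right):
        return right in self.rights.get((subject, obj), set())

# An HRU "command" is a guarded sequence of primitive operations,
# e.g. the owner of a file granting read access to another subject:
def grant_read(m, owner, grantee, f):
    if m.has(owner, f, "own"):
        m.enter(grantee, f, "read")

m = AccessMatrix()
m.enter("alice", "file1", "own")
grant_read(m, "alice", "bob", "file1")
print(m.has("bob", "file1", "read"))   # True
```

The undecidability result concerns whether an arbitrary sequence of such commands can ever leak a given right; for a fixed, restricted command set (as in the Jones-Lipton-Snyder system) the question becomes decidable.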
Security Policy
Study the needs of the organization → Security Policy → Mechanism:
-- Procedural
-- Technical
-- Physical
Definitions
Consider a computer system as a FINITE STATE AUTOMATON with transition functions that change its state.
• Security policy: a statement that partitions the system’s states into sets of
  – authorized or secure states (called S in slide 38);
  – unauthorized or nonsecure states (P - S).
• A secure system: one that starts in an authorized state and cannot enter an unauthorized state.
• A security incident: when a system enters an unauthorized state.
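These definitions can be made concrete with a toy automaton. The states, actions and transitions below are invented for illustration; S is the set of authorized states:

```python
# A system as a finite-state automaton. The policy partitions the
# state set P into authorized states S and unauthorized states P - S.

P = {"logged_out", "logged_in", "admin"}   # all states
S = {"logged_out", "logged_in"}            # authorized (secure) states

transitions = {                            # transition function
    ("logged_out", "login"): "logged_in",
    ("logged_in", "logout"): "logged_out",
    ("logged_in", "exploit"): "admin",     # an attack path
}

def is_secure(start, actions):
    """A system is secure if, starting in an authorized state,
    it cannot enter an unauthorized state."""
    state = start
    for a in actions:
        state = transitions.get((state, a), state)
        if state not in S:
            return False   # security incident: unauthorized state reached
    return True

print(is_secure("logged_out", ["login", "logout"]))    # True
print(is_secure("logged_out", ["login", "exploit"]))   # False
```

A security incident corresponds exactly to `is_secure` returning False: some transition carried the system out of S.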
Definitions: Confidentiality and Integrity
X: a set of entities; I: some information or resource.
• I has the property of confidentiality with respect to X if no member of X can obtain information about I.
• I has the property of integrity with respect to X if all members of X trust I.
TRUST
Trust that:
• conveyance and storage of I do not change the information or its trustworthiness: Data Integrity;
• I is correct and unchanged, if I is information about the origin of something or about the identification of an entity: Authentication;
• the resource functions correctly, if I is a resource rather than information: Assurance.
Definitions: Availability
X: a set of entities; I: some resource.
• I has the property of availability with respect to X if all members of X can access it.
The meaning of “access” depends upon:
• the needs of the members of X;
• the nature of the resource;
• the use to which the resource is put.
Security Policy: Confidentiality, Integrity
The policy considers the issues of CIA as follows:
• Confidentiality
  – during information flow;
  – for an environment that changes with time (example: a contractor bound by a non-disclosure agreement during the period of the contract).
• Integrity
  – authorized ways of altering information;
  – entities authorized to alter it;
  – the principle of “separation of duties”.
Security Policy: Availability
• Availability
  – services that must be provided;
  – parameters within which the services will be accessible (example: a browser may download web pages but not Java applets);
  – QoS issues.
Assumptions: the context of the policy:
• laws,
• organizational policies and
• other environmental factors.
Example: Policy vs. Mechanism
• University rule: no cheating is allowed.
• School of CS procedures: students should write their programs on the School computers, and every such file should be read-protected so that other students are not able to read it.
• Example: A forgets to read-protect his file. B copies it. The copying is caught.
• B claims: the policy does not prohibit copying a file, so he is not guilty; the policy says that one should read-protect the file, so A is guilty. IS B GUILTY?
• A security mechanism: an entity or procedure that enforces some part of the policy.
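The read-protection mechanism in the example can be sketched with POSIX permission bits (this sketch assumes a POSIX system; on Windows `os.chmod` only toggles the read-only flag):

```python
# Sketch of the "read-protect your file" mechanism: remove read
# permission for group and others, leaving owner read/write (0o600).
import os
import stat
import tempfile

fd, path = tempfile.mkstemp()   # stand-in for the student's program file
os.close(fd)

# The mechanism: only the owner may read or write the file.
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)   # mode 0o600

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))   # 0o600: other students cannot read the file
os.remove(path)
```

Note the gap the example exploits: the mechanism protects the file only if the owner applies it; the mechanism alone does not enforce the whole policy.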
Security Models
A security model:
• represents a policy or a set of policies;
• helps analyze specific characteristics of policies.
• No single non-trivial analysis can cover all policies.
• By restricting the class of policies, a meaningful analysis may be possible.
Confidentiality Policies: the Bell-LaPadula Model
Ref: D. Bell, L. LaPadula, “Secure Computer System: Mathematical Foundations,” Technical Report MTR-2547, Vol. I, MITRE Corporation, Bedford, MA (Mar. 1973).
Confidentiality classification: linearly ordered sensitivity levels.
• Subject: security clearance.
• Object: security classification.
• Goal: to prevent read access to objects at a security classification higher than the subject’s clearance.
McLean’s questions about the Bell-LaPadula model (and the Bell-LaPadula responses) essentially led to the IEEE Computer Security Foundations Workshops.
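The “no read up” goal over linearly ordered sensitivity levels can be sketched as a single comparison (the level names below are illustrative):

```python
# Bell-LaPadula simple-security check over linearly ordered levels.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_read(subject_clearance, object_classification):
    """A subject may read an object only if the object's classification
    does not exceed the subject's clearance (no read up)."""
    return LEVELS[object_classification] <= LEVELS[subject_clearance]

print(can_read("secret", "confidential"))   # True  (reading down is fine)
print(can_read("confidential", "secret"))   # False (no read up)
```

The full model also constrains writes (the *-property, “no write down”), which the deck does not cover here.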
Research: the Theory of Security Systems
• June 1988: the first IEEE Computer Security Foundations Workshop, held at The Franconia Inn, New Hampshire (the workshop is referred to as “Franconia” even today). The preface of the proceedings, written by the workshop chair, Jonathan Millen, refers to an earlier 1977 workshop on the “Foundations of Secure Computation”.
• The 19th IEEE Computer Security Foundations Workshop (CSFW 19), July 5-7, 2006, Venice, Italy, was sponsored by the Technical Committee on Security and Privacy of the IEEE Computer Society.
Integrity Policies: the Biba Integrity Model
Ref: K. Biba, “Integrity Considerations for Secure Computer Systems,” Technical Report MTR-3153, MITRE Corporation, Bedford, MA (Apr. 1977).
• Goal of the model: to answer the question “Has the integrity of a piece of software, or of data on which the software relies, been compromised?” for software that exhibits specific properties.
• Principle of separation of duties, wherever two or more steps are required for a critical function.
• Principle of separation of functions (e.g. development, testing, deployment, certification).
• Requirements of auditing, extensive logging, recovery and accountability.
The Biba Integrity Model
S: a set of subjects; O: a set of objects; I: a set of integrity levels.
• s ∈ S can read o ∈ O iff i(s) ≤ i(o).
• s ∈ S can write to o ∈ O iff i(o) ≤ i(s).
• s1 ∈ S can execute s2 ∈ S iff i(s2) ≤ i(s1).
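The three rules above translate directly into three comparisons; integrity levels are modelled here as integers, with higher meaning more trusted:

```python
# The three Biba strict-integrity rules from the slide.
# i_s, i_o: integrity levels of a subject and an object.

def can_read(i_s, i_o):
    return i_s <= i_o     # no read down: read only equal/higher integrity

def can_write(i_s, i_o):
    return i_o <= i_s     # no write up: write only equal/lower integrity

def can_execute(i_s1, i_s2):
    return i_s2 <= i_s1   # invoke only equal/lower-integrity subjects

# A low-integrity subject (1) may read but not write a
# high-integrity object (2):
print(can_read(1, 2), can_write(1, 2))   # True False
```

Note the duality with Bell-LaPadula: Biba prevents low-integrity information from flowing upward, just as Bell-LaPadula prevents high-sensitivity information from flowing downward.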
Data Access Controls: Privacy Issues
• Mandatory Access Controls (MACs)
• Discretionary Access Controls (DACs)
Many questions:
• Should MACs or DACs be exercised by the owner, the originator, the creator, or all?
• Are temporal changes required in access rights?
• Conflict-of-interest issues.
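The MAC/DAC distinction can be sketched as two independent checks (object names, labels and ACLs below are illustrative): DAC is the owner’s per-object choice, MAC is a system-wide label policy the owner cannot override.

```python
# DAC: the object's owner decides, via a per-object ACL.
acl = {"report.txt": {"owner": "alice", "readers": {"alice", "bob"}}}

def dac_allows(user, obj):
    return user in acl[obj]["readers"]

# MAC: a system-wide label policy, independent of the owner's wishes.
label = {"report.txt": 2}            # object sensitivity level
clearance = {"alice": 3, "bob": 1}   # subject clearances

def mac_allows(user, obj):
    return clearance[user] >= label[obj]

# In a system enforcing both, access requires owner consent (DAC)
# AND the label policy (MAC):
for user in ("alice", "bob"):
    print(user, dac_allows(user, "report.txt")
                and mac_allows(user, "report.txt"))
# alice True   (on the ACL and cleared)
# bob False    (on the ACL, but not cleared: MAC overrides DAC)
```

This also illustrates the slide’s question about who exercises control: here the owner controls the ACL, while only the system administrator controls labels and clearances.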