CPS 210 Security in Networked Systems Always Use Protection Jeff Chase Duke University http://www.cs.duke.edu/~chase/cps210
Botnets [microsoft.com]
Any program you install or run can be a Trojan Horse vector for a malware payload.
Confused deputy • Bob holds the power and wishes to use it properly. • Alice considers Bob her deputy in the use of this power, and trusts Bob to deny the power to Mal. • Mal wants the power. Can Mal trick Bob to get it? http://finntrack.co.uk/, erights.org, http://www.cap-lore.com/CapTheory/ConfusedDeputyM.html
Attack scenarios we consider • Trojan horse • A threatening program is offered as a “gift”, and runs “inside the victim’s walls” (i.e., with the victim’s identity). • Confused deputy • Attacker corrupts a “good” program and takes over its functions, e.g., to assume the victim’s identity. • Confused user • Attacker tricks the victim into giving away secrets. (Or the victim fails to use secrets or fails to protect secrets.) • Later: DDoS, spoofing, and other network attacks
Security, an overview We reduce it to three intertwined issues: • What program am I running? • Can this program be trusted? Who says? • Can I be sure that the program has not been tampered with? • Who am I talking to? • Can this entity be trusted? • Can I be sure the communication has not been tampered with? • Should I approve this request? R(op, subject, object) • Who is the requester? (subject) • What program is speaking for the requester? • Does the subject have the required permissions?
Elements of security • Isolation/protection • Sandboxes and boundaries prevent unchecked access. • Integrity • Fingerprint data to detect tampering. • Encrypt data to prevent access or tampering. • Authentication • Identify a peer by proof that it possesses a secret. • Identity and attributes • Identities have credentials: names, tags, roles... • Authorization == access control • Guard checks credentials against an access policy.
Crypto primitives • Encrypt/Decrypt: use a shared secret key (symmetric) or a keypair, one public and one private (asymmetric). • Signing. • Secure hashing: useful for fingerprinting data.
http://blogs.msdn.com/b/sdl/archive/2008/10/22/ms08-067.aspx
Trusting Programs • In Unix • Programs you run use your identity (process UID). • Maybe you even saved them with setuid so others who trust you can run them with your UID. • The programs that run your system run as root. • You trust these programs. • They can access your files • send mail, etc. • Or take over your system… • Where did you get them?
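To make the UID point concrete, here is a minimal Python sketch (standard library only) that checks whether an executable carries the setuid bit, i.e., whether it runs with its owner’s identity rather than the invoker’s. The path /usr/bin/passwd is just the classic example, not something the lecture depends on.

import os
import stat

def describe(path):
    st = os.stat(path)
    if st.st_mode & stat.S_ISUID:
        # setuid: the program runs with the file owner's UID (often root),
        # so whoever provided it can act with that identity when you run it.
        return "%s is setuid; it runs as UID %d, not as you" % (path, st.st_uid)
    return "%s runs with the invoking user's UID" % path

print(describe("/usr/bin/passwd"))  # a classic setuid-root program on many Unix systems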
Trusting Trust • Perhaps you wrote them yourself. • Or at least you looked at the source code… • You built them with tools you trust. • But where did you get those tools?
Where did you get those tools? • Thompson’s observation: compiler hacks cover tracks of Trojan Horse attacks.
Login backdoor: the Thompson Way • Step 1: modify login.c • (code A) if (name == “ken”) login as root • This is obvious so how do we hide it? • Step 2: modify C compiler • (code B) if (compiling login.c) compile A into binary • Remove code A from login.c, keep backdoor • This is now obvious in the compiler, how do we hide it? • Step 3: distribute a buggy C compiler binary • (code C) if (compiling C compiler) compile code B into binary • No trace of attack in any (surviving) source code
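As a toy illustration (not Thompson’s actual code), the three steps can be sketched in Python. Nothing here compiles anything real; the names and strings are invented purely to show where codes A, B, and C sit and why no surviving source file shows the backdoor.

BACKDOOR = 'if name == "ken": grant_root()'  # code A (illustrative only)

def evil_cc(source, filename):
    """Toy stand-in for a compiler: returns the 'binary' as a string."""
    out = source
    if filename == "login.c":
        # code B: inject A into the login binary even though login.c is clean
        out += "\n" + BACKDOOR
    if filename == "cc.c":
        # code C: when compiling the (clean) compiler source, re-insert this
        # very injection logic, so every future compiler binary stays infected
        out += "\n# [self-replicating injection logic re-inserted here]"
    return out

# The distributed compiler binary behaves like evil_cc, while login.c and cc.c
# in the source tree look perfectly clean.
print(evil_cc("def login(name, pw): check(name, pw)", "login.c"))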
wired.com, June 2012 Reuters, June 2012
Phishing, password attacks, and other “human” attack vectors.
[Diagram: “technology” and “people”] Where are the boundaries of the “system” that you would like to secure? Where is the weakest link? What happens when the weakest link fails?
The First Axiom of Security • “Security is at least as much a social problem as it is a technical problem.” • Translation: humans are the weak link. • We will focus on the technical elements, but do not lose sight of the social dimension. • Keys left in lock • Phishing • Executable attachments • Trojan software • Post-it passwords • Bribes, torture, etc.
Identify: Friend or Foe? Former Student
How accidents happen Former Student
Example use of a fingerprint (hashed password) This is a line from /etc/passwd for user Fred Flintstone; the password field stores a hash, not the password itself. The login program uses this record to validate the user’s password. The file is public, but Fred’s password is secret. Or is it? (A simplified sketch of the idea follows below.)
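The idea behind the hashed password field can be sketched with Python’s standard library. Real /etc/passwd entries historically used crypt(3); the salted SHA-256 scheme, the record format, and Fred’s password below are simplified stand-ins, not the real mechanism.

import hashlib
import os

def hash_password(password, salt=None):
    salt = salt if salt is not None else os.urandom(8)
    digest = hashlib.sha256(salt + password.encode()).hexdigest()
    return salt.hex() + "$" + digest            # this string can sit in a public file

def check_password(stored, guess):
    salt_hex, _ = stored.split("$")
    return hash_password(guess, bytes.fromhex(salt_hex)) == stored

record = hash_password("yabba-dabba-doo")         # the secret itself is never stored
print(check_password(record, "yabba-dabba-doo"))  # True
print(check_password(record, "wilma"))            # False

The file can be public because inverting the hash is hard; the remaining danger, guessing weak passwords offline, is exactly the slide’s closing question.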
The story so far • Components run within contexts (isolated sandboxes). • Each component/context is associated with an identity with some attributes (subject). • Components use system calls to interact across context boundaries, or access shared objects. • Each object has some access attributes. • The system has a reference monitor and guard to check access for (op, subject, object). • Principle of least privilege limits the damage a component can do if it “goes rogue”.
Access control matrix We can imagine the set of all allowed accesses for all subjects and all objects as a huge matrix:

          obj1   obj2
Alice     ---    RW
Bob       RW     R

How is the matrix stored?
Access control matrix How is the matrix stored?

          obj1   obj2
Alice     ---    RW
Bob       RW     R

(Each column is an ACL; each row is a capability list.)

• Capabilities: each subject holds a list of its rights (capabilities) and presents them as proof. • Access control list (ACL): each object stores a list of subjects permitted to access it. • Many systems use a level of indirection through attributes (e.g., roles or groups). A small sketch of both representations follows below.
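Here is a minimal Python sketch of the two storage choices; the subjects, objects, and rights follow the toy matrix above and are not drawn from any real system.

# Column-wise storage: each object keeps an access control list (ACL).
acl = {
    "obj1": {"Bob": {"R", "W"}},
    "obj2": {"Alice": {"R", "W"}, "Bob": {"R"}},
}

# Row-wise storage: each subject holds a capability list.
capabilities = {
    "Alice": {"obj2": {"R", "W"}},
    "Bob":   {"obj1": {"R", "W"}, "obj2": {"R"}},
}

def allowed_acl(subject, op, obj):
    return op in acl.get(obj, {}).get(subject, set())

def allowed_cap(subject, op, obj):
    return op in capabilities.get(subject, {}).get(obj, set())

print(allowed_acl("Alice", "W", "obj2"), allowed_cap("Bob", "W", "obj2"))  # True False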
Android permissions http://source.android.com/tech/security/
Android permissions • A permission is a named object. • Declared by an app (or the system). • Apps request the permissions they want/require/use. • The system grants requested permissions according to policy at app install time. After that, the permissions don’t change. • Permissions protect interactions among app components (e.g., intents, binder RPC). • Each component can state a permission that its counterparty must hold.
Granting permissions • A permission is bound to the provider key that signed the declaring app (or the system). • The declaring app (or system) associates a protection level with the permission. • The protection level drives system policy to grant permissions. • normal: granted on request • dangerous: requires user approval • signature: granted only to requesting apps from the same provider • system: granted only to apps installed on the system image
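The grant policy above can be paraphrased as a small decision function. This is not Android source code, only a hedged Python restatement of the four protection levels; the function and parameter names are invented for illustration.

def grant(level, user_approved=False, same_signer=False, on_system_image=False):
    """Decide at install time whether a requested permission is granted."""
    if level == "normal":
        return True                  # granted on request
    if level == "dangerous":
        return user_approved         # requires explicit user approval
    if level == "signature":
        return same_signer           # requester is signed with the declarer's key
    if level == "system":
        return on_system_image       # only apps installed on the system image
    return False

print(grant("signature", same_signer=True))  # True
print(grant("dangerous"))                    # False until the user approves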
Authentication and integrity This is a picture of a $2.5B move in the value of Emulex Corporation, in response to a fraudulent press release by short-sellers through InternetWire in 2000. The release was widely disseminated by news media as a statement from Emulex management, but media failed to authenticate it. EMLX [reproduced from clearstation.com]
Crypto primitives • Encrypt/Decrypt: use a shared secret key (symmetric) or a keypair, one public and one private (asymmetric). • Signing. • Secure hashing: useful for fingerprinting data.
Cryptography for Busy People • Standard crypto functions parameterized by keys. • Keys are fixed-width “random” values (length matters, e.g., 256-bit). • Symmetric (DES: fast, requires shared key K1 = K2). • Asymmetric (RSA: slow, uses two keys). • “Believed to be computationally infeasible” to break. [Diagram: M → Encrypt with K1 → E → Decrypt with K2 → M; image: Landon Cox]
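A hedged example of symmetric encrypt/decrypt with a shared key. It assumes the third-party Python `cryptography` package is installed; its Fernet recipe uses AES rather than DES, but the shared-key shape matches the diagram.

from cryptography.fernet import Fernet  # assumes: pip install cryptography

key = Fernet.generate_key()             # the shared secret, K1 = K2
f = Fernet(key)

ciphertext = f.encrypt(b"attack at dawn")
plaintext = f.decrypt(ciphertext)       # only holders of `key` can do this
assert plaintext == b"attack at dawn"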
Asymmetric crypto works both ways • Encrypt with A’s private key and decrypt with A’s public key, or encrypt with A’s public key and decrypt with A’s private key. [Diagram: Landon Cox]
Cryptographic hashes • Also called a secure hash or one-way hash • E.g., SHA1, MD5 • Result called a hash, checksum, fingerprint, digest • Very efficient [Diagram: arbitrarily large input → SHA1 hash → 160-bit “hash digest”; image: Landon Cox]
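Fingerprinting with a secure hash, using Python’s standard hashlib. SHA1 is shown to match the slide’s 160-bit figure, though it is no longer considered collision-resistant; SHA-256 is the usual choice today.

import hashlib

data = b"arbitrarily large input..."
print(hashlib.sha1(data).hexdigest())     # 160-bit digest (40 hex characters)
print(hashlib.sha256(data).hexdigest())   # 256-bit digest

# Any change to the input changes the fingerprint.
print(hashlib.sha1(data + b"!").hexdigest())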
Two Flavors of “Signature” • A digest encrypted with a private asymmetric key is called a digital signature • “Proves” that a particular identity sent the message. • “Proves” the message has not been tampered with. • “Unforgeable” • The sender cannot deny sending the message. • “non-repudiable” • Can be legally binding in the United States • A digest encrypted with a shared symmetric key is called a message authentication code (MAC). • faster, but…
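A message authentication code with a shared symmetric key, using Python’s standard hmac module; the key and message below are made up. A digital signature would instead use a private asymmetric key (e.g., RSA) so that verifiers need only the public key.

import hashlib
import hmac

shared_key = b"k3y-shared-out-of-band"
message = b"Fred owes Barney $10"

tag = hmac.new(shared_key, message, hashlib.sha256).digest()

# The receiver, holding the same key, recomputes the MAC and compares in constant time.
ok = hmac.compare_digest(tag, hmac.new(shared_key, message, hashlib.sha256).digest())
print(ok)  # True; tampering with `message` or `tag` makes this False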