Lecture 4 – Introduction to Cryptography 2 Rice ELEC 528/COMP 538 Farinaz Koushanfar Spring 2009
Outline • Public Key Encryption (PKE) • Motivation, characteristics of PKE, RSA (Rivest-Shamir-Adleman) Encryption • The Uses of Encryption • Cryptographic Hash Functions • Key Exchange • Digital Signatures • Certificates Slides are courtesy of Leszek T. Lilien from WMich http://www.cs.wmich.edu/~llilien/
Motivation for PKE (1) • So far – cryptosystems with secret keys • Problems: • A lot of keys • O(n²) keys for n users (n · (n-1)/2 keys) — if each user must be able to communicate with every other • Distributing so many keys securely • Secure storage for the keys • User with n keys can't just memorize them • Can we have a system with significantly fewer keys? • Yes!
Motivation for PKE (2) • 1976 — Diffie and Hellman — new kind of cryptosystem: • public key cryptosystem = asymmetric cryptosystem • Key pairs: <kPRIVATE, kPUBLIC> • Each user owns one private key • Each user shares the corresponding public key with the n-1 remaining users => n users share each public key • Only 2n keys for n users: 2n = n · (1 + n · 1/n) • Since a public key is shared by n people: 1 "owner" + (n-1) others = n • 1/n since each party "owns" 1/n of the public key • Even if each communicates with each • Reduction from O(n²) to O(n)! • The n key pairs are: • <kPRIV-1, kPUB-1>, <kPRIV-2, kPUB-2>, ..., <kPRIV-n, kPUB-n>
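A quick sanity check of the key counts above (a minimal Python sketch; the helper names are hypothetical, the numbers follow directly from the formulas on this slide):

```python
# Compare key counts for secret-key vs. public-key systems.
def symmetric_key_count(n: int) -> int:
    # Every pair of users needs its own secret key: n choose 2.
    return n * (n - 1) // 2

def public_key_count(n: int) -> int:
    # Each user owns one private key and publishes one public key.
    return 2 * n

for n in (10, 100, 1000):
    print(n, symmetric_key_count(n), public_key_count(n))
# For n = 1000: 499,500 secret keys vs. only 2,000 keys with PKE.
```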
Characteristics of PKE (1) • PKE requirements • It must be computationally easy to encipher or decipher a message given the appropriate key • It must be computationally infeasible to derive kPRIV from kPUB • It must be computationally infeasible to determine kPRIV from a chosen plaintext attack [cf. Barbara Endicott-Popovsky, U. Washington]
Characteristics of PKE (2) • Key pair characteristics • One key is the inverse of the other key of the pair • i.e., it can undo encryption provided by the other: • D(kPRIV, E(kPUB, P)) = P • D(kPUB, E(kPRIV, P)) = P • One of the keys can be public since each key performs only half of the E "+" D process • As shown above – need both E and D to get P back
Characteristics of PKE (3) • Two E/D possibilities for key pair <kPRIV, kPUB> • P = D(kPRIV, E(kPUB, P)) • User encrypts msg with kPUB (kPUB "locks") • Recipient decrypts msg with kPRIV (kPRIV "unlocks") • OR • P = D(kPUB, E(kPRIV, P)) (e.g., in RSA) • User encrypts msg with kPRIV (kPRIV "locks") • Recipient decrypts msg with key kPUB (kPUB "unlocks") • Do we still need symmetric encryption (SE) systems? • Yes, PKEs are 10,000+ times (!) slower than SEs • PKEs use exponentiation – involves multiplication and division • SEs use only bit operations (add, XOR, substitute, shift) – much faster
RSA Encryption (1) • RSA = Rivest, Shamir, and Adleman (MIT), 1978 • Underlying hard problem: • Number theory – determining prime factors of a given (large) number (e.g., factoring of small numbers: 5 → 5, 6 → 2 · 3) • Arithmetic modulo n • How secure is RSA? • So far remains secure (after all these years...) • Will somebody propose a quick algorithm to factor large numbers? • Will quantum computing break it? TBD
RSA Encryption (2) • In RSA: • P = E(D(P)) = D(E(P)) (order of D/E does not matter) • More precisely: P = E(kE, D(kD, P)) = D(kD, E(kE, P)) • Encryption: C = P^e mod n, where KE = e • Given C, it is very difficult to find P without knowing KD • Decryption: P = C^d mod n, where KD = d
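A minimal toy sketch of these formulas in Python (tiny primes chosen only for illustration; real RSA uses keys of 2048+ bits and proper padding):

```python
# Toy RSA with tiny primes -- for illustration only, not secure.
p, q = 61, 53
n = p * q                      # modulus n = 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent: d = e^-1 mod phi (2753)

P = 65                         # plaintext encoded as a number < n
C = pow(P, e, n)               # encryption: C = P^e mod n
assert pow(C, d, n) == P       # decryption: P = C^d mod n

# Order of the operations does not matter (E and D commute):
assert pow(pow(P, d, n), e, n) == P
print("C =", C, "-> decrypts back to", pow(C, d, n))
```

The `pow(e, -1, phi)` form needs Python 3.8+; the commutativity asserted at the end is what later allows the same key pair to be used both for encryption and for signatures.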
The Uses of Encryption • PKE is much slower than SE (symmetric E) • PKEs only for specialized, infrequent tasks • SEs – a real workhorse • Four applications of encryption (& outline) • Cryptographic Hash Functions (subsec. 2H.1) • Key Exchange (subsec. 2H.2) • Digital Signatures (subsec. 2H.3) • Certificates (subsec. 2H.4)
Cryptographic Hash Functions (1) • Integrity: • How can you be sure that a received msg/doc was not modified by an attacker or malfunction? • Answer: use cryptography to ensure integrity • Idea: • Wax seals on letters in the Middle Ages • — easy to see if broken • Cryptographic "seal" on doc/msg • — so that any change to it will be readily detected
Cryptographic Hash Fcns (2) • A technique: • compute a hash fcn / checksum / msg digest • More formally: • Problem: How to send an n-bit msg so that R can easily verify that it is intact • Solution: Send a msg of n+k bits • n bits — original msg • k bits — checksum = msg digest • Generated based on the n bits
Cryptographic Hash Fcns (3) Simple Parity for Error Detection (1) • Simple (non-cryptographic) technique: parity • Add a single parity bit to detect if a message is correct • Example 1: odd parity • Force the block of data to have an odd # of 1's • Data = 1011 — n = 4 • Sent block = 10110 — n+k = 4+1 — looked at '1011', added 0 to have an odd # of 1's • Data = 0110 • Sent block = 01101 — looked at '0110', added 1 to have an odd # of 1's • Example 2: ASCII parity bit • ASCII has 7 bits for data, the 8th bit is a single parity bit • Either odd or even parity used [cf. A. Striegel, U. Notre Dame]
Cryptographic Hash Fcns (4) Simple Parity for Error Detection (2) • How does parity enhance msg integrity? • Can detect an error in 1 bit (or in an odd # of bits) • e.g., if R gets 01001, R knows it's wrong (S sent 01101) • Cannot detect an error in 2 bits (or in an even # of bits) • Because parity stays OK -> undetectable integrity violation • e.g., if R gets 01011, R does not detect the error (S sent 01101, but the parity is still odd) • Cannot repair errors either • E.g., R doesn't know which bit in 01001 is wrong [cf. A. Striegel, U. Notre Dame]
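A minimal sketch of the odd-parity scheme above (hypothetical helper names; it reproduces the 1011 → 10110 example and shows why a two-bit error slips through):

```python
# Odd parity: append one bit so the total number of 1's is odd.
def add_odd_parity(data: str) -> str:
    ones = data.count("1")
    return data + ("0" if ones % 2 == 1 else "1")

def parity_ok(block: str) -> bool:
    return block.count("1") % 2 == 1

assert add_odd_parity("1011") == "10110"
assert add_odd_parity("0110") == "01101"

print(parity_ok("01001"))   # False: a single-bit error is detected
print(parity_ok("01011"))   # True: a two-bit error goes unnoticed
```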
Cryptographic Hash Fcns (5) Better Checksums against Errors & Attacks • There are better checksums than simple odd/even parity • Can detect multiple errors • Can even repair multiple errors • These checksums are designed to cope with errors, not with attacks • Against attacks we need cryptographic checksums / strong hash functions
Cryptographic Hash Fcns (6) Strong Hash Function • Formal definition: a strong hash function (cryptographic checksum) is h: A -> B such that: • (1) For any x ∈ A, h(x) is easy to compute • (2) For any y ∈ B, it is computationally infeasible to find an inverse of y, i.e., x ∈ A such that h(x) = y • (3) It is computationally infeasible to find a pair of colliding input values, i.e., x, x' ∈ A such that x ≠ x' and h(x) = h(x') • Alternate (stronger) form for (3): Given any x ∈ A, it is computationally infeasible to find x' ∈ A such that x ≠ x' and h(x) = h(x') • Due to (1) and (2), a hash fcn is a one-way function [cf. A. Striegel, U. Notre Dame, Barbara Endicott-Popovsky, U. Washington]
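A short illustration with Python's standard hashlib (SHA-256 is used here only as a convenient example of a strong hash; the numbered properties are the ones defined above):

```python
import hashlib

# (1) Easy to compute for any input.
msg = b"transfer $100 to Alice"
print(hashlib.sha256(msg).hexdigest())

# A tiny change in the input produces a completely different digest,
# which is why finding a colliding x' (property 3) is infeasible in practice.
msg2 = b"transfer $900 to Alice"
print(hashlib.sha256(msg2).hexdigest())

# (2) One-way: given only the digest, recovering msg is computationally
# infeasible -- there is no "decrypt" operation for a hash.
```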
Cryptographic Hash Fcns (7) Collisions & Attacks on Msg Integrity (1) • Note: • n bits of msg (x) mapped into k bits of its checksum (y) • k < n => collisions must exist • But it is computationally infeasible to find collisions for good hash fcns • Goal of a successful attack on msg integrity: • Change msg1 in such a way that the checksum remains unchanged (so R doesn't detect the forgery) • I.e., find msg2 that collides with the original msg1 w.r.t. checksum value • Finding msg2 is computationally infeasible (for a good hash) => forging msg1 undetectably is computationally infeasible [cf. A. Striegel, U. Notre Dame]
Cryptographic Hash Fcns (8) Collisions & Attacks on Msg Integrity (2) • Pigeonhole principle • n containers for n+1 objects (n pigeonholes for n+1 pigeons) => at least 1 container will hold two objects • Example: • n = length(msg) = 5, k = length(hash) = 3 • 2^5 = 32 possible msgs vs. 2^3 = 8 possible hash values => at least 4 (= 32/8) different msgs hash into the same value (collisions!) • Real msgs and hash values are much longer than 5 or 3 bits! We know that collisions exist but: => much tougher to find collisions => much tougher to forge them [cf. A. Striegel, U. Notre Dame]
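The pigeonhole effect is easy to see if we deliberately truncate a hash to a few bits (a toy sketch with hypothetical helper names; real digests are 128+ bits, so brute-forcing a collision like this is infeasible):

```python
import hashlib
from itertools import count

def tiny_hash(msg: bytes) -> int:
    # Keep only the first byte (8 bits) of SHA-256 -- 256 possible values.
    return hashlib.sha256(msg).digest()[0]

target = tiny_hash(b"message-0")
# With only 2^8 pigeonholes, a collision appears after a few hundred tries.
for i in count(1):
    candidate = f"message-{i}".encode()
    if tiny_hash(candidate) == target:
        print(f"collision after {i} tries:", candidate)
        break
```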
Cryptographic Hash Fcns (9) File Checksum • File checksum • Calculated as a fcn defined over all bits of the file • Result encrypted and stored with the file • Each time the file is used by legitimate users, the checksum is recalculated, encrypted, and stored with the file • File sent to R • When the file is received by R: • R decrypts checksum c1 received with the file • R independently calculates file checksum c2 • If c1 = c2 => file integrity is OK • Otherwise – file integrity violated
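A minimal sketch of the receive-side check (the file name in the usage comment is hypothetical; in the scheme above c1 would additionally be encrypted in transit and decrypted by R before the comparison):

```python
import hashlib

def file_checksum(path: str) -> str:
    # Hash every bit of the file so that any change alters the checksum.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str, c1: str) -> bool:
    c2 = file_checksum(path)     # R recalculates independently
    return c1 == c2              # equal => file integrity is OK

# Usage (hypothetical): verify("report.pdf", received_checksum)
```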
Cryptographic Hash Fcns (10) Keyed vs. Keyless Crypto Checksum (1) • Keyed crypto checksum • Key needed to compute checksum • Keyed hash fcns can be built from block ciphers, e.g., DES, AES • Use the cipher in chaining mode: link the next msg block to the value of the previous msg block • Example chaining: E(current block) XOR E(previous block) => connects a block to all previous blocks • If a file is sent, the file's checksum could be the last block • If chaining is used, the file checksum (= last block) depends on all previous blocks => depends on all bits of the file
Cryptographic Hash Fcns (11) Keyed vs. Keyless Crypto Checksum (2) • Keyed crypto checksum – CONT. • Used for integrity + authentication • Integrity: checksum makes msg modification difficult • Authentication: only S and R know the symmetric key => if msg integrity is OK, R knows the msg must have been sent by S
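A short sketch of a keyed checksum using HMAC from Python's standard library (HMAC is one modern keyed hash; the DES/AES chaining construction described on the previous slide is another way to achieve the same effect):

```python
import hmac
import hashlib

key = b"shared-secret-between-S-and-R"   # known only to S and R
msg = b"pay Bob 100 dollars"

# S computes the keyed checksum and sends it along with the message.
tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

# R recomputes the tag with the shared key; a match gives both integrity
# and authentication (only S or R could have produced it).
expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
print(hmac.compare_digest(tag, expected))   # True
```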
Cryptographic Hash Fcns (12) Keyed vs. Keyless Crypto Checksum (3) • Keyless crypto checksum • No key required to compute the checksum • Keyless hash functions • MD5/MD4: any msg → 128-bit digest (hash, checksum) • SHA/SHS: any msg → 160-bit digest • Other: MD2, HAVAL, Snefru, ... • Used for integrity (not authentication) • Integrity: checksum makes msg modification difficult (with a truly public key anybody can send the msg, but nobody but S can easily modify this msg) • No authentication: n (or all) people know the public key – R can't prove which one of them sent a given msg [cf. A. Striegel, U. Notre Dame, Barbara Endicott-Popovsky, U. Washington]
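A quick check of the digest sizes mentioned above, using Python's hashlib (MD5 and SHA-1 are shown only to confirm the sizes; both are considered broken today and should not be used in new designs):

```python
import hashlib

msg = b"any message, of any length"
print(len(hashlib.md5(msg).digest()) * 8)     # 128-bit digest (MD5)
print(len(hashlib.sha1(msg).digest()) * 8)    # 160-bit digest (SHA-1 / SHS)
print(len(hashlib.sha256(msg).digest()) * 8)  # 256 bits -- a modern choice
```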
Key Exchange (1) • Motivation: • X and Y don't know each other • X needs to send a protected msg to Y • E.g., shopping on a web site • Can do it if they can securely exchange KE • This is the problem of key exchange • Important • Hard • Circular (chicken-'n-egg) problem? • "To establish a secure session we need a secure channel" • The circle can be broken – by public key cryptography • Can send a public key even on an insecure channel
Key Exchange (2) Deriving Symmetric Key via PKE (1) • Given • S and R, with key pairs <kPRIV-S, kPUB-S> and <kPRIV-R, kPUB-R> • Solution 1: • S determines secret key K • S encrypts K with kPRIV-S: C = E(kPRIV-S, K) • S sends C to R • R decrypts C to get K: D(kPUB-S, C) = K • S & R communicate using secret (symmetric) key K • BUT: Solution 1 is not good!!! • Question: Why?
Key Exchange (3) Deriving Symmetric Key via PKE (2) • Given • S and R, with key pairs <kPRIV-S, kPUB-S> and <kPRIV-R, kPUB-R> • Solution 1: • S determines secret key K • S encrypts K with kPRIV-S: C = E(kPRIV-S, K) • S sends C to R • R decrypts C to get K: D(kPUB-S, C) = K • S & R communicate using secret (symmetric) key K • BUT: Solution 1 is not good!!! • Answer: • An attacker who has kPUB-S can also perform the decryption! • The easier the more people know kPUB-S • Trivial if kPUB-S is truly public
Key Exchange (4) Deriving Symmetric Key via PKE (3) • Solution 2: • S determines secret key K • S encrypts K with kPUB-R: C = E(kPUB-R, K) • S sends C to R • R decrypts C to get K: D(kPRIV-R, C) = K • S & R communicate using secret (symmetric) key K • Solution 2 is better • Only R can decode K (only R knows kPRIV-R) • ...but Solution 2 still is not quite good • Question: Why? • Hint: what about msg authentication?
Key Exchange (5) Deriving Symmetric Key via PKE (4) • Solution 2: • S determines secret key K • S encrypts K with kPUB-R: C = E(kPUB-R, K) • S sends C to R • R decrypts C to get K: D(kPRIV-R, C) = K • S & R communicate using secret (symmetric) key K • Solution 2 is better • Only R can decode K (only R knows kPRIV-R) • ...but Solution 2 still is not quite good • Answer: • No msg authentication • (R has no assurance that the msg was sent by S – anybody could have encoded K with kPUB-R)
Key Exchange (6) Deriving Symmetric Key via PKE (5) • Solution 3: • S determines secret key K • S encrypts K with both kPRIV-S & kPUB-R: • C = E(kPUB-R, E(kPRIV-S, K)) • S sends C to R • R decrypts C to get K: • D(kPUB-S, D(kPRIV-R, C)) • -- the order is important! Make sure you see this • (see Fig. 2-11 p.78) • Solution 3 is good! • Only R can decode K (only R knows kPRIV-R) • Authentication: R is assured that S sent C • Only S could have encoded K with kPRIV-S
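A toy sketch of Solution 3 using small RSA key pairs (tiny numbers, illustration only; the inner step uses S's private key, the outer step uses R's public key, and R undoes them in the opposite order):

```python
# Toy RSA key pairs for S (sender) and R (recipient) -- illustration only.
pS, qS, eS = 61, 53, 17
nS = pS * qS                      # 3233
dS = pow(eS, -1, (pS - 1) * (qS - 1))

pR, qR, eR = 89, 97, 5
nR = pR * qR                      # 8633 (chosen > nS so the nesting works)
dR = pow(eR, -1, (pR - 1) * (qR - 1))

K = 1234                          # the symmetric session key S wants to share

# S: inner step applies S's private key, outer step encrypts for R.
inner = pow(K, dS, nS)            # E(kPRIV-S, K)
C = pow(inner, eR, nR)            # E(kPUB-R, ...)

# R: undo in the opposite order -- first R's private key, then S's public key.
step1 = pow(C, dR, nR)            # D(kPRIV-R, C)
recovered = pow(step1, eS, nS)    # D(kPUB-S, ...) -- also verifies S sent it

assert recovered == K
print("shared key recovered:", recovered)
```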
Digital Signatures (1) • Outline: • a. Problem Definition • b. Properties of Electronic Signatures • c. Using PKE for Digital Signatures • d. Using Hash Fcns for Digital Signatures
Digital Signatures (2) Problem Definition (1) • Motivation: • Need to sign and transmit electronic doc's or msgs, incl. checks • Analogous to signing & transmitting "paper" letters, doc's, etc., incl. checks • Roles of signatures (for both paper & electronic) • Proves unforgeability of doc/letter/check • Authenticates person S who signed the doc/letter/check • Provides non-repudiation: S cannot say somebody else signed it • Facilitates proving integrity (e.g., 2 signed legal copies for 2 parties) • Note: a signature might not identify the signing person • if not legible
Digital Signatures (3) Problem Definition (2) • Security requirements for digital signatures: • Signature will not reveal signer’s private key • Only owner of private key can produce a valid signature • Verification of a correct signature succeeds • Modification of a signed message can be detected [cf. J. Leiwo]
Digital Signatures (4) Properties of Electronic Signatures (1) • [Figure: message M with attached signature Sg(S, M)] • M – msg / Sg(S, M) – signature of S on M • Note: M = C or M = P • M = P – if authentication but no secrecy needed • Required properties for electronic signatures: • Unforgeable: • Only S can produce the pair [M, Sg(S, M)] • Authenticable (can verify authenticity) / non-repudiable: • R can verify that Sg(S, M) in [M, Sg(S, M)] comes from S • Only S could have produced M "+" Sg(S, M) • Sg(S, M) is firmly attached to M
Digital Signatures (5) Properties of Electronic Signatures (2) • [Figure: message M with attached signature Sg(S, M)] • Desirable properties for electr. signatures: • Not alterable (assures "integrity"): • Once sent, M "+" Sg(S, M) cannot be undetectably altered by S, R, or an interceptor • [I'd rather consider this a part of "unforgeability" above] • Not reusable: • If M is received again, R detects that M is "old" • E.g., can't deposit a copy of a check to "double-deposit" • Digital signature is a protocol that mimics the effect of a signature on paper
Digital Signatures (6) Using PKE for Digital Signatures (1) • Transmitting signed msgs with PKE • Original message: P • Privacy transformation: C = E(P, KPUB-R) • Only R can decrypt it (with KPRIV-R) • Authenticity transformation = signing: • Sg = Sg(S, C) = D(C, KPRIV-S) • Only S can produce Sg(S, C) (with KPRIV-S) • Sent message: [C | Sg] • Note: Remember that for some PKE algorithms (incl. RSA): D(E(M, _), _) = E(D(M, _), _) = M (commutativity of E-D)
Digital Signatures (7) • Using PKE for Digital Signatures (2) • Transmitting signed msgs with PKE – cont. • Received msg: [C | Sg] • [C = E(P, KPUB-R)] • [Sg = Sg(S, C) = D(C, KPRIV-S)] • R verifies Sg with S's public key KPUB-S: • If E(Sg, KPUB-S) = C, then the signature is valid • because E(Sg, KPUB-S) = E(D(C, KPRIV-S), KPUB-S) = C • R decodes C with R's private key KPRIV-R: • P = D(C, KPRIV-R)
Digital Signatures (8) • Using PKE for Digital Signatures (3) • Properties: • [C = E(P, KPUB-R)] • [Sg = Sg(S, C) = D(C, KPRIV-S)] • Unforgeability: • If C is forged, it will not "correspond" to Sg (i.e., E(Sg, KPUB-S) ≠ C) • Authenticity: • If Sg is valid, S is authenticated (only S can produce a valid signature of S) • Non-repudiation (undeniability): • If Sg is valid, only S could have produced it and have sent C "+" Sg
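A toy sketch of this sign-then-verify flow with small RSA key pairs (tiny primes, illustration only; real signature schemes use large keys, hashing, and padding such as PKCS#1):

```python
# Toy RSA key pairs -- illustration only, not secure.
pS, qS, eS = 89, 97, 5            # sender S
nS = pS * qS                      # 8633
dS = pow(eS, -1, (pS - 1) * (qS - 1))

pR, qR, eR = 61, 53, 17           # recipient R (nR < nS so C fits under nS)
nR = pR * qR                      # 3233
dR = pow(eR, -1, (pR - 1) * (qR - 1))

P = 42                            # plaintext encoded as a number < nR
C = pow(P, eR, nR)                # privacy transformation: C = E(P, KPUB-R)
Sg = pow(C, dS, nS)               # signing: Sg = D(C, KPRIV-S)

# R verifies Sg with S's public key, then decrypts C with its own private key.
assert pow(Sg, eS, nS) == C       # E(Sg, KPUB-S) = C  => signature is valid
assert pow(C, dR, nR) == P        # P = D(C, KPRIV-R)
print("signature valid, recovered P =", pow(C, dR, nR))
```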
Digital Signatures (9) • Using Hash Fcns for Digital Signatures • Using hash fcn H in digital signatures • — signature computed over H(m), not over m • length(H(m)) << length(m) • Before: send [m | Sg(S, m)] • Now: send [m | Sg(S, H(m))] • Notation: m = P or m = C, s = Sg, DA(x) = D(x, KPRIV-A), EA(x) = E(x, KPUB-A) • Note: Any alteration of m is detected by B's "Verify" step even if m is not encoded with KPUB-B — due to use of H(m) [Fig — cf. J. Leiwo]
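A toy hash-then-sign sketch (illustration only: the SHA-256 digest is reduced mod the tiny toy modulus just to keep the numbers small, which is not secure; real schemes sign the full digest with proper padding):

```python
import hashlib

# Toy RSA key pair for signer A -- illustration only.
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))

def sign(m: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(m).digest(), "big") % n  # H(m), shrunk to fit the toy modulus
    return pow(h, d, n)                                        # Sg(A, H(m))

def verify(m: bytes, sg: int) -> bool:
    h = int.from_bytes(hashlib.sha256(m).digest(), "big") % n
    return pow(sg, e, n) == h

m = b"a long message of arbitrary length"
sg = sign(m)
print(verify(m, sg))                      # True
print(verify(m + b" (altered)", sg))      # False: any alteration of m is detected
```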
Certificates (1) • Outline • a. Introduction • b. Trust Through a Common Respected Individual • c. Certificates for Identity Authentication • d. Trust Without a Single Hierarchy
Certificates (2) • Introduction (1) • Need for trust in human interactions • Trust w.r.t.: • Individuals • Institutions (e.g., bank, hospital, car dealer) • Artifacts (e.g., car, Internet browser, software house) • Trust in small village vs. big city • Small village: implicit trust • Everybody knows everybody • Mr. X „feels” how much to trust Ms. Y • Big city: need to consider trust explicitly • Ask around to find trusted entities • Inquire friends, office mates, etc. about good car dealer, dentist, etc. • Check „reputation databases” • E.g., BBB=Better Business Bureau
Certificates (3) • Introduction (2) • Selected trust characteristics • Trust comes in degrees of trust • Vs. binary trust (with a single trust threshold) • Ubiquity of trust in social and artificial systems • Many users/computer systems err by trusting blindly (trust without evidence or verification!) • E.g., OS trusts all application pgms – any allowed to run • E.g., users trust unknown web sites with personal data
Certificates (4) • Introduction (3) • Basic means of building trust toward person / institution / artifact X • Familiarity with X • Person: face, voice, handwriting, etc. • Institution: company name, image, good will, etc. • Artifact: manufacturer name, perceived quality, etc. • First-hand experience with X's activities/performance • Good or bad experience (trust or distrust grows) • Reputation of X determined by evidence / credentials • Reputation = second-hand knowledge of X's actions/perf. • Reputation databases (e.g., BBB, industry organizations, etc.) with "good" evidence or lack of "bad" evidence • Credentials: X's driver license, library card, credit card • Affiliation of X with person/institution/artifact Y • Trust/distrust toward Y rubs off on X
Certificates (5) • Introduction (4) • Basic means of verifying trust toward person / institution / artifact X • "Dovyeryay noh provyeryay" ("Trust but verify", a Russian proverb) • — Ronald Reagan (at the start of historic negotiations with Gorbachev) • Verify one's own experience • Check own notes about X's activities/performance • Verify reputation evidence / credentials • Call back to verify a phone number • Check user feedback about the quality of an artifact (online) • Check a reputation DB (e.g., consumer reports, BBB) for data • Verify affiliation • Check with the employer if X is still employed • Check the reputation of Y with which X is affiliated
Certificates (6) • Introduction (5) • Often trust is based on the appearance of authenticity, without careful verification • E.g., business order from Company A sent to Company B • Order sent w/o careful verification of A by B • Why? • Verification is expensive • Trust prevails in business • Risk of fraud or swindle is low • B might be "insured" against being cheated • A trusted third-party intermediary assumes the transaction risk • E.g., buyer's bank guarantees a transaction payment • Appearance of authenticity can be exploited by a fraudster
Certificates (7) • Introduction (6) • Need similarly common and efficient/effective trust mechanisms in cyberspace • Need somebody or something to: • assume risks • OR • vouch for the other party • A trusted third party is a basis for trust • When two interacting parties do not trust each other sufficiently
Trust Through Common Trusted Individual (1) • Hierarchical structure of organizations • CEO / Divisions / Departments / Groups / Projects • CEO doesn't know engineers directly • Still, CEO controls all via intermediate managers • => hierarchy as a basis for trust in an organization • Example • Ann meets Andy • Andy claims he works for the same company • Ann can verify via a common trusted individual / trusted third party (TTP) • via Bill and Betty, if Bill knows/trusts Betty • via Bill and Camilla, otherwise • [Diagram: Camilla at the top; Betty and Bill below her; Ann reports to Bill, Andy to Betty]
Trust Through Common Trusted Individual (2) • Analogous approach for crypto key exchange • Example • Ann and Andy want to communicate • Ann gives KPUB-Ann to Bill • Bill passes KPUB-Ann to Camilla (or to Betty if he trusts her) • Camilla passes KPUB-Ann to Betty • Betty passes KPUB-Ann to Andy • Camilla is the TTP (trusted third party) • [Diagram: KPUB-Ann passed up from Ann through Bill and Camilla, then down through Betty to Andy]
Trust Through Common Trusted Individual (3) • In reality need to pass more than just KPUB-Ann • Every sender attaches evidence of identity • Ann: Statement of Identity (SoI) • Bill, Camilla, Betty: Transmittal of Identity (ToI) • Andy receives KPUB-Ann with: • Ann's proof of identity • Proof of identity for all intermediaries • Proof that each intermediary received KPUB-Ann from a trusted sender • E.g., Betty sends KPUB-Ann with the stmt: • "I am Betty and I received this key, SoI, and 2 ToIs from a person I know to be Camilla" • [Diagram: Ann sends KPUB-Ann + SoI to Bill; Bill forwards KPUB-Ann + SoI + ToI to Camilla; Camilla forwards KPUB-Ann + SoI + 2 ToIs to Betty; Betty delivers KPUB-Ann + SoI + 3 ToIs to Andy]
Trust Through Common Trusted Individual (4) • In reality need to pass more than just KPUB-Ann – CONT. • Andy can verify the chain of evidence (SoI + ToIs) • This assures Andy that the key was sent by Ann and not forged • Public key authentication (delivered by trusted people) • Binding of the key to Ann • Trustworthy identification of Ann as the sender of this key • [Diagram: same chain as before, Ann → Bill → Camilla → Betty → Andy, accumulating ToIs]
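A minimal sketch of how Andy might check such a chain of signed transmittals (hypothetical toy RSA keys and helper names; each ToI is a statement signed by its sender, and Andy verifies every link with that sender's public key):

```python
import hashlib

def toy_keypair(p, q, e):
    n = p * q
    d = pow(e, -1, (p - 1) * (q - 1))
    return (e, n), (d, n)          # (public key, private key)

def sign(stmt: str, priv) -> int:
    d, n = priv
    h = int.from_bytes(hashlib.sha256(stmt.encode()).digest(), "big") % n
    return pow(h, d, n)

def verify(stmt: str, sig: int, pub) -> bool:
    e, n = pub
    h = int.from_bytes(hashlib.sha256(stmt.encode()).digest(), "big") % n
    return pow(sig, e, n) == h

# Hypothetical toy keys for the intermediaries (illustration only).
bill_pub, bill_priv = toy_keypair(61, 53, 17)
camilla_pub, camilla_priv = toy_keypair(89, 97, 5)

soi = "I am Ann; my public key is KPUB-Ann"
toi1 = "I, Bill, received this key and SoI from a person I know to be Ann"
toi2 = "I, Camilla, received this key, SoI and 1 ToI from a person I know to be Bill"

chain = [(toi1, sign(toi1, bill_priv), bill_pub),
         (toi2, sign(toi2, camilla_priv), camilla_pub)]

# Andy walks the chain: every transmittal must verify under its sender's key.
print(all(verify(stmt, sig, pub) for stmt, sig, pub in chain))   # True
```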
Trust Through Common Trusted Individual (5) • Works pretty well within an org • There's always somebody common & trusted for any 2 employees (at the top or below) • Problems: • 1) If Bill, Camilla, or Betty is out of town, Ann & Andy have to wait for the key exchange • 2) The person at the top works too hard to exchange all keys quickly
Trust Through Common Trusted Individual (6) • Protocol Solving Problem 1 (TTP absence): • Idea: preauthenticated public key for (single) future use • Ann asks Bill for the complete chain from the top down to her • Bill provides the chain: <Camilla, Bill, Ann> • Ann requests ToIs for her SoI ahead of time • Ann receives from Bill 2 ToIs: • ToI#637: "I, Bill, gave this ToI to Ann to confirm her identity for SoI#27" + Bill's signature • ToI#5492: "I, Camilla, gave this ToI to Bill to confirm his identity for ToI#637" + Camilla's signature • Ann can use SoI+ToIs any time • Think about the full scenario. Hint: Andy prepares his SoI+ToIs ahead of time