
Secure Biometric Authentication for Weak Computational Devices


Presentation Transcript


  1. Secure Biometric Authentication for Weak Computational Devices Mikhail Atallah (Purdue), Keith Frikken (Purdue), Michael Goodrich (UC-Irvine), Roberto Tamassia (Brown) March 3, 2005

  2. Introduction • Biometric Authentication • Pros: Provides a simple authentication mechanism • Cons: Difficult to change if compromised, and raises privacy concerns • Difficulties: • Readings vary with each measurement • Standard techniques such as hashing won't work

  3. Related Work • Many schemes • [Chaum and Pedersen, 1993] • [Davida et al., 1998] • [Bleumer, 1998] • [Davida and Frankel, 1999] • [Juels and Wattenberg, 1999] • [Davida et al., 1999] • [Juels and Sudan, 2002] • [Clancy et al., 2003] • [Impagliazzo and More, 2003] • [Kerschbaum et al., 2004] • [Dodis, 2004]

  4. Our Goals • Lightweight authentication scheme • Nothing more than hash functions • Smartcard-based • No single point of failure • Not the smartcard • Not the server • Server compromise should not lead to the ability to impersonate the user (even to the server) • Goal is to have a biometric PIN for banking systems

  5. Framework • Reader: Can be on the card or another device; this is what the user uses to capture the biometric • Server: Stores information about clients • Comparison Unit: Compares the client's information with the server's data and grants access • Two biometrics are "close" if their Hamming distance is below some threshold (we generalize this to other distances)
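For concreteness, a minimal sketch of this closeness test, assuming readings are encoded as fixed-length bit lists and an illustrative threshold (neither encoding nor threshold is specified on the slides):

```python
# Minimal sketch (not from the slides): readings as fixed-length bit lists,
# declared a match when their Hamming distance is below a threshold.
def hamming_distance(a: list[int], b: list[int]) -> int:
    """Number of bit positions in which the two readings differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

def is_match(reading: list[int], reference: list[int], threshold: int) -> bool:
    """The comparison unit's basic test: 'close' means distance below the threshold."""
    return hamming_distance(reading, reference) < threshold

# Two readings of the same finger that differ in one bit.
f0 = [1, 0, 1, 1, 0, 0, 1, 0]
f1 = [1, 0, 1, 0, 0, 0, 1, 0]
print(is_match(f1, f0, threshold=2))  # True
```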

  6. Adversary Model • Adversary is defined by its resources • Smartcard • Uncracked (SCU) • Cracked (SCC) • Fingerprint (FP) • Eavesdrop • Communication Channel (ECC) • Server's Database (ESD) • Comparison Unit (ECU) = ESD + ECC + "outcome" • Malicious • Communication Channel (MCC) • Outside our model: • Adversaries that crack the smartcard and give it back to the user • Malicious Server's Database • Malicious Comparison Unit

  7. Security Requirements • Confidentiality: An adversary should not be able to learn the user's fingerprint • Integrity: An adversary should not be able to impersonate the user to the comparison unit • Availability: An adversary should not be able to prevent a user from authenticating

  8. Confidentiality • There are 3 oracles that are acceptable • Oracle A: {0,1}^|f'| → {0,1}, where A(f) returns true if f is a match • Oracle B: () → {0,1}^log|f'|, where B() returns various distances between readings • Oracle C: {0,1}^|f'| → {0,1}^log|f'|, where C(f) returns the distance between f and f' (this is only weakly secure)
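A small sketch of what each oracle reveals, under the same bit-list assumption; `f_prime` (the enrolled reading), the list of past readings, and the threshold are illustrative parameters, not the paper's notation:

```python
# Sketch of the three oracle interfaces; f_prime (the enrolled reading), the list
# of past readings, and the threshold are illustrative, not the paper's notation.
from itertools import combinations

def dist(a, b):
    """Hamming distance between two bit lists."""
    return sum(x != y for x, y in zip(a, b))

def oracle_a(f, f_prime, threshold):
    """A: {0,1}^|f'| -> {0,1} -- reveals only whether f is a match."""
    return dist(f, f_prime) < threshold

def oracle_b(past_readings):
    """B: () -> {0,1}^log|f'| -- reveals distances between readings, not the readings."""
    return [dist(fa, fb) for fa, fb in combinations(past_readings, 2)]

def oracle_c(f, f_prime):
    """C: {0,1}^|f'| -> {0,1}^log|f'| -- reveals the exact distance to f' (only weakly secure)."""
    return dist(f, f_prime)
```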

  9. False Starts • Suppose f0 and f1 are readings of a fingerprint • How does the "bank" determine whether f0 is close to f1 without revealing private information? • Correctness: The distance should be computed correctly • Privacy: Minimal information should be revealed about f0 and f1

  10. False Starts • False Start #1: • Client sends f1 to the bank, which compares it to f0 in the clear • Correct but not private • False Start #2: • Client sends H(f1) to the bank, which compares it to H(f0) in the clear • Private but not correct

  11. False Starts (cont.) • False Start #3: • Client sends f1 ⊕ r to the server, which compares it to f0 ⊕ r • Correct, as dist(f1 ⊕ r, f0 ⊕ r) = dist(f1, f0) • Kind of private: individual bits are protected, but it leaks the locations where bits change • False Start #4: • Client sends Π(f1 ⊕ r) to the server, which compares it to Π(f0 ⊕ r) for a permutation Π • Correct, as dist(Π(f1 ⊕ r), Π(f0 ⊕ r)) = dist(f1, f0) • Private if the permutation is used only once • If it is reused, it has problems similar to #3
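A minimal sketch checking the correctness claim of False Start #4: XORing both readings with the same pad r and applying the same permutation Π preserves the Hamming distance (bit-list encoding and Python's random module are assumptions):

```python
# Sketch checking that the False Start #4 masking preserves Hamming distance:
# dist(Π(f1 ⊕ r), Π(f0 ⊕ r)) = dist(f1, f0). Bit lists and the random module
# are illustrative choices, not the paper's.
import random

def dist(a, b):
    return sum(x != y for x, y in zip(a, b))

def mask(f, r, perm):
    """XOR the reading with the one-time pad r, then apply the permutation perm."""
    padded = [fb ^ rb for fb, rb in zip(f, r)]
    return [padded[j] for j in perm]

n = 8
f0 = [random.randint(0, 1) for _ in range(n)]  # stored reading
f1 = [random.randint(0, 1) for _ in range(n)]  # fresh reading
r = [random.randint(0, 1) for _ in range(n)]   # shared one-time pad
perm = random.sample(range(n), n)              # shared random permutation

assert dist(mask(f1, r, perm), mask(f0, r, perm)) == dist(f1, f0)
print("masked distance equals true distance:", dist(f1, f0))
```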

  12. Our Protocol • Goal is to be able to update the r value and the permutation Π between authentications • Assume H is a keyed hash function • Before a round, the server has: • si ⊕ Πi(fi ⊕ ri), H(si), H(si, H(si+1)) • Before a round, the client (smartcard) has: • Πi, ri, si, si+1
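As an illustration, a hedged sketch of this per-round state, with SHA-256 standing in for the unspecified keyed hash H, bit lists for readings, pads, and round secrets, and random.sample for the permutation Πi; all of these concrete choices are assumptions:

```python
# Sketch of the state before round i; SHA-256 stands in for the keyed hash H,
# readings/pads/secrets are bit lists, and the permutation is a list of indices.
import hashlib
import random

def H(*parts) -> bytes:
    """Stand-in for the keyed hash H: hash the concatenation of its arguments."""
    return hashlib.sha256(b"".join(bytes(p) for p in parts)).digest()

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def permute(bits, perm):
    return [bits[j] for j in perm]

n = 16                                              # illustrative reading length
f_i = [random.randint(0, 1) for _ in range(n)]      # reading fi
r_i = [random.randint(0, 1) for _ in range(n)]      # one-time pad ri
pi_i = random.sample(range(n), n)                   # permutation Πi
s_i = [random.randint(0, 1) for _ in range(n)]      # round secret si
s_next = [random.randint(0, 1) for _ in range(n)]   # round secret si+1

# Server stores: si ⊕ Πi(fi ⊕ ri), H(si), H(si, H(si+1))
server_state = {
    "masked": xor(s_i, permute(xor(f_i, r_i), pi_i)),
    "h_s": H(s_i),
    "h_chain": H(s_i, H(s_next)),
}

# Client (smartcard) stores: Πi, ri, si, si+1
card_state = {"perm": pi_i, "pad": r_i, "s": s_i, "s_next": s_next}
```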

  13. Protocol -- Authentication • Client obtains fi+1 and generates ri+1, si+2, and Πi+1 • It sends to the server: Πi(fi+1 ⊕ ri), si, and some transaction information T • Server tests whether: • H(si) matches the previously stored value • si ⊕ Πi(fi+1 ⊕ ri) is close to the previously stored si ⊕ Πi(fi ⊕ ri) • If there is a match, the server temporarily performs T and sends H(T) back to the user
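A sketch of the server-side check in this step; SHA-256 in place of the keyed hash H, the helper names, and the threshold parameter are assumptions:

```python
# Sketch of the server-side authentication check; SHA-256 stands in for the keyed
# hash H, and the parameter names and threshold are illustrative.
import hashlib

def H(*parts) -> bytes:
    return hashlib.sha256(b"".join(bytes(p) for p in parts)).digest()

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def authenticate(stored_masked, stored_h_s, masked_reading, s_i, T, threshold):
    """masked_reading is Πi(fi+1 ⊕ ri); s_i is si; T is the transaction information.
    Returns H(T) on success (the server tentatively performs T), else None."""
    # 1. H(si) must match the previously stored value.
    if H(s_i) != stored_h_s:
        return None
    # 2. si ⊕ Πi(fi+1 ⊕ ri) must be close to the stored si ⊕ Πi(fi ⊕ ri).
    if hamming(xor(s_i, masked_reading), stored_masked) >= threshold:
        return None
    return H(T)
```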

  14. Protocol -- Update • Client tests whether the transaction information matches its request • Yes: continue to step 2 • No: abort and wipe out this set of key information • Client sends to the server: si+1 ⊕ Πi+1(fi+1 ⊕ ri+1), H(si+1), and H(si+1, H(si+2)) • The server verifies that H(si, H(si+1)) matches the previous value • If yes, it commits the transaction and updates its stored values • If no, it aborts
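A sketch of the server's verification in the update step, under the same assumptions as above (SHA-256 for H, illustrative names); it checks the received H(si+1) against the stored chain value H(si, H(si+1)) before committing:

```python
# Sketch of the server's verification in the update step; SHA-256 stands in for
# the keyed hash H and all names are illustrative.
import hashlib

def H(*parts) -> bytes:
    return hashlib.sha256(b"".join(bytes(p) for p in parts)).digest()

def server_update(stored_h_chain, s_i, new_masked, new_h_s, new_h_chain):
    """s_i was received during authentication; new_masked is si+1 ⊕ Πi+1(fi+1 ⊕ ri+1),
    new_h_s is H(si+1), and new_h_chain is H(si+1, H(si+2)).
    Returns the updated server state on success, or None to abort."""
    # The update is accepted only if H(si, H(si+1)) matches the stored chain value.
    if H(s_i, new_h_s) != stored_h_chain:
        return None
    # Commit the transaction and roll the stored values forward to round i+1.
    return {"masked": new_masked, "h_s": new_h_s, "h_chain": new_h_chain}
```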

  15. Security Summary • Confidentiality: The cases where the adversary learns the fingerprint are: (FP), or (SCC and ESD), or (SCU, ESD, and MCC), or weakly in the case of (SCU and ECU), or any superset of these cases • Integrity: The cases where the adversary can impersonate the user are: (SCU and FP), or (SCC and ESD), or (ESD and MCC), or weakly in the case of (SCU and ECU), or any superset of these cases • Availability: The cases where the adversary can deny access to the user are: (SCU), or (MCC), or any superset of these cases

  16. Security Summary

  17. Extensions • Extended to other distance metrics • Storage-Computation Tradeoff: • The previous scheme requires several values to be stored on the smartcard (in case of mismatches) • Storage can be reduced by increasing computation (similar to S/KEY)
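A sketch of one way such an S/KEY-style tradeoff could look: derive the per-round secrets si from a hash chain so the card stores a single seed and recomputes si on demand; SHA-256, the chain length, and this particular derivation are assumptions, not the paper's construction:

```python
# Sketch of an S/KEY-style hash chain for the round secrets: the card stores only
# a seed and recomputes si on demand, trading hash evaluations for storage.
# SHA-256, the chain length, and this derivation are assumptions.
import hashlib

def H(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def round_secret(seed: bytes, chain_length: int, i: int) -> bytes:
    """si = H^(chain_length - i)(seed), so si = H(si+1): revealing si does not
    disclose the not-yet-used si+1."""
    value = seed
    for _ in range(chain_length - i):
        value = H(value)
    return value

seed = b"illustrative-seed"
assert round_secret(seed, 100, 40) == H(round_secret(seed, 100, 41))
```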

  18. Summary • We have introduced a lightweight biometric scheme that uses only hash functions • No single point of failure • Future Work: • Must update values in our protocol
