Biometric Key Cryptography Term Project Presentation Hadi Ahmadi Biometric Technologies CPSC 601.20 Spring 2008
Preview • Biometric ergonomics and cryptographic security are highly complementary, hence the motivation for this project. • In this seminar, we discuss: • a dynamic hand signature hashing algorithm, based on a heuristic approach • a face hashing algorithm, based on dimensionality reduction • general biometric key extraction based on fuzzy extractors • Objective: • Study their performance • Evaluate their security • Identify the most reliable technique for prospective biometric hashing methods
Outline • An Introduction to Biometric Key Cryptography • A Brief Literature Review • Online hand signature hashing [8] • Face hashing method [13] • Fuzzy extractors [3] • Conclusion • References
Introduction to Biometric Key Cryptography • Cryptographic security: generating user keys • Small keys (passwords): memorisable, but low entropy, easily compromised, easily stolen • Long keys (passphrases): high entropy and not easily stolen, but not memorisable and still easily compromised • Biometric keys: high entropy, no need to be memorised, not easily stolen, not easily compromised
Introduction to Biometric Key Cryptography • Cryptography traditionally relies on strings being: • uniformly distributed • precisely reproducible • However, biometric readings are: • redundant (not uniformly random) • rarely identical across acquisitions, even though they are likely to be close • Tolerating a (limited) number of errors in the biometric data still yields greater security than short passwords provide
Introduction to Biometric Key Cryptography • Two stages of generating cryptographic keys from biometric measurements: • Stage 1: use features of the raw input to compute a bit-string • Can either benefit from existing biometric verification techniques (the feature-extraction part) or be implemented heuristically • Relates to performance • Stage 2: derive a cryptographic key from the bit-string • Must be a one-way algorithm with a small, fixed-size output • Relates to security
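As a concrete (and deliberately simplified) illustration of these two stages, here is a minimal Python sketch: a hypothetical median-thresholding quantiser stands in for stage one, and SHA-256 stands in for the one-way, small-output function of stage two. Neither choice is prescribed by the papers reviewed below.

```python
import hashlib
import numpy as np

def stage1_bitstring(features, n_bits=64):
    # Stage 1 (performance): quantise extracted features into a reproducible bit-string.
    # Hypothetical rule: compare each feature against the median of the vector.
    f = np.asarray(features, dtype=float)[:n_bits]
    mid = np.median(f)
    return ''.join('1' if v > mid else '0' for v in f)

def stage2_key(bitstring):
    # Stage 2 (security): derive the key with a one-way function whose output is
    # small and fixed-size (SHA-256 used here as a conventional choice).
    return hashlib.sha256(bitstring.encode('ascii')).digest()

key = stage2_key(stage1_bitstring(np.random.default_rng(0).standard_normal(64)))
```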
Literature Review • Davida et al. [1] (1998): second-stage strategy; majority decoding and error-correcting codes; iris scans • Soutar et al. [16] (1999): heuristic complete strategy; optical computing techniques; fingerprints • Monrose et al. [11] (1999): heuristic complete strategy; a mechanism called hardened passwords; keystroke dynamics of the user • Ngo et al. [13] (2006): second-stage strategy; dimensionality reduction; face hashing • Kuan et al. [8] (2007): heuristic complete strategy; combines function-based feature extraction with random key mixing; dynamic hand signature hashing • Dodis et al. [3] (2004 & 2008): second-stage strategy; fuzzy extractors; all types of biometric traits
Online Hand Signature Hashing [8] • Enrolment
Experimental results (chart annotations): "Only a real DWT-DFT compression", "Biometric alone approach", "Best results in all attack scenarios"
Experimental Results • The authors claim their scheme is secure because: • If the BioPhasor vector h and the genuine token T are known, the biometric feature vector b cannot be recovered exactly in polynomial time, i.e., it is an intractable problem. • 2^N discretization is an irreversible process. • The sequence of BioPhasor mixing and 2^N discretization obeys the product principle, i.e., the proposed scheme is a one-way transformation.
My Security Evaluation of the Paper • arctan(x) is used to secure the random mixing by improving nonlinearity. • However, for x > 5, arctan(x) ≈ π/2, and for x < 1, arctan(x) ≈ x. • The authors have not considered this property! • Because of the random mixing, a user's BioPhasor vector could be totally different each time (hence, different hash vectors)!
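A quick numerical check of this saturation behaviour (a minimal sketch, not taken from the paper):

```python
import numpy as np

x = np.array([0.05, 0.5, 1.0, 5.0, 20.0, 200.0])
print(np.round(np.arctan(x), 4))   # roughly x for small x; flattens towards pi/2 once x > 5
print(round(np.pi / 2, 4))         # 1.5708
```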
My Security Evaluation of the Paper • A potential attack on the system: • There are two main security criteria for hash functions: preimage resistance and collision resistance. • All of the authors' claims concern preimage resistance (the one-way property), so we investigate the second criterion. • The authors claim the system is secure because, knowing the token T and the BioPhasor vector h, one cannot find the biometric feature vector b. • We assume this proposition is true. However, is it really necessary to find b in order to attack the system?
My Security Evaluation of the Paper • Suppose an adversary can find two sets of biometric elements {bj, bk} and {b'j, b'k} for two given sets of random elements {ti,j, ti,k} and {t'i,j, t'i,k} such that the corresponding BioPhasor values coincide. • The adversary can then generate two different signatures that yield the same BioPhasor value and therefore the same hash value. • Useful relations: the Taylor-series approximation of arctan, and the arctan characteristic.
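To illustrate why the saturation invites collisions, the toy computation below mixes two quite different (biometric, token) element pairs with arctan of their ratio. The exact BioPhasor formula in [8] differs, but it relies on the same arctan, so once both arguments are large the mixed values become nearly indistinguishable and a coarse 2^N discretization would place them in the same bin (an illustrative assumption, not the authors' exact attack):

```python
import numpy as np

b1, t1 = 80.0, 0.5     # hypothetical genuine feature / token elements
b2, t2 = 300.0, 1.0    # a very different forged pair
m1, m2 = np.arctan(b1 / t1), np.arctan(b2 / t2)
print(m1, m2, abs(m1 - m2))   # both near pi/2, within about 0.003 of each other
```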
Face Biometric Hashing [13] • The techniques used in this method: • dimensionality reduction • error correction • random projection and orthogonalization • Projection stages: • Linear projection • Principal Component Analysis (PCA) [21] • Fisher Linear Discriminant (FLD) [18] • Wavelet Transform (WT) [19] • Wavelet Transform with PCA (WT_PCA) [4] • Wavelet Transform with Fourier-Mellin (WT_FMT) [10] • Random projection • Token key extraction • Inner product • Threshold-based
The proposed method • Let α be a real nc-element projected vector (from PCA, etc.) and kt be the user-specific token key. 1. Using kt, compute nc real random vectors in nc-space: {β1, ..., βnc} 2. Apply the Gram-Schmidt process to transform them into an orthonormal set {ξ1, ..., ξnc} 3. Compute {m1, ..., mnc} = {<α, ξ1>, ..., <α, ξnc>} 4. Compute the nc-element hash vector b by thresholding: bi = 1 if mi > μ, and bi = 0 otherwise • μ is selected so that, on average, half of the bits are zeros and half are ones.
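The sketch below implements this threshold-based hashing step with NumPy. The seeded random generator stands in for the token key kt, QR factorisation stands in for Gram-Schmidt, and μ = 0 assumes roughly zero-mean projected features; these are my assumptions, not details fixed by [13].

```python
import numpy as np

def biohash(alpha, token_seed, mu=0.0):
    nc = len(alpha)
    rng = np.random.default_rng(token_seed)      # token key -> reproducible randomness
    B = rng.standard_normal((nc, nc))            # nc random vectors beta_1..beta_nc
    xi, _ = np.linalg.qr(B)                      # orthonormal columns xi_1..xi_nc
    m = xi.T @ np.asarray(alpha)                 # m_i = <alpha, xi_i>
    return (m > mu).astype(int)                  # b_i = 1 if m_i > mu, else 0

alpha = np.random.default_rng(1).standard_normal(32)   # e.g. a PCA-projected face vector
print(biohash(alpha, token_seed=42))                    # hash for this user/token pair
print(biohash(alpha, token_seed=99))                    # different token -> (almost surely) unrelated hash
```

Because the projection is token-seeded, the same face with a new token yields a fresh, unrelated hash, which is what makes the template cancellable.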
Experimental results • The scheme is evaluated on the FERET face database. • The PCA, WT_PCA, FLD, WT, and WT_FMT methods are improved by 98.02%, 95.83%, 99.46%, 99.16%, and 100%, respectively. • Among the raw methods, PCA gives the poorest results, and biometric hashing is not able to improve it significantly. • The best results are obtained with FLD_BH, where the EER is reduced from 5.59% (in FLD) to 0.03%.
My Security Evaluation of the Paper • Two stolen-token attack scenarios. • First attack: the attacker knows the random vectors ξi and observes the hash vector b. Since they do not know μ, recovering the biometric data α appears to be a hard problem; hence the method is strong against this attack (one-way function). • Second attack: the attacker tries to generate two forged biometric feature vectors α1 and α2 that result in the same hash. • Given α1, the vectors ξi, a threshold μ, and a hash vector b, the attacker only needs to find another feature vector that satisfies the corresponding set of linear inequalities (easily solvable); hence the collision attack succeeds.
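A minimal sketch of the collision idea under the stolen-token assumption: because the ξi are orthonormal, any vector of projections that stays on the same side of μ as the observed bits can be pulled back to a forged feature vector α2 = Σ m'i ξi with the same hash (an illustrative construction, not code from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)
nc, mu = 16, 0.0
xi, _ = np.linalg.qr(rng.standard_normal((nc, nc)))   # stolen token: orthonormal xi_i
alpha1 = rng.standard_normal(nc)                      # genuine projected features
b = (xi.T @ alpha1 > mu).astype(int)                  # observed hash vector

# Pick projections m'_i on the same side of the threshold as each bit b_i, then invert.
m_forged = np.where(b == 1, mu + rng.uniform(0.1, 1.0, nc),
                            mu - rng.uniform(0.1, 1.0, nc))
alpha2 = xi @ m_forged                                # forged features, far from alpha1

assert np.array_equal((xi.T @ alpha2 > mu).astype(int), b)   # same hash -> collision
```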
Fuzzy Extractors [3] • A secure sketch: • produces public information (s) about its input (w) • recovers w, given w' close to w and s. • A fuzzy extractor: • is built from a secure sketch • generates a random string (R) and a public string (P) from its input (w) • reproduces R, given w' close to w and P. • Fuzzy extractors are designed to be information-theoretically secure, so they can be used in cryptographic systems without introducing additional assumptions.
Useful definitions • Hamming metric: dis(w, w') = the number of positions in which the two strings differ. • Set difference metric: the size of the symmetric difference of the two sets. • Edit metric: the smallest number of character insertions and deletions needed to transform w into w'. • An (n, k, d)-code is a subset of |F|^k codewords in the space F^n with minimum Hamming distance d; it can detect up to d-1 errors and correct up to ⌊(d-1)/2⌋ errors. • Universal hash functions: a family {H_x : {0,1}^n → {0,1}^l | x ∈ X} such that for all a ≠ b, Pr_x[H_x(a) = H_x(b)] ≤ 2^-l.
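For concreteness, a small sketch of the first two metrics and of one standard universal family, random GF(2) linear maps H_A(x) = Ax mod 2, which satisfy Pr_A[H_A(a) = H_A(b)] = 2^-l for a ≠ b; this particular family is my example, not one prescribed in [3].

```python
import numpy as np

def hamming(w, w2):
    return sum(a != b for a, b in zip(w, w2))   # positions where the strings differ

def set_difference(A, B):
    return len(A ^ B)                           # size of the symmetric difference

n, l = 16, 4
A = np.random.default_rng(7).integers(0, 2, size=(l, n))   # the index x selecting H_x
H = lambda w: tuple((A @ np.asarray(w)) % 2)                # H_A : {0,1}^n -> {0,1}^l

print(hamming("1011001", "1001101"))          # 2
print(set_difference({1, 2, 5}, {2, 5, 9}))   # 2
print(H(np.random.default_rng(8).integers(0, 2, n)))
```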
Construction of fuzzy extractors from secure sketches • Gen(w; r; x): • P = (SS(w; r), x) • R = Ext(w; x) • output (R,P) • Rep(w’; (s, x)): • recover w = Rec(w’; s) • output R = Ext(w; x). • “From now on we just need to construct secure sketches”
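A minimal end-to-end sketch of Gen/Rep built this way, using a 3x-repetition code for the secure sketch (the code-offset construction of the next slide) and SHA-256 as a computational stand-in for the strong extractor Ext; the real construction uses an information-theoretic extractor keyed by x.

```python
import hashlib
import secrets

def encode(r):                       # 3x repetition code: corrects 1 flip per block
    return [bit for bit in r for _ in range(3)]

def decode(c):                       # majority vote per 3-bit block
    return [1 if sum(c[i:i + 3]) >= 2 else 0 for i in range(0, len(c), 3)]

def ss(w, r):                        # code-offset sketch: s = w XOR encode(r)
    return [wi ^ ci for wi, ci in zip(w, encode(r))]

def rec(w2, s):                      # shift back, error-correct, shift forward
    c = encode(decode([wi ^ si for wi, si in zip(w2, s)]))
    return [ci ^ si for ci, si in zip(c, s)]

def ext(w, x):                       # stand-in extractor (keyed hash, computational only)
    return hashlib.sha256(bytes([x]) + bytes(w)).hexdigest()

def gen(w):
    r = [secrets.randbelow(2) for _ in range(len(w) // 3)]
    x = secrets.randbelow(256)
    return ext(w, x), (ss(w, r), x)          # (R, P)

def rep(w2, P):
    s, x = P
    return ext(rec(w2, s), x)

w  = [1, 0, 1, 0, 0, 1, 1, 1, 0]             # enrolment reading
w2 = [1, 0, 1, 0, 1, 1, 1, 1, 0]             # fresh reading with one flipped bit
R, P = gen(w)
assert rep(w2, P) == R                       # key reproduced despite the noise
```

With at most one flipped bit per 3-bit block, Rec recovers w exactly and the reproduced key matches R.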
Constructions for Hamming Distance • Code-offset construction: • On input w, select a random codeword c of an (n, k, 2t+1)-code • SS(w) is the shift needed to get from c to w: s = w - c • Rec(w', s): decode c' = w' - s to the nearest codeword c, then output w = c + s (corrects up to t errors) • Syndrome construction: • Useful for linear codes • s = SS(w) = syn(w) • Rec(w', s): • find the error e such that syn(e) = syn(w') - s • then w = w' - e.
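A sketch of the syndrome construction using the Hamming(7,4) code, whose parity-check matrix has the binary representation of i as its i-th column, so the syndrome of a single-bit error directly names the flipped position (the choice of code is mine, for illustration):

```python
import numpy as np

# Columns of H are the binary representations of 1..7 (most significant bit on top).
H = np.array([[(i >> k) & 1 for i in range(1, 8)] for k in (2, 1, 0)])

def syn(v):
    return (H @ np.asarray(v)) % 2               # 3-bit syndrome of a 7-bit word

def ss_syndrome(w):
    return syn(w)                                # public sketch: s = syn(w)

def rec_syndrome(w2, s):
    d = (syn(w2) - s) % 2                        # = syn(e), where e = w2 - w (mod 2)
    pos = 4 * int(d[0]) + 2 * int(d[1]) + int(d[2])   # 0 means no error occurred
    e = np.zeros(7, dtype=int)
    if pos:
        e[pos - 1] = 1
    return (np.asarray(w2) - e) % 2              # w = w2 - e

w = np.array([1, 0, 1, 1, 0, 0, 1])
w2 = w.copy(); w2[4] ^= 1                        # single bit flipped in the fresh reading
assert np.array_equal(rec_syndrome(w2, ss_syndrome(w)), w)
```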
My Security Evaluation of the Paper • Fuzzy extractors are designed to be information-theoretically secure. • They have provable security, which implies that no attacker can succeed within the stated model. • Although the authors give no implementation results on biometric verification, the techniques are general enough that they could be used to adapt current biometric verification methods so that they satisfy security as well as performance.
Conclusion • The advent of biometrics has introduced a secure and efficient alternative to traditional authentication schemes; biometric verification is therefore expected to enhance security. • On the other hand, a typical biometric verification system is susceptible to various threats, among which compromise of the template information is one of the most important. Template-generating algorithms are therefore expected to behave as cryptographic one-way algorithms. • These two statements imply a mutual relation between biometric verification and biometric key cryptography.
Conclusion • Issues of key generation systems that can be removed using biometrics: • Key entropy: user passwords are short and low-entropy • Key memorization: long keys cannot be memorized • Key compromise: passwords and tokens are easily compromised • Issues of biometric verification systems that can be removed by biometric hashing: • Security: one-wayness and collision resistance • Storage size: a small random hash with high entropy
Conclusion • The first two papers try to heuristically design a secure and efficient biometric verification system based on biometric hashing. • The introduced methods are efficient according to the experimental results; however, they appear to have security vulnerabilities that contradict the authors' claims. • The third paper introduces general approaches for extracting keys from noisy sources such as biometrics. • These methods are not supported by implementation results, but since they carry provable security, one can try applying them within specific biometric verification techniques to satisfy security as well as performance.
References
[1] Davida G.I., Frankel Y., and Matt B.J., "On Enabling Secure Applications through Offline Biometric Identification", Proceedings of the 1998 IEEE Symposium on Security and Privacy, pp. 148-157, 1998.
[3] Dodis Y., Reyzin L., and Smith A., "Fuzzy Extractors: How to Generate Strong Keys from Biometrics and Other Noisy Data", Advances in Cryptology - EUROCRYPT 2004, Lecture Notes in Computer Science, vol. 3027, 2004.
[4] Feng G.C., Yuen P.C., and Dai D.Q., "Human Face Recognition Using PCA on Wavelet Sub-band", Journal of Electronic Imaging, vol. 9, no. 2, pp. 226-233, 2000.
[8] Kuan Y., Teoh A., and Ngo D., "Secure Hashing of Dynamic Hand Signatures Using Wavelet-Fourier Compression with BioPhasor Mixing and 2^N Discretization", EURASIP Journal on Applied Signal Processing, 2007(1): 32-32, 2007.
[10] Luo X. and Mirchandani G., "An Integrated Framework for Image Classification", Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2000), 2000.
References
[11] Monrose F., Reiter M.K., and Wetzel S., "Password Hardening Based on Keystroke Dynamics", Proceedings of the 6th ACM Conference on Computer and Communications Security, pp. 73-82, 1999.
[13] Ngo D.C.L., Teoh A.B.J., and Goh A., "Biometric Hash: High Confidence Face Recognition", IEEE Transactions on Circuits and Systems for Video Technology, vol. 16, no. 6, pp. 771-775, 2006.
[16] Soutar C., Roberge D., Stoianov A., Gilroy R., and Vijaya Kumar B.V.K., "Biometric Encryption", ICSA Guide to Cryptography, R.K. Nichols, ed., McGraw-Hill, New York, pp. 649-675, 1999.
[18] Swets D.L. and Weng J.J., "Using Discriminant Eigenfeatures for Image Retrieval", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 8, pp. 831-836, 1996.
[19] Tang J., Nakatsu R., Kawato S., and Ohya J., "A Wavelet-Transform Based Speaker Identification System for Smart Multipoint Teleconference", Journal of the Visualization Society of Japan, vol. 20, no. 1, pp. 303-306, 2000.
[21] Turk M. and Pentland A., "Eigenfaces for Recognition", Journal of Cognitive Neuroscience, vol. 3, no. 1, pp. 71-86, 1991.