Development of Portable and Mobile Iris Identification and Authentication System
Faculty of Information Technology
Biometrics • Automated personal identification or authentication based on: • Behavioral characteristics, for example handwriting, speech and signature. • Physiological characteristics, for example iris, face and fingerprint.
Portable Iris Recognition • [System diagram: a PDA captures the acquired iris image as input and sends it over a wireless and Internet connection to the iris authentication server, which returns the authentication result as output.]
Human Iris • It controls the amount of light entering the eye through the pupil. • The pupil normally appears black. • The back portion of the iris (the posterior pigment epithelium) also appears dark. • The iris region can be occluded by noise such as eyelids and eyelashes. • There is high contrast between the sclera and the back portion of the iris.
Advantages of Iris Recognition • The uniqueness of every iris, paralleling the uniqueness of every fingerprint, was reported in [1]. • It is stable throughout life [1, 2]. • It is non-invasive and contactless (no transfer of disease). [1] J. Daugman, "How iris recognition works," IEEE Trans. Circuits and Systems for Video Technology, vol. 14, no. 1, pp. 21-30, Jan. 2004. [2] L. Ma, T. Tan, Y. Wang and D. Zhang, "Personal Identification Based on Iris Texture Analysis," IEEE Trans. PAMI, vol. 25, no. 12, pp. 1519-1533, 2003.
Image Acquisition • Acquisition devices use near-infrared illumination and must resolve a minimum of 50 pixels in the iris radius. • This is the most important process, as a poor-quality iris image degrades the entire iris recognition process. • There are three kinds of poor-quality iris images [2]: defocused, motion-blurred and occluded iris images. [2] L. Ma, T. Tan, Y. Wang and D. Zhang, "Personal Identification Based on Iris Texture Analysis," IEEE Trans. PAMI, vol. 25, no. 12, pp. 1519-1533, 2003.
[Figure: image acquisition geometry, annotated with distances of 4 cm and 47-53 cm.]
Sample Iris Images Sample iris images from (a) CASIA, (b) MMU1 and (c) MMU2.
MMU2 Iris Database • Collected using a Panasonic Authenticam. • Consists of 995 iris images. • Contributed by 100 volunteers from Asia, the Middle East, Africa and Europe.
Iris Recognition Process • [Flow diagram of the recognition pipeline; its output is a genuine-or-impostor decision.]
Iris Localization • Detect the inner boundary of the iris. • Detect the outer boundary of the iris.
Literature on Iris Localization • Daugman [1] made use of the integro-differential operator

    max_{(r, x0, y0)} | G_σ(r) ∗ (∂/∂r) ∮_{r, x0, y0} I(x, y) / (2πr) ds |

for iris localization, where G_σ(r) is a smoothing function and the integration is performed along a circular arc ds, with respect to increasing radius r, centered at (x0, y0). [1] J. Daugman, "How iris recognition works," IEEE Trans. Circuits and Systems for Video Technology, vol. 14, no. 1, pp. 21-30, Jan. 2004.
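As an illustration of how this operator can be evaluated, the sketch below computes the circular mean of the image intensity at increasing radii, differentiates it with respect to r, smooths with a Gaussian and keeps the radius with the strongest response. The function names and the discretization (64 samples per circle) are assumptions of this sketch, not Daugman's implementation; in practice the response is also maximized over candidate centres (x0, y0).

    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    def circle_mean(img, x0, y0, r, samples=64):
        # Mean intensity along the circle of radius r centered at (x0, y0),
        # i.e. the contour integral of I(x, y) / (2*pi*r) ds.
        theta = np.linspace(0, 2 * np.pi, samples, endpoint=False)
        xs = np.clip(np.round(x0 + r * np.cos(theta)).astype(int), 0, img.shape[1] - 1)
        ys = np.clip(np.round(y0 + r * np.sin(theta)).astype(int), 0, img.shape[0] - 1)
        return img[ys, xs].mean()

    def best_radius(img, x0, y0, r_min, r_max, sigma=2.0):
        # Blurred radial derivative of the circular means: G_sigma(r) * d/dr.
        radii = np.arange(r_min, r_max)
        means = np.array([circle_mean(img, x0, y0, r) for r in radii])
        response = gaussian_filter1d(np.diff(means), sigma)
        k = int(np.argmax(np.abs(response)))
        return radii[k + 1], abs(response[k])  # radius with the sharpest intensity jump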
Wildes [3] implemented a gradient-based edge detector followed by a generalized Hough transform to detect the boundaries of an iris. • Generalized Hough transform (circle parameterization): (x − a)² + (y − b)² = r² [3] R. Wildes, "Iris Recognition: An Emerging Biometric Technology," Proc. IEEE, vol. 85, no. 9, pp. 1348-1363, 1997.
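A minimal sketch of circular Hough voting under this parameterization; the accumulator resolution and the 90 sampled angles are choices of this sketch, not of Wildes's system, and edge_points would come from the gradient-based edge detector mentioned above.

    import numpy as np

    def hough_circle(edge_points, shape, r_min, r_max):
        # Each edge point (x, y) votes for every centre (a, b) at distance r,
        # i.e. for every circle (x - a)^2 + (y - b)^2 = r^2 through that point.
        h, w = shape
        acc = np.zeros((w, h, r_max - r_min), dtype=np.int32)
        thetas = np.linspace(0, 2 * np.pi, 90, endpoint=False)
        for x, y in edge_points:
            for k, r in enumerate(range(r_min, r_max)):
                a = np.round(x - r * np.cos(thetas)).astype(int)
                b = np.round(y - r * np.sin(thetas)).astype(int)
                ok = (a >= 0) & (a < w) & (b >= 0) & (b < h)
                np.add.at(acc, (a[ok], b[ok], k), 1)  # cast one vote per candidate
        a, b, k = np.unravel_index(acc.argmax(), acc.shape)
        return a, b, r_min + k  # the circle that collected the most votes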
Lee [4] implemented a Canny edge detector and a maximum-vote-finding method to localize the inner boundary of the iris. • The centre (X0, Y0) with the maximum number of occurrences, calculated from two detected feature points on each vertical and horizontal line, represents the centre of the pupil. • The Hough transform was used to detect the outer boundary of the iris. [4] P.S. Lee and H.T. Ewe, "Individual Recognition Based on Human Iris Using Fractal Dimension Approach," Proc. ICBA, LNCS 3072, pp. 467-474, 2004.
Tisse et al. [5] implemented a combination of integro-differential operators and the Hough transform for iris localization. • The centre of the pupil was detected using the Hough transform. • Integro-differential operators were then used to localize the pupil boundary and the iris boundary. [5] C. Tisse, L. Martin, L. Torres and M. Robert, "Person Identification Technique using Human Iris Recognition," Proc. Vision Interface, pp. 294-299, 2002.
Ma et al. [9] approximated the centre coordinate (xp, yp) of the pupil before edge detection and the Hough transform. • The centre of the pupil was then re-estimated by adaptively selecting a reasonable threshold from the grey-level histogram of a 120 x 120 region centered at (xp, yp) [9]. • The pupil boundary and iris boundary were localized using a Canny edge detector and the Hough transform. [9] L. Ma, T. Tan, Y. Wang and D. Zhang, "Efficient Iris Recognition by Characterizing Key Local Variations," IEEE Trans. IP, vol. 13, no. 6, pp. 739-750, 2004.
Normalization • The extracted iris region is translated into rectangular form using a polar transformation. • Processing the circular iris region directly is computationally expensive, because repeated rectangular-to-polar conversions would be needed. • The resulting rectangular template is normalized to a standard size using an interpolation technique (a sketch of this unwrapping is given below).
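A minimal sketch of the polar unwrapping described above, assuming a 90 x 360 template (the size this deck uses later) and nearest-neighbour sampling; the deck itself calls for an interpolation technique, for which bilinear interpolation would be the usual refinement.

    import numpy as np

    def unwrap_iris(img, cx, cy, r_pupil, r_iris, rows=90, cols=360):
        # Map the annular iris region to a fixed rows x cols rectangle:
        # the row index samples the radial direction, the column index the angle.
        out = np.zeros((rows, cols), dtype=img.dtype)
        for i in range(rows):
            r = r_pupil + (r_iris - r_pupil) * i / (rows - 1)
            for j in range(cols):
                theta = 2 * np.pi * j / cols
                x = int(round(cx + r * np.cos(theta)))
                y = int(round(cy + r * np.sin(theta)))
                if 0 <= y < img.shape[0] and 0 <= x < img.shape[1]:
                    out[i, j] = img[y, x]
        return out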
Feature Extraction and Matching • Daugman [6] made use of Gabor wavelets and a phase demodulation process to perform feature extraction. • The pictorial information of the iris region was projected onto multi-dimensional Gabor wavelets during the phase demodulation process. [6] J. Daugman, "The importance of being random: Statistical principles of iris recognition," Pattern Recognition, vol. 36, no. 2, pp. 279-291, 2003.
A total of 2048 bits was extracted and used to represent an iris code in binary form. • The Hamming distance was calculated with an exclusive-OR operator, which measures the bit-by-bit disagreement between a pair of iris codes (see the sketch below).
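A minimal sketch of this XOR-based comparison on bit arrays; real deployments of Daugman's method additionally mask bits corrupted by eyelids or eyelashes, which this sketch omits.

    import numpy as np

    def hamming_distance(code_a, code_b):
        # Fraction of disagreeing bits between two 2048-bit iris codes.
        return np.count_nonzero(np.bitwise_xor(code_a, code_b)) / code_a.size

    # Identical codes score 0.0; two independent random codes score about 0.5.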
In Wildes's system [3], the iris image was represented by a multi-resolution decomposition (Laplacian pyramid) computed at four different resolutions. • A Fisher linear discriminant was used in the matching. [3] R. Wildes, "Iris Recognition: An Emerging Biometric Technology," Proc. IEEE, vol. 85, no. 9, pp. 1348-1363, 1997.
Tisse et al. [5] used a combination of the original signal and its Hilbert transform to extract the instantaneous phase within local regions of the iris. The rest of their algorithm was similar to Daugman's [6]. [5] C. Tisse, L. Martin, L. Torres and M. Robert, "Person Identification Technique using Human Iris Recognition," Proc. Vision Interface, pp. 294-299, 2002. [6] J. Daugman, "The importance of being random: Statistical principles of iris recognition," Pattern Recognition, vol. 36, no. 2, pp. 279-291, 2003.
In Lee's [4] work, a three-dimensional representation was constructed from the normalized multi-dimensional information, and the fractal dimension D = log N / log(1/r) was measured on the fractal surface within a selected window size. • Lastly, the matching was based on a threshold and an agreement ratio measured with an exclusive-OR operator. [4] P.S. Lee and H.T. Ewe, "Individual Recognition Based on Human Iris Using Fractal Dimension Approach," Proc. ICBA, LNCS 3072, pp. 467-474, 2004.
Noh et al. [7] used discrete wavelet frame decomposition to extract local features and global features. • A modified geometric moment was used to represent the global features. • The global distance between the testing image and the training image was measured first; local matching using the Hamming distance was performed only if this distance was smaller than a threshold value. [7] S. Noh, K. Bae and J. Kim, "A novel method to extract features for iris recognition system," Proc. 4th Int. Conf. Audio- and Video-Based Biometric Person Authentication, pp. 838-844, 2003.
Boles and Boashash [8] developed an iris recognition system based on the zero-crossings of a one-dimensional wavelet transform to construct the iris code. • Matching was based on two dissimilarity functions between the normalized zero-crossing representation of the unknown user and that of the known user. [8] W. Boles and B. Boashash, "A Human Identification Technique Using Images of the Iris and Wavelet Transform," IEEE Trans. Signal Processing, vol. 46, no. 4, pp. 1185-1188, 1998.
Ma et al. [9] decomposed the angular information horizontally into one-dimensional intensity signals. • The sharp variation points of each intensity signal were analyzed using a wavelet transform to represent the iris code in binary form, and the resulting features were concatenated into an ordered feature vector. • Matching was performed by measuring the similarity between a pair of feature vectors using the exclusive-OR operator. [9] L. Ma, T. Tan, Y. Wang and D. Zhang, "Efficient Iris Recognition by Characterizing Key Local Variations," IEEE Trans. IP, vol. 13, no. 6, pp. 739-750, 2004.
Iris Localization • A combination of black hole search and integro-differential operators is used for iris localization [10]. • First, a black hole search method is proposed to detect the centre of the pupil and its radius. • Next, integro-differential operators are used to localize the outer boundary of the iris. [10] C.C. Teo and H.T. Ewe, "An Efficient One-Dimensional Fractal Analysis for Iris Recognition," Proc. WSCG Short Papers, ISBN 80-903100-9-5, pp. 157-160, 2005.
Proposed Black Hole Search • For simple objects like circles and squares, the centre of mass is at the centre of the object (Gregory A 1994). • The method tries to obtain equal mass in all directions of the pupil; by applying this approach, the centre of the pupil (Cx, Cy) can be obtained as Cx = (1/N) Σ xi and Cy = (1/N) Σ yi, where (xi, yi) are the coordinates of the N black-hole pixels. For example, averaging the coordinates 1 to 9 gives (1 + 2 + 3 + 4 + 5 + 6 + 7 + 8 + 9) / 9 = 5, the midpoint.
Algorithms 1. Find the darkest point of the image (referred to as a black hole) in a global image analysis. 2. Determine a range of darkness based on step 1, designated as the threshold value (t) for identifying black holes. 3. Determine the number of black holes and their coordinates according to the predefined threshold, and calculate the centre of mass of these black holes. 4. Construct an L x L region centered at the estimated centre. 5. Repeat step 3 within this region to improve the estimate of the actual centre of the pupil.
Finally, the radius can be estimated from the given area (the total number of black holes in the pupil), where Radius = √(Area / π). A sketch of the whole procedure follows below.
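A minimal sketch of the five steps above; the darkness range (a fixed offset above the darkest grey level) and the default region size are assumptions of this sketch, not values from [10].

    import numpy as np

    def black_hole_search(img, delta=10, L=120, refinements=1):
        # Steps 1-2: darkest point, then a darkness range as threshold t.
        t = img.min() + delta
        # Step 3: coordinates of all black holes and their centre of mass.
        ys, xs = np.nonzero(img <= t)
        cx, cy = xs.mean(), ys.mean()
        # Steps 4-5: repeat the centre-of-mass estimate inside an L x L region.
        for _ in range(refinements):
            x0, y0 = max(int(cx) - L // 2, 0), max(int(cy) - L // 2, 0)
            ys, xs = np.nonzero(img[y0:y0 + L, x0:x0 + L] <= t)
            cx, cy = xs.mean() + x0, ys.mean() + y0
        # Radius from the pupil area (number of black holes): r = sqrt(area / pi).
        return cx, cy, np.sqrt(len(xs) / np.pi)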
Proposed One-Dimensional Fractal Analysis • The word fractal comes from the Latin adjective fractus, which means irregular and fragmented. • Self-similarity, chaotic complexity and a non-integer fractal dimension are three characteristics of fractal objects. • Fractal objects can be divided into two groups: natural fractals and mathematical fractals. • Natural fractals are fractal objects found in the real world, like clouds and mountains. • Mathematical fractals are built up from mathematical equations, like the Mandelbrot set and the Peano curve.
Fractal geometry can intuitively be thought of as the irregular geometric representation in the human iris. • The unique details of the iris generally spread along the angular direction, which corresponds to the variation of the pattern in the vertical direction [9]. [9] L. Ma, T. Tan, Y. Wang and D. Zhang, "Efficient Iris Recognition by Characterizing Key Local Variations," IEEE Trans. IP, vol. 13, no. 6, pp. 739-750, 2004.
Algorithms • The fractal information of a multi-dimensional image can be calculated by constructing a three-dimensional representation [11], where the additional h-axis represents the grey levels. • The grey-level value I(x, y) of every pixel in the iris template is normalized to the window size: I(x, y) = I(x, y) × L / 255. [11] H.K. Low, H.T. Chuah and H.T. Ewe, "A Neural Network Landuse Classifier for SAR Images using Texture and Fractal Information," Geocarto International, vol. 14, no. 1, pp. 67-74, 1999.
The localized iris region is normalized and positioned into a rectangular iris template. • The pixels within each row, along the angular direction, are positioned into successive squares of L x L window size.
The fractal dimension (D) is defined as D = log N / log(1/r), where N is the number of scaled-down copies (with linear dimension r) of the original object needed to fill up the original object, and r = 1 / window size. • Each row of the iris template produces its fractal information, and together they form a structured feature vector (f) for a particular iris (a sketch follows below).
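A hedged sketch of the per-row feature extraction: grey levels are rescaled to the window height, each row is split into L-pixel windows, and a simple box count over the resulting profile yields N for D = log N / log(1/r). The specific box-counting rule below (unit boxes covering the span between neighbouring profile heights) is one common variant, not necessarily the exact rule of [10, 11], and the row length is assumed divisible by L (true for a 360-wide template with L = 10).

    import numpy as np

    def row_fractal_dimension(row, L):
        g = row.astype(float) * L / 255.0      # normalise grey levels to window size
        dims = []
        for w in g.reshape(-1, L):             # successive L x L windows of this row
            lo = np.floor(np.minimum(w[:-1], w[1:]))
            hi = np.floor(np.maximum(w[:-1], w[1:]))
            n_boxes = np.sum(hi - lo + 1)      # unit boxes covering the profile
            dims.append(np.log(n_boxes) / np.log(L))  # D = log N / log(1/r), r = 1/L
        return float(np.mean(dims))

    def fractal_feature_vector(template, L=10):
        # Ordered feature vector f = (D_1, ..., D_m), one entry per template row.
        return np.array([row_fractal_dimension(row, L) for row in template])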
Matching • f = {D1, D2, D3, …, Dm-2, Dm-1, Dm}, where Di is the fractal information of the i-th row of an iris template. • Calculate the average dissimilarity between two feature vectors.
Algorithms • Initialize a tolerance threshold t; in our case, t = 0.0025. • Compute the absolute element-wise differences between the default (enrolled) feature vector and the current feature vector. • Through an accumulator, increase the number of occurrences for every absolute difference that exceeds the tolerance threshold. • Calculate the average dissimilarity between the two feature vectors (see the sketch below).
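A minimal sketch of this matching step; how the accumulator count is averaged into a dissimilarity score is not spelled out on the slide, so normalizing by the vector length is an assumption here.

    import numpy as np

    def average_dissimilarity(f_enrolled, f_test, t=0.0025):
        diffs = np.abs(f_enrolled - f_test)    # absolute element-wise differences
        exceed = np.count_nonzero(diffs > t)   # accumulator of out-of-tolerance entries
        return exceed / f_enrolled.size        # average dissimilarity in [0, 1]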
Proposed Window Average Analysis • The absolute intensity information associated with an object can vary because it changes under various illumination settings. However, the ordinal relationships among regions show some stability under such changes and reflect the intrinsic nature of the object [12]. • The absolute brightness values and their relative magnitudes change under different lighting conditions, but several pair-wise ordinal relationships stay invariant [13]. [12] Z. Sun, T. Tan, Y. Wang and S.Z. Li, "Ordinal Palmprint Representation for Personal Identification," Proc. Computer Vision and Pattern Recognition, pp. 294-299, May 2005. [13] J. Sadr, S. Mukherjee, K. Thoresz and P. Sinha, "The Fidelity of Local Ordinal Encoding," in T. Dietterich, S. Becker and Z. Ghahramani (eds.), Advances in Neural Information Processing Systems 14, MIT Press, Cambridge, MA, pp. 294-299, 2002.
Ordinal measures rank the difference between two regions or two windows. • The iris pattern is irregular, and its local sharp intensity variations provide a good basis for ordinal measures. • The iris template is divided into windows, and the relationship between pairs of windows is determined.
Algorithms • Initialize an L x L window and tile it across the entire rectangular iris template. • The entire iris template produces (90 x 360) / (L x L) binary feature values. • A larger window size (L) produces a smaller feature vector.
m = 90 / window size (L)   n = 360 / window size (L)
Calculate the average intensity within every window. • Encode each row of windows into binary form. For row 1: compare the average intensities of Window(1) and Window(2), …, Window(n − 1) and Window(n), and finally Window(n) and Window(1).
    # Window(i) holds the average intensity of the i-th window in a row.
    # Bits are collected separately instead of overwriting Window(i) in place:
    # in the original pseudocode, Window(1) was already replaced by a bit by the
    # time the last (wrap-around) comparison read it.
    def encode_row(window_avgs):
        n = len(window_avgs)
        bits = []
        for i in range(n):
            nxt = window_avgs[(i + 1) % n]   # window n is compared with window 1
            bits.append(0 if window_avgs[i] <= nxt else 1)
        return bits
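A hypothetical usage example, assuming the 90 x 360 template from earlier slides and L = 10 (so each band of 10 rows yields 36 windows); the variable names are illustrative only.

    # Average intensities of the 36 windows in the first 10-row band,
    # then the 36 ordinal bits produced by the circular comparisons.
    row_avgs = [template[0:10, j:j + 10].mean() for j in range(0, 360, 10)]
    bits = encode_row(row_avgs)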