Explores the viability of a single-stage RBFNN for Offline Handwritten Signature Verification, covering the WICT 2008 methodology: pre-processing, feature extraction, normalization, and classification results. Performance is compared against the two-stage RBFNN of Baltzakis & Papamarkos (2001) and the HMM classifier of Justino et al. (2001), and the presentation concludes with promising results and suggestions for future work.
Offline Handwritten Signature Verification using Radial Basis Function Neural Networks
George Azzopardi, St. Martin’s Institute of IT, geazzo@gmail.com
Kenneth P. Camilleri, St. Martin’s Institute of IT; Dept of Systems and Control Engineering, University of Malta, kpcami@eng.um.edu.mt
WICT 2008
Area of Focus and Objective
• Area of Focus
  • Offline Handwritten Signature Verification (OHSV), Pattern Recognition, Behavioural Biometrics
• Applications
  • Socially and legally accepted as a means of authentication
  • Financial transactions, user authentication, passports, etc.
• Motivation
  • Radial Basis Function Neural Networks (RBFNNs) are well known for robust outlier rejection
  • RBFNNs are usually applied to facial expression and face classification tasks
  • Baltzakis & Papamarkos (2001) applied an RBFNN within a two-stage neural network classifier for signature verification
• Objective
  • To investigate the viability of a single-stage RBFNN for OHSV
Signature Database
• No public signature database was available at the time of the study
• Signature Acquisition
  • Recommendations by Mr. Joseph Gaffiero (a Maltese graphologist) and Dr. H. Baltzakis (an expert in the field)
  • 2492 signatures from 65 signers
  • 40 signatures per signer (where possible)
    • 25 on white blank sheets
    • 15 within randomly-sized frames
  • Collected over 5 different days
  • Different pens varying in colour and point type
  • Aim: capture as much intrapersonal variation as possible
Methodology
Pre-Processing
• Data Area Cropping
  • Segment the signature from the background
• Width Normalization
  • The signature image is scaled (bicubic interpolation) to a constant width, keeping the aspect ratio fixed
• Binarization
  • The 24-bit image is converted to grayscale and then binarized using histogram-based binarization
• Skeletonization
  • Thins the signature without losing structural information
  • Facilitates the extraction of morphological features
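The slide does not give implementation details; the following is a minimal Python sketch of the described steps using scikit-image, under the assumptions that Otsu thresholding serves as the histogram-based binarization and that the constant target width is 256 pixels (both are assumptions, not stated in the slides).

# Minimal pre-processing sketch. Assumptions (not from the slides): Otsu
# thresholding as the histogram-based binarization, 256-pixel target width.
import numpy as np
from skimage import color, filters, io, morphology, transform

def preprocess(path, target_width=256):
    gray = color.rgb2gray(io.imread(path))              # 24-bit RGB -> grayscale

    # Data area cropping: bounding box of pixels darker than the background.
    mask = gray < filters.threshold_otsu(gray)
    rows, cols = np.nonzero(mask)
    gray = gray[rows.min():rows.max() + 1, cols.min():cols.max() + 1]

    # Width normalization: bicubic scaling (order=3) to a constant width,
    # keeping the aspect ratio fixed.
    h, w = gray.shape
    new_h = int(round(h * target_width / w))
    gray = transform.resize(gray, (new_h, target_width), order=3, anti_aliasing=True)

    # Binarization: histogram-based (Otsu) threshold; ink pixels become True.
    binary = gray < filters.threshold_otsu(gray)

    # Skeletonization: thin the strokes to one pixel without losing structure.
    return morphology.skeletonize(binary)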
Feature Extraction
• Based on three groups of features
• Global Features (17 elements)
  • Information about the entire structure
  • Examples: signature height, height-to-width ratio, etc.
• Grid Features (576 elements)
  • Virtual grid of 8x12 cells
  • Per cell: pixel density (1 feature), pixel distribution (4 features), predominant axial slant (1 feature)
• Texture Features (768 elements)
  • Same virtual grid of 8x12 cells
  • A 2x2 co-occurrence matrix describes the transitions between black and white pixels
  • Only p01 (black-to-white) and p11 (black-to-black) transitions are considered (4x2 = 8 features)
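The slide names the feature groups without giving formulas, so the sketch below only illustrates two representative measures on a skeleton image: per-cell pixel density for the 8x12 grid features, and horizontal p11/p01 transition counts as one contribution to the texture features. The helper names and the choice of the horizontal direction are illustrative assumptions.

import numpy as np

def cell_pixel_density(skel, rows=8, cols=12):
    # Grid-feature example: fraction of ink pixels in each cell of the
    # 8x12 virtual grid (skel is a boolean skeleton image).
    h, w = skel.shape
    density = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            cell = skel[r * h // rows:(r + 1) * h // rows,
                        c * w // cols:(c + 1) * w // cols]
            density[r, c] = cell.mean() if cell.size else 0.0
    return density

def horizontal_transitions(cell):
    # Texture-feature example: p11 (black-to-black) and p01 (black-to-white)
    # co-occurrences between horizontally adjacent pixels of one cell.
    left, right = cell[:, :-1], cell[:, 1:]
    p11 = int(np.sum(left & right))       # ink followed by ink
    p01 = int(np.sum(left & ~right))      # ink followed by background
    return p11, p01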
Normalization & Vector Quantization
• Global features are normalized to the range [0,1]
  • Take the maximum of each global feature across all signatures and all signers
  • Divide each feature by the respective maximum
• Vector quantization is used for grid and texture features
  • K-means algorithm
  • Single codebook with 50 codewords
  • Classify all column vectors of all signatures and all signers
  • Replace each feature column vector of the 8x12 grid with the corresponding codeword
  • Normalize the quantized feature vectors
• Grid Features
  • 576-element grid feature vector
  • 6 features x 12 columns x 8-element codewords
• Texture Features
  • 768-element texture feature vector
  • 8 features x 12 columns x 8-element codewords
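A minimal sketch of the vector quantization step using scikit-learn's KMeans. The single codebook of 50 codewords follows the slide; the assumption that the 8-element column vectors of all signatures and signers are stacked into one (N, 8) array, and the helper names, are illustrative.

import numpy as np
from sklearn.cluster import KMeans

def build_codebook(column_vectors, n_codewords=50, seed=0):
    # Single codebook over the 8-element column vectors of all signatures
    # and all signers; column_vectors has shape (N, 8).
    return KMeans(n_clusters=n_codewords, n_init=10, random_state=seed).fit(column_vectors)

def quantize(column_vectors, codebook):
    # Replace every column vector with its nearest codeword.
    labels = codebook.predict(column_vectors)
    return codebook.cluster_centers_[labels]

Quantizing the 8-element column vectors of the 6 grid feature types over the 12 columns and concatenating them yields the 576-element grid vector; the 8 texture feature types give the 768-element texture vector.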
Classification
• X represents a signature with n features (elements)
  • n depends on the set of features used
    • Global: n = 17
    • Grid: n = 576
    • Texture: n = 768
• M is the number of signature models (signers), i.e. 65
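The slide fixes only the input dimensionality n and the number of signature models M, so the sketch below is a generic single-stage RBFNN rather than the authors' exact training procedure: Gaussian hidden units centred by k-means, a shared width heuristic, and a linear output layer (one unit per signer) fitted by regularized least squares. The number of hidden units and all hyperparameters are assumptions.

import numpy as np
from scipy.spatial.distance import pdist
from sklearn.cluster import KMeans

class RBFNN:
    # Generic single-stage RBF network: Gaussian hidden layer + linear outputs.
    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden = n_hidden
        self.seed = seed

    def _phi(self, X):
        # Gaussian activation of every sample with respect to every centre.
        d2 = ((X[:, None, :] - self.centers_[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * self.sigma_ ** 2))

    def fit(self, X, Y):
        # X: (n_samples, n) feature vectors; Y: (n_samples, M) one-hot signer targets.
        km = KMeans(n_clusters=self.n_hidden, n_init=10, random_state=self.seed).fit(X)
        self.centers_ = km.cluster_centers_
        self.sigma_ = pdist(self.centers_).mean()     # heuristic common width
        H = self._phi(X)
        # Output weights by regularized least squares.
        self.W_ = np.linalg.solve(H.T @ H + 1e-6 * np.eye(self.n_hidden), H.T @ Y)
        return self

    def predict(self, X):
        # Scores per signer model; verification thresholds these scores.
        return self._phi(X) @ self.W_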
Results
• Best results
  • Obtained by combining global and grid features into a 593-element feature vector
• Least effective features
  • Texture features
  • FRR: 6.94%, FAR: 4.89%
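FRR and FAR here, and TER and MER on the next slide, follow the standard definitions; the identities TER = FRR + FAR and MER = (FRR + FAR)/2 are inferred from the fact that they hold for the quoted figures. A minimal sketch with hypothetical score arrays and decision threshold:

import numpy as np

def error_rates(genuine_scores, forgery_scores, threshold):
    # FRR: fraction of genuine signatures rejected (score below threshold).
    # FAR: fraction of forgeries accepted (score at or above threshold).
    genuine_scores = np.asarray(genuine_scores, dtype=float)
    forgery_scores = np.asarray(forgery_scores, dtype=float)
    frr = float(np.mean(genuine_scores < threshold))
    far = float(np.mean(forgery_scores >= threshold))
    ter = frr + far              # total error rate (inferred: TER = FRR + FAR)
    mer = (frr + far) / 2.0      # mean error rate (inferred: MER = TER / 2)
    return frr, far, ter, mer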
Conclusion
• A single-stage RBFNN is an effective architecture for OHSV
• Performance results
  • TER: 4.08%, MER: 2.04%, FRR: 1.58%, FAR: 2.5%
  • The performance compares well with results reported in the literature
    • Baltzakis & Papamarkos (2001), two-stage RBFNN: TER: 12.81%, MER: 6.41%, FRR: 3%, FAR: 9.81%
    • Justino et al. (2001), HMM classifier: MER: 2.135%
• Future Work
  • Extend the system evaluation to simple and skilled forgeries
  • Use an adaptive technique to determine the required number of codebooks and codewords for VQ
  • Investigate feature vector dimension reduction techniques, e.g. Principal Component Analysis
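As an illustration of the dimension-reduction direction mentioned under future work, a minimal PCA sketch with scikit-learn; the random stand-in data and the choice of 100 components are purely illustrative assumptions.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
features = rng.normal(size=(2492, 593))   # stand-in for the 593-element Global+Grid vectors

pca = PCA(n_components=100)               # 100 components: arbitrary illustrative choice
reduced = pca.fit_transform(features)
print(reduced.shape, round(pca.explained_variance_ratio_.sum(), 3))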