8vas. Jornadas de Ciencias de la Computación (XIII JCC)

Off-line Signature Verification: A Circular Grid-Based Feature Extraction Approach

Marianela Parodi <parodi@cifasis-conicet.gov.ar>
Juan Carlos Gómez <jcgomez@fceia.unr.edu.ar>

Laboratory for System Dynamics and Signal Processing
FCEIA, Universidad Nacional de Rosario
CIFASIS, CONICET, Argentina
www.cifasis-conicet.gov.ar

XIII JCC (Rosario, Argentina), October 28-29, 2010
Outline

1. Motivation for Off-line Signature Verification
2. Contributions
3. Circular Grid Feature Extraction Approach
4. SVM-based Classifier
5. Experiments and Results
6. Conclusions
7. Future Work
8. References
Motivation for Off-line Signature Verification

- Today's society's need for personal authentication has made automatic personal verification a fundamental task in many daily applications.
- Signature verification is the most popular method of identity verification.
- Financial and administrative institutions recognize signatures as a legal means of verifying an individual's identity.
- No invasive methods of collecting the signature are needed.
- The use of signatures is familiar to people in their everyday life.
Contributions

- A new feature extraction approach for off-line signature verification, based on a circular grid, is presented.
- Graphometric features used in the rectangular grid segmentation approach are adapted to this new grid geometry.
- A Support Vector Machine (SVM) based classifier scheme is used for the classification tasks, and a comparison between the rectangular and the circular grid approaches is performed.
Circular Grid Feature Extraction Approach

A circular chart enclosing the signature is divided into N identical sectors, and graphometric features are computed for each sector. The circular grid is placed so that its center matches the center of mass of the binary image of the signature.

Fig. 1: Features extracted from segmented sectors with the circular grid approach: (a) Segmented sector being analyzed; (b) Pixel Density Distribution; (c) Gravity Center Distance; (d) Gravity Center Angle.
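The grid placement step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation; the function and parameter names are my own, and black (signature) pixels are assumed to equal 1 in the binary image:

```python
import numpy as np

def sector_labels(img, n_sectors=8):
    """Assign each black pixel of a binary signature image to one of
    n_sectors identical angular sectors of a circular grid centered
    at the image's center of mass (black pixels = 1)."""
    ys, xs = np.nonzero(img)               # coordinates of signature pixels
    cy, cx = ys.mean(), xs.mean()          # center of mass of the binary image
    angles = np.arctan2(ys - cy, xs - cx)  # angle of each pixel, in (-pi, pi]
    # Shift to [0, 2*pi) and map to sector indices 0 .. n_sectors-1
    width = 2 * np.pi / n_sectors
    return np.floor((angles + np.pi) / width).astype(int) % n_sectors
```

Centering the grid on the center of mass makes the sector partition translation-invariant with respect to where the signature sits on the page.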
Circular Grid Feature Extraction Approach (cont.)

Some of the graphometric features used in rectangular grid segmentation are adapted to the new grid structure. Three static graphometric features are considered:

Pixel density distribution:
    x_PDi = (number of black pixels inside the sector) / (total number of pixels inside the sector),  i = 1, ..., N

Gravity center distance:
    x_DGCi = d_GCi / R,  i = 1, ..., N

Gravity center angle:
    x_AGCi = α_GCi / α_max,  with α_max = 2π / N,  i = 1, ..., N
Circular Grid Feature Extraction Approach (cont.)

Finally, the feature vector x_sign is composed of the features calculated for each of the N angular sectors into which the signature image is divided, i.e.,

    x_sign = [x_PD^T, x_DGC^T, x_AGC^T]^T,

where

    x_PD  = [x_PD1, x_PD2, ..., x_PDN]^T,
    x_DGC = [x_DGC1, x_DGC2, ..., x_DGCN]^T,
    x_AGC = [x_AGC1, x_AGC2, ..., x_AGCN]^T.
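The assembly of x_sign from the three per-sector features can be sketched as follows. This is a didactic sketch under my own assumptions (black pixels = 1, R taken as the grid radius, and the sector angle measured modulo the sector width); the paper's exact normalizations may differ:

```python
import numpy as np

def circular_grid_features(img, n_sectors=8):
    """Sketch of x_sign = [x_PD; x_DGC; x_AGC]: per-sector pixel density,
    normalized gravity-center distance, and normalized gravity-center angle."""
    ys, xs = np.indices(img.shape)
    cy, cx = np.argwhere(img).mean(axis=0)           # center of mass
    dy, dx = ys - cy, xs - cx
    ang = np.arctan2(dy, dx) + np.pi                 # angles in [0, 2*pi]
    a_max = 2 * np.pi / n_sectors                    # sector angular width
    sec = np.minimum((ang / a_max).astype(int), n_sectors - 1)
    R = np.hypot(dy, dx).max()                       # grid radius (assumption)
    x_pd, x_dgc, x_agc = [], [], []
    for i in range(n_sectors):
        in_sec = sec == i
        black = img.astype(bool) & in_sec
        x_pd.append(black.sum() / max(in_sec.sum(), 1))   # pixel density
        if black.any():
            gy, gx = ys[black].mean(), xs[black].mean()   # sector gravity center
            x_dgc.append(np.hypot(gy - cy, gx - cx) / R)  # d_GCi / R
            # angle of the gravity center within its sector, over a_max
            x_agc.append(((np.arctan2(gy - cy, gx - cx) + np.pi) % a_max) / a_max)
        else:
            x_dgc.append(0.0); x_agc.append(0.0)          # empty sector
    return np.concatenate([x_pd, x_dgc, x_agc])           # length 3 * n_sectors
```

For N sectors the resulting vector has 3N components, all normalized to [0, 1], which is convenient as input to the SVM classifier discussed next.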
SVM-based Classifier

- SVM is a relatively recent technique from statistical learning theory, developed by Vapnik.
- In recent years, SVM-based classifiers have shown promising performance in Automatic Signature Verification.

Separable Case

Fig. 2: Separable classification problem example: (a) Possible separating hyperplanes; (b) Selection of a unique hyperplane maximizing the distance to the nearest point of each class; (c) Optimal separating hyperplane that maximizes the margin.
SVM-based Classifier (cont.)

Consider the training set {x_k, y_k}_{k=1}^n, with input data x_k ∈ R^d and output data y_k ∈ {−1, +1}, and suppose that all the training data satisfy the following constraints:

    ω^T x_k + b ≥ +1,  for y_k = +1,
    ω^T x_k + b ≤ −1,  for y_k = −1.

These two constraints can be combined into

    y_k [ω^T x_k + b] − 1 ≥ 0,  k = 1, ..., n,

where ω is normal to the hyperplane, |b| / ‖ω‖₂ is the perpendicular distance from the hyperplane to the origin, and ‖ω‖₂ is the Euclidean norm of ω.
SVM-based Classifier (cont.)

The margin M, in this case, equals 2 / ‖ω‖₂, and the problem is solved by minimizing ‖ω‖₂ subject to the restrictions imposed by the data, i.e., by solving the following optimization problem:

    min_{ω,b}  J_P(ω) = (1/2) ω^T ω
    s.t.  y_k [ω^T x_k + b] ≥ 1,  k = 1, ..., n.
SVM-based Classifier (cont.)

Non-Separable Case

Fig. 3: Non-separable classification problem example.

In the non-separable case, one cannot avoid misclassifications. Slack variables ξ_k then have to be included in the formulation of the problem:

    y_k [ω^T x_k + b] ≥ 1 − ξ_k,  k = 1, ..., n.
SVM-based Classifier (cont.)

In this case, the optimization problem becomes:

    min_{ω,b,ξ}  J_P(ω, ξ) = (1/2) ω^T ω + c Σ_{k=1}^n ξ_k
    s.t.  y_k [ω^T x_k + b] ≥ 1 − ξ_k,  k = 1, ..., n,
          ξ_k ≥ 0,  k = 1, ..., n.
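The soft-margin primal above is usually solved via its dual, but an equivalent unconstrained form, (1/2)ω^Tω + c Σ max(0, 1 − y_k[ω^T x_k + b]), can be minimized directly by subgradient descent. The following is a didactic sketch of that idea, not the solver used in the experiments:

```python
import numpy as np

def train_linear_svm(X, y, c=1.0, lr=0.01, epochs=300):
    """Subgradient descent on the soft-margin primal
    (1/2)||w||^2 + c * sum_k max(0, 1 - y_k (w^T x_k + b)).
    Didactic sketch; hyperparameters c, lr, epochs are assumptions."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for k in range(n):
            margin = y[k] * (X[k] @ w + b)
            # Regularizer subgradient always applies; the hinge term
            # contributes only when the margin constraint is violated
            gw = w - (c * y[k] * X[k] if margin < 1 else 0.0)
            gb = -c * y[k] if margin < 1 else 0.0
            w -= lr * gw
            b -= lr * gb
    return w, b
```

The constant c plays the same role as in the primal above: larger c penalizes slack (misclassification) more heavily, at the cost of a smaller margin.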
SVM-based Classifier (cont.)

Non-linear Case

The extension from the linear to the nonlinear case is straightforward. The linear separating hyperplane is computed in a higher dimensional feature space, into which the input data are mapped by a nonlinear mapping φ(x). The classifier constraints in the nonlinear case are:

    y_k [ω^T φ(x_k) + b] ≥ 1 − ξ_k,  k = 1, ..., n.

No explicit construction of the nonlinear mapping φ(x) is needed, thanks to the so-called kernel trick: a kernel is defined as K(x_k, x_ℓ) = φ(x_k)^T φ(x_ℓ), for k, ℓ = 1, ..., n. The SVM solution can be found by solving the following optimization problem:

    min_{ω,b,ξ}  J_P(ω, ξ) = (1/2) ω^T ω + c Σ_{k=1}^n ξ_k
    s.t.  y_k [ω^T φ(x_k) + b] ≥ 1 − ξ_k,  k = 1, ..., n,
          ξ_k ≥ 0,  k = 1, ..., n.
SVM-based Classifier (cont.)

The SVM classifier takes the following form:

    y(x) = sign[ Σ_{k=1}^n α_k y_k K(x, x_k) + b ].

Different kernels have been used in the literature to solve pattern recognition problems. Linear, Polynomial and Radial Basis Function (RBF) kernels are among the most popular:

    K_linear(x_k, x_ℓ)     = x_k^T x_ℓ,
    K_polynomial(x_k, x_ℓ) = (1 + x_k^T x_ℓ)^d,
    K_RBF(x_k, x_ℓ)        = exp(−‖x_k − x_ℓ‖₂² / σ²).
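The decision function above can be evaluated directly once the multipliers α_k and bias b are known. The sketch below implements the RBF kernel from the slide and the sign-of-weighted-kernel-sum classifier; the support vectors, α values and b are assumed inputs (in practice they come from solving the dual problem), not values from the paper:

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    """K_RBF(a, b) = exp(-||a - b||_2^2 / sigma^2), as defined above."""
    return np.exp(-np.sum((a - b) ** 2) / sigma ** 2)

def svm_decision(x, support_x, support_y, alphas, b, kernel=rbf_kernel):
    """y(x) = sign( sum_k alpha_k * y_k * K(x, x_k) + b ).
    support_x/support_y/alphas/b are assumed to come from training."""
    s = sum(a * y * kernel(x, xk)
            for a, y, xk in zip(alphas, support_y, support_x))
    return int(np.sign(s + b))
```

With the RBF kernel, a test point is effectively classified by a σ-weighted similarity vote of the support vectors, which is why σ (like c) must be tuned per problem.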