Characteristic number regression for fiducial facial feature extraction



  1. Characteristic number regression for fiducial facial feature extraction Presenter: Xin Fan Joint work with Prof. Zhongxuan Luo and Dr. Risheng Liu Published in IEEE TIP’15 and ICME’15

  2. Outline  Motivations  Facial structure with geometric invariants  Facial feature extraction with geometry regressions

  3. Motivations  Fiducial facial point localization under pose/viewpoint changes (perspective transformation).

  4. Motivations  Geometry works in parallel with (and complementary to) multi-view associations for facial analysis

  5. Motivations  Geometrical constraints are always important for facial analysis  Explicit shape modelling: ASM, AAM, CLM  Implicit texture/shape regressions: the mapping from texture to shape (Fern regressor [CVPR’12], classifier pruning [ICCV’13], SDM [ICCV’13], random forest [CVPR’14], deep networks [ECCV’14]); highly dependent on the availability of training examples  Explicit shape regressions: the mapping from geometry to geometry

  6. Outline  Motivations  Facial structure with geometric invariants  Facial feature extraction with geometry regressions

  7. Facial geometry  Human faces are highly structured and present common geometries across age, gender, and race of individuals.  Eye corners are collinear [Gee94].  The lines connecting the eye corners, nostrils and mouth corners are mutually parallel.  The line through the eye corners is perpendicular to the line connecting the midpoints of the nostrils and mouth corners.  The parallelism and perpendicularity involve more points and vary with viewpoint.  The characteristic number describes the intrinsic geometry given by more points, and is preserved under viewpoint changes.

  8. Characteristic ratio - Definition [Luo10]  A point $p_1$ on the line through $u$ and $v$ can be written as $p_1 = a_1 u + b_1 v$; its characteristic ratio with respect to $u$ and $v$ is the coefficient ratio $b_1 / a_1$.

  9. Characteristic ratio - Definition [Luo10]  For $n$ points $p_1, p_2, \dots, p_n$ on the line through $u$ and $v$, write $p_i = a_i u + b_i v$ ($i = 1, \dots, n$); the characteristic ratio of the $n$ points is $[u, v; p_1, \dots, p_n] = \prod_{i=1}^{n} b_i / a_i$.

  10. Characteristic number  Definition derived from the characteristic ratio  Let $P_1, \dots, P_r$ be $r$ distinct points in $m$-dimensional projective space that form a loop ($P_{r+1} = P_1$).  On each segment $P_i P_{i+1}$ there are $n$ points $Q_i^{(j)} = a_{i,i+1}^{(j)} P_i + b_{i,i+1}^{(j)} P_{i+1}$, $j = 1, \dots, n$.  The characteristic number (CN) is $\mathrm{CN}\big(\{P_i\}_{i=1,\dots,r}; \{Q_i^{(j)}\}_{j=1,\dots,n}\big) = \prod_{i=1}^{r} \prod_{j=1}^{n} b_{i,i+1}^{(j)} / a_{i,i+1}^{(j)}$.
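
A minimal numerical sketch of this definition (not code from the papers; the function names and the test configuration are illustrative): the coefficients $a$ and $b$ of each $Q_i^{(j)}$ are recovered by least squares in homogeneous coordinates and the ratios are multiplied. The final check uses three collinear points on the side lines of a triangle, for which the CN should be -1 (the line case discussed on a later slide).

```python
import numpy as np

def decompose(q, p_i, p_j):
    """Least-squares coefficients (a, b) with q ~ a * p_i + b * p_j (homogeneous)."""
    A = np.stack([p_i, p_j], axis=1)                # 3 x 2 matrix [p_i | p_j]
    (a, b), *_ = np.linalg.lstsq(A, q, rcond=None)
    return a, b

def characteristic_number(loop_pts, side_pts):
    """loop_pts: r homogeneous 3-vectors P_1..P_r forming a loop (P_{r+1} = P_1).
    side_pts[i]: the n points Q_i^(1..n) lying on the line P_i P_{i+1}.
    Returns the product of the coefficient ratios b / a over all r*n points."""
    r, cn = len(loop_pts), 1.0
    for i in range(r):
        p_i, p_next = loop_pts[i], loop_pts[(i + 1) % r]
        for q in side_pts[i]:
            a, b = decompose(q, p_i, p_next)
            cn *= b / a
    return cn

# Check: three collinear points (on the line x + 2y = 0.5) cut the side lines of a
# triangle, and the resulting CN is -1.
P = [np.array([0., 0., 1.]), np.array([1., 0., 1.]), np.array([0., 1., 1.])]
Q = [[np.array([0.5, 0., 1.])],                     # on the line P1 P2
     [np.array([1.5, -0.5, 1.])],                   # on the line P2 P3
     [np.array([0., 0.25, 1.])]]                    # on the line P3 P1
print(characteristic_number(P, Q))                  # approx -1.0
```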

  11. Characteristic number - Properties  Theorem: the characteristic number is a projective invariant [Luo14].  The characteristic number extends the cross ratio:  more points are included; for $r = 2$, $n = 2$ it reduces to the cross ratio of $P_1$, $P_2$, $Q_1^{(1)}$, $Q_1^{(2)}$.  It relaxes the collinearity and coplanarity constraints.
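
A hedged numerical check of the invariance theorem, reusing characteristic_number(), P and Q from the sketch above (the random homography and seed are arbitrary choices): transform all homogeneous points, re-normalize them to the image plane, and compare the two CN values.

```python
import numpy as np
# Reuses characteristic_number(), P and Q from the previous sketch.
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 3))                    # a generic 3x3 matrix is an invertible homography
to_plane = lambda x: x / x[-1]                 # re-normalize so the last coordinate is 1
P_t = [to_plane(H @ p) for p in P]
Q_t = [[to_plane(H @ q) for q in side] for side in Q]
print(characteristic_number(P, Q), characteristic_number(P_t, Q_t))   # both approx -1
```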

  12. Intrinsic properties of a curve  Characteristic number of a line [Luo10]  A line meets the three side lines of the triangle with vertices $u$, $w$, $v$ at $P = a_1^{(a)} u + b_1^{(a)} w$, $Q = a_1^{(b)} w + b_1^{(b)} v$ and $R = a_1^{(c)} v + b_1^{(c)} u$, and $\kappa = [u, w; P]\,[w, v; Q]\,[v, u; R] = \dfrac{b_1^{(a)} b_1^{(b)} b_1^{(c)}}{a_1^{(a)} a_1^{(b)} a_1^{(c)}} = -1$ (Menelaus' theorem).

  13. Intrinsic properties of a curve  Characteristic number of a conic [Luo10]  A conic meets the three side lines of the triangle $u$, $w$, $v$ in two points each, $p_k = a_k^{(a)} u + b_k^{(a)} w$, $q_k = a_k^{(b)} w + b_k^{(b)} v$, $r_k = a_k^{(c)} v + b_k^{(c)} u$ for $k = 1, 2$, and $\kappa_2 = [u, w; p_1, p_2]\,[w, v; q_1, q_2]\,[v, u; r_1, r_2] = \dfrac{b_1^{(a)} b_2^{(a)} b_1^{(b)} b_2^{(b)} b_1^{(c)} b_2^{(c)}}{a_1^{(a)} a_2^{(a)} a_1^{(b)} a_2^{(b)} a_1^{(c)} a_2^{(c)}} = 1$.

  14. Intrinsic properties of a curve  Characteristic number of a curve of degree $n$ [Luo10]  A curve of degree $n$ meets each of the three side lines of the triangle $u$, $w$, $v$ in $n$ points $p_1^{(a)}, \dots, p_n^{(a)}$, $p_1^{(b)}, \dots, p_n^{(b)}$, $p_1^{(c)}, \dots, p_n^{(c)}$, and its characteristic number is $\kappa_n = [u, w; p_1^{(a)}, \dots, p_n^{(a)}]\,[w, v; p_1^{(b)}, \dots, p_n^{(b)}]\,[v, u; p_1^{(c)}, \dots, p_n^{(c)}] = (-1)^n$.  This property does not depend on the choice of the three lines, and reflects the intrinsic geometry of the curve.
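
A similar hedged check for the degree-2 case, again reusing characteristic_number() from the earlier sketch (the circle, the inner triangle and the helper below are illustrative choices): every side line of a small triangle placed inside the unit circle cuts the circle twice, and the product of the six coefficient ratios should be (-1)^2 = 1.

```python
import numpy as np
# Reuses characteristic_number() from the earlier sketch.
tri = [np.array([-0.5, -0.3, 1.]), np.array([0.5, -0.3, 1.]), np.array([0.0, 0.4, 1.])]

def circle_hits(p, q):
    """The two intersections of the line through affine points p, q with x^2 + y^2 = 1."""
    d = q[:2] - p[:2]
    coeffs = [d @ d, 2.0 * (p[:2] @ d), p[:2] @ p[:2] - 1.0]    # |p + t*d|^2 = 1
    ts = np.roots(coeffs).real                  # roots are real: the triangle lies inside the circle
    return [np.append(p[:2] + t * d, 1.0) for t in ts]

sides = [circle_hits(tri[i], tri[(i + 1) % 3]) for i in range(3)]
print(characteristic_number(tri, sides))        # approx +1, i.e. (-1)**2
```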

  15. Extension to hypersurfaces [Luo14]  A hypersurface of degree $n$ in $m$-dimensional projective space that does not pass through any $P_i$ intersects each line $P_i P_{i+1}$ in exactly $n$ points $Q_i^{(j)}$, $j = 1, \dots, n$, $i = 1, \dots, r$.  The characteristic number of $P_1, P_2, \dots, P_r$ with respect to these $rn$ points is $\left((-1)^{m+1}\right)^{n}$.  Conversely, if the characteristic number equals this value, then all the points lie on a hypersurface of degree $n$.  This property is independent of the existence of the hypersurface and/or curve.

  16. Facial geometry  Human faces are highly structured and present common geometries across age, gender, and race of individuals.  Eye corners are collinear [Gee94].  The lines connecting the eye corners, nostrils and mouth corners are mutually parallel.  The line through the eye corners is perpendicular to the line connecting the midpoints of the nostrils and mouth corners.  The parallelism and perpendicularity involve more points and vary with viewpoint.  The characteristic number describes the intrinsic geometry given by more points, and is preserved under viewpoint changes.

  17. Facial geometry given by CN  (figure panels (a), (b), (c), (d))  These invariant priors, reported here for the first time to the best of our knowledge, reflect common facial geometries similar to collinearity but on a larger scale, involving more points and more facial components.

  18. Facial geometry given by CN  Verification on LFW (10k+ images)

  19. Facial geometry given by CN  Verification on our collections

  20. Outline  Motivations  Facial structure with geometric invariants  Facial feature extraction with geometry regressions

  21. Objective  Fiducial facial point localization under pose/viewpoint changes (perspective transformation).

  22. Motivations  Geometrical constraints are always important for facial analysis  Explicit shape modelling: ASM, AAM, CLM  Implicit texture/shape regressions: the mapping from texture to shape (Fern regressor [CVPR’12], classifier pruning [ICCV’13], SDM [ICCV’13], random forest [CVPR’14], deep networks [ECCV’14])  Explicit shape regressions: the mapping from geometry to geometry

  23. Formulation  Find the points satisfying the CN constraints  Gradient descent to minimize the energy (TIP’15)  Build the regression (mapping) from point configurations to target CN values. (ICME’15)
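
A minimal sketch of the first idea (TIP’15 is the reference for the approach, but the quadratic penalty, the proximity term, the step size and the function names below are illustrative assumptions, not the published formulation): starting from an initial landmark estimate, plain finite-difference gradient descent drives the configuration toward prescribed CN target values.

```python
import numpy as np

def cn_energy(x, cn_fn, kappa, x0, lam=0.1):
    """Quadratic penalty on deviations from the CN targets kappa,
    plus a proximity term keeping x close to the initial estimate x0."""
    return np.sum((cn_fn(x) - kappa) ** 2) + lam * np.sum((x - x0) ** 2)

def refine_landmarks(x0, cn_fn, kappa, step=1e-3, iters=500, eps=1e-6):
    """cn_fn maps a flattened landmark configuration to a vector of CN values
    (e.g. built on the characteristic_number sketch shown earlier);
    kappa holds the target invariants measured on training faces."""
    x = x0.astype(float).copy()
    for _ in range(iters):
        grad = np.zeros_like(x)
        for k in range(x.size):                 # central-difference gradient
            d = np.zeros_like(x)
            d[k] = eps
            grad[k] = (cn_energy(x + d, cn_fn, kappa, x0)
                       - cn_energy(x - d, cn_fn, kappa, x0)) / (2.0 * eps)
        x -= step * grad
    return x
```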

  24. Fiducial point localization with CN priors  Landmark errors using collinearity only, all CN constraints, and no shape constraints [TIP’15].

  25. Localization results-PIE

  26. Localization results-children

  27. Comparisons with the state-of-the-art  Comparisons on LFW

  28. Comparisons with the state-of-the-art  Comparisons on Helen

  29. Comparisons with the state-of-the-art  Comparisons on LFPW

  30. Implicit regressions are sensitive to training sets  Training: 400 faces for each of the 15 poses (400*15)  Test: 140 faces for each of the 15 poses (140*15)  Normalized errors:

  Pose   1       2       3       4       5       6       7       8       9       10      11      12      13      14      15
  ESR    0.1056  0.0940  0.1192  0.0944  0.1049  0.0980  0.0937  0.1090  0.0929  0.1008  0.1051  0.0968  0.1065  0.0990  0.1036
  SDM    0.0883  0.0793  0.0752  0.0797  0.1122  0.0839  0.0691  0.0675  0.0753  0.0947  0.0806  0.0733  0.0725  0.0795  0.0906
  LBF    0.0416  0.0361  0.0339  0.0366  0.0407  0.0395  0.0342  0.0338  0.0334  0.0402  0.0428  0.0371  0.0371  0.0368  0.0430

  31. Implicit regressions are sensitive to training sets  Training: 400 faces for each of poses 1, 8 and 15 (400*3)  Test: 140 faces for each of the 15 poses (140*15)  Normalized errors:

  Pose   1       2       3       4       5       6       7       8       9       10      11      12      13      14      15
  ESR    0.1087  0.1113  0.1109  0.1760  0.3152  0.1765  0.1417  0.0419  0.1155  0.1609  0.2567  0.1801  0.0995  0.1030  0.1020
  SDM    0.0767  0.1041  0.1677  0.2780  0.4804  0.1731  0.0970  0.0646  0.1048  0.1711  0.3006  0.1320  0.0821  0.0903  0.0791
  LBF    0.0429  0.0467  0.0429  0.0505  0.0514  0.0508  0.1882  0.0343  0.0481  0.0567  0.0567  0.0603  0.0546  0.0538  0.0431
