Multi-view deep features for robust facial kinship verification
Document type:
Conference paper with published proceedings
Title:
Multi-view deep features for robust facial kinship verification
Author(s):
Laiadi, Oualid [Author]
Laboratoire Energie Signal Images et Automatique [Univ Ngaoundéré] [LESIA]
Institut d’Électronique, de Microélectronique et de Nanotechnologie - UMR 8520 [IEMN]
COMmunications NUMériques - IEMN [COMNUM - IEMN]
Ouamane, Abdelmalik [Author]
Université Mohamed Khider de Biskra [BISKRA]
Benakcha, Abdelhamid [Author]
Laboratoire de Génie Electrique [Univ. Biskra] [LGEB]
Tahleb Ahmed, Abdelmalik [Author]
Institut d’Électronique, de Microélectronique et de Nanotechnologie - UMR 8520 [IEMN]
COMmunications NUMériques - IEMN [COMNUM - IEMN]
Hadid, Abdenour [Author]
University of Oulu
Institut d’Électronique, de Microélectronique et de Nanotechnologie - UMR 8520 [IEMN]
COMmunications NUMériques - IEMN [COMNUM - IEMN]
Scientific editor(s):
Struc, V
GomezFernandez, F
Conference title:
15th IEEE International Conference on Automatic Face and Gesture Recognition (FG)
City:
Buenos Aires
Country:
Argentina
Conference start date:
2020-11-16
Book title:
15th IEEE International Conference on Automatic Face and Gesture Recognition (FG)
Publisher:
IEEE
Publication date:
2020
English keyword(s):
Feature extraction
Tensors
Measurement
Face recognition
Databases
Covariance matrices
Computer architecture
HAL discipline(s):
Engineering sciences [physics]
Computer science [cs]
Computer science [cs]/Artificial intelligence [cs.AI]
Computer science [cs]/Networks and telecommunications [cs.NI]
Engineering sciences [physics]/Signal and image processing [eess.SP]
Engineering sciences [physics]/Electronics
English abstract: [en]
Automatic kinship verification from facial images is an emerging research topic in the machine learning community. In this paper, we propose an effective facial feature extraction model based on multi-view deep features. We use four pre-trained deep learning models with eight feature layers (the FC6 and FC7 layers of each of the VGG-F, VGG-M, VGG-S, and VGG-Face models) to train the proposed Multilinear Side-Information based Discriminant Analysis integrating Within-Class Covariance Normalization (MSIDA+WCCN) method. Furthermore, we show how metric learning methods that integrate WCCN improve the Simple Scoring Cosine similarity (SSC) method, which we used in the RFIW'20 competition on the concatenation of the eight deep features. The integration of WCCN into the metric learning methods reduces the effect of the intra-class variations introduced by the deep feature weights. We evaluate the proposed method on two kinship benchmarks, the KinFaceW-I and KinFaceW-II databases, using four parent-child relations (Father-Son, Father-Daughter, Mother-Son, and Mother-Daughter). The proposed MSIDA+WCCN method improves on the SSC method by 12.80% and 14.65% on the KinFaceW-I and KinFaceW-II databases, respectively. The obtained results compare favorably against recent methods, including those relying on deep learning.
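The scoring idea described in the abstract (project features with a WCCN whitening transform, then compare pairs with cosine similarity) can be sketched as follows. This is a minimal illustration under assumed conventions (per-class covariance averaging, eigendecomposition-based inverse square root), not the authors' exact implementation; all variable names and the synthetic data are hypothetical.

```python
import numpy as np

def wccn_transform(features, labels):
    """Estimate a WCCN whitening matrix from labeled training features.

    WCCN whitens the within-class covariance so that directions of high
    intra-class variability are down-weighted before scoring.
    """
    features = np.asarray(features, dtype=float)
    classes = np.unique(labels)
    d = features.shape[1]
    W = np.zeros((d, d))
    for c in classes:
        X = features[labels == c]
        Xc = X - X.mean(axis=0)          # center each class
        W += Xc.T @ Xc / len(X)          # per-class covariance
    W /= len(classes)
    W += 1e-6 * np.eye(d)                # regularize for stability
    vals, vecs = np.linalg.eigh(W)       # symmetric eigendecomposition
    return vecs @ np.diag(vals ** -0.5) @ vecs.T  # W^{-1/2}

def cosine_score(x, y):
    """Simple cosine-similarity score between two feature vectors."""
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

# Hypothetical usage: synthetic stand-ins for deep features of face pairs.
rng = np.random.default_rng(0)
train = rng.normal(size=(40, 8))           # 40 training vectors, dim 8
labels = np.repeat(np.arange(10), 4)       # 10 classes, 4 samples each
B = wccn_transform(train, labels)
parent, child = rng.normal(size=8), rng.normal(size=8)
score = cosine_score(B @ parent, B @ child)  # higher score -> more likely kin
```

In this sketch the whitening matrix is symmetric, so it can be applied to both members of a pair before scoring; a decision is then made by thresholding the cosine score.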
Language:
English
Peer-reviewed:
Yes
Audience:
International
Popular science:
No
Comment:
ISBN 978-1-7281-3079-8
Source:
Files
- http://jultika.oulu.fi/files/nbnfi-fe202103258326.pdf
- Open access
- Access the document
- https://hal.archives-ouvertes.fr/hal-03322818/document
- Open access
- Access the document
- http://arxiv.org/pdf/2006.01315
- Open access
- Access the document