Facial Action Unit Detection using 3D Face ...
Document type:
Conference paper with proceedings
Title:
Facial Action Unit Detection using 3D Face Landmarks for Pain Detection
Author(s):
Feghoul, Kevin [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Lille Neurosciences & Cognition - U 1172 [LilNCog]
Bouazizi, Mondher [Author]
Faculty of Science and Technology [Tokyo, Keio University]
Santana, Deise [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Conference:
45th Annual International Conference of the IEEE Engineering in Medicine and Biology Society
City:
Sydney
Country:
Australia
Conference start date:
2023-07-24
HAL discipline(s):
Computer Science [cs]/Computer Vision and Pattern Recognition [cs.CV]
Abstract (English):
Automatic detection of facial action units (AUs) has recently gained attention for its applications in facial expression analysis. However, using AUs in research can be challenging since they are typically manually annotated, which can be time-consuming, repetitive, and error-prone. Advancements in automated AU detection can greatly reduce the time required for this task and improve the reliability of annotations for downstream tasks, such as pain detection. In this study, we present an efficient method for detecting AUs using only 3D face landmarks. Using the detected AUs, we trained state-of-the-art deep learning models to detect pain, which validates the effectiveness of the AU detection model. Our study also establishes a new benchmark for pain detection on the BP4D+ dataset, demonstrating an 11.13% improvement in F1-score and a 3.09% improvement in accuracy using a Transformer model compared to existing studies. Our results show that utilizing only eight predicted AUs still achieves competitive results when compared to using all 34 ground-truth AUs.
Language:
English
Peer-reviewed:
Yes
Audience:
International
Popular science:
No
Collections:
Source:
Files:
- EMBC_final.pdf (open access)