Facial Action Unit Detection using 3D Face Landmarks for Pain Detection
Document type :
Conference paper with proceedings
Title :
Facial Action Unit Detection using 3D Face Landmarks for Pain Detection
Author(s) :
Feghoul, Kevin [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Lille Neurosciences & Cognition - U 1172 [LilNCog]
Bouazizi, Mondher [Author]
Faculty of Science and Technology [Tokyo, Keio University]
Santana, Deise [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Conference title :
45th Annual International Conference of the IEEE Engineering in Medicine and Biology Society
City :
Sydney
Country :
Australia
Start date of the conference :
2023-07-24
HAL domain(s) :
Computer Science [cs]/Computer Vision and Pattern Recognition [cs.CV]
English abstract : [en]
Automatic detection of facial action units (AUs) has recently gained attention for its applications in facial expression analysis. However, using AUs in research can be challenging since they are typically manually annotated, which can be time-consuming, repetitive, and error-prone. Advancements in automated AU detection can greatly reduce the time required for this task and improve the reliability of annotations for downstream tasks, such as pain detection. In this study, we present an efficient method for detecting AUs using only 3D face landmarks. Using the detected AUs, we trained state-of-the-art deep learning models to detect pain, which validates the effectiveness of the AU detection model. Our study also establishes a new benchmark for pain detection on the BP4D+ dataset, demonstrating an 11.13% improvement in F1-score and a 3.09% improvement in accuracy using a Transformer model compared to existing studies. Our results show that utilizing only eight predicted AUs still achieves competitive results when compared to using all 34 ground-truth AUs.
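To make the pipeline described in the abstract concrete, the snippet below is a minimal, illustrative sketch (not the authors' implementation) of a Transformer classifier that takes per-frame AU activations, such as the eight predicted AUs mentioned above, and outputs a pain/no-pain decision. The class name AUPainTransformer, the sequence length, and all hyperparameters are assumptions chosen only for illustration.

```python
# Illustrative sketch, assuming per-frame AU activations predicted from 3D
# face landmarks; this is NOT the paper's actual architecture or settings.
import torch
import torch.nn as nn


class AUPainTransformer(nn.Module):
    def __init__(self, num_aus: int = 8, d_model: int = 64, nhead: int = 4,
                 num_layers: int = 2, num_classes: int = 2):
        super().__init__()
        # Embed each frame's AU activation vector into the model dimension.
        self.input_proj = nn.Linear(num_aus, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        # Binary pain / no-pain classification head.
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, au_seq: torch.Tensor) -> torch.Tensor:
        # au_seq: (batch, frames, num_aus) AU activations per video clip.
        x = self.input_proj(au_seq)
        x = self.encoder(x)
        x = x.mean(dim=1)          # temporal average pooling over frames
        return self.classifier(x)  # (batch, num_classes) logits


if __name__ == "__main__":
    model = AUPainTransformer()
    dummy = torch.rand(4, 30, 8)   # 4 clips, 30 frames, 8 AU activations each
    print(model(dummy).shape)      # torch.Size([4, 2])
```

In this hypothetical setup, the AU detector would first convert each video frame's 3D face landmarks into an 8-dimensional activation vector, and the Transformer would then aggregate these vectors over time to produce the pain prediction.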
Language :
English
Peer reviewed article :
Yes
Audience :
International
Popular science :
No
Collections :
Source :
Files
- EMBC_final.pdf (Open access)