When Computers Decode your Social Intention
Document type:
Conference paper with published proceedings
DOI:
Title:
When Computers Decode your Social Intention
Author(s):
Desrosiers, Paul Audain [Author]
Laboratoire Sciences Cognitives et Sciences Affectives - UMR 9193 [SCALab]
Daoudi, Mohamed [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Coello, Yann [Author]
Laboratoire Sciences Cognitives et Sciences Affectives - UMR 9193 [SCALab]
Conference name:
14th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2019)
City:
Lille
Country:
France
Conference start date:
2019-05-14
Publisher:
IEEE
Publication date:
2019-05-01
HAL discipline(s):
Cognitive science
English abstract: [en]
In this demo session, we present a framework based on our paper [1]. In real time, we analyze the trajectories of the human arm to predict social intention (personal or social). The trajectories of the 3D markers acquired by a motion-capture (Mocap) system are represented in a shape space of open curves and are therefore analyzed on a Riemannian manifold. Experiments on a new dataset show an average recognition rate of about 68% for the proposed method, which is comparable to the average score obtained by human evaluation. The experimental results also show that the classification output could be used to improve social communication between humans and virtual agents. To the best of our knowledge, this is the first real-time demo that uses computer vision techniques to analyze the effect of social intention on motor action in order to improve social communication between a human and an avatar. The main goal is to categorize the user's intention into one of two classes, denoted {personal, social}.

The experiment has three parts: a) data acquisition and a learning step; b) classification; c) kinematic analysis of how subjects evolve while interacting with the avatar. All the scripts used in the study are written in Matlab and C/C++. The equipment used is: 1) Qualisys motion-capture cameras (Qualisys system). The Qualisys system is delivered with a desktop computer with 8 GB of RAM and an Intel Core i7-4770K processor (8 CPUs) at 3.5 GHz. The frame rate of the cameras can vary from 100 to 500 Hz. A black glove equipped with infrared reflective markers is also provided with the Qualisys system. 2) Matlab (version R2014a) installed on the desktop computer (Qualisys system); Qualisys provides a specific driver that allows the Matlab scripts to be coupled with their system. It is thus possible to control all the cameras directly from Matlab for real-time analysis, see Fig. 1.
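The abstract does not reproduce the underlying computation, so the following Matlab sketch (Matlab being the language named above) illustrates how marker trajectories can be compared in a shape space of open curves and assigned to one of the two classes {personal, social}. It assumes the square-root velocity (SRV) representation, a standard choice for Riemannian shape analysis of open curves; the exact formulation used in [1], the synthetic prototype trajectories, and the nearest-neighbour rule below are illustrative assumptions, not the authors' released code.

% srv_intention_demo.m -- illustrative sketch, not the authors' code.
% Assumption: hand-marker trajectories are compared in the shape space of
% open curves via the square-root velocity (SRV) representation, and a new
% trial is labelled {personal, social} by nearest-neighbour distance to one
% prototype trajectory per class. All data below are synthetic placeholders.
function srv_intention_demo()
    T = 100;
    t = linspace(0, 1, T)';

    % Synthetic prototype trajectories (T samples x 3 coordinates).
    % In the real demo these would come from the Qualisys markers.
    proto_social   = [t, 0.30*sin(pi*t), 0.10*t];   % e.g. reach towards the avatar
    proto_personal = [t, 0.10*sin(pi*t), 0.30*t];   % e.g. reach towards oneself

    % Synthetic test trajectory, slightly perturbed from the social prototype.
    test = [t, 0.28*sin(pi*t) + 0.01*randn(T,1), 0.12*t];

    d_social   = srv_distance(test, proto_social);
    d_personal = srv_distance(test, proto_personal);
    if d_social < d_personal, label = 'social'; else, label = 'personal'; end
    fprintf('d_social = %.3f, d_personal = %.3f -> predicted intention: %s\n', ...
            d_social, d_personal, label);
end

function q = srv(beta)
    % Square-root velocity function q(t) = beta'(t)/sqrt(|beta'(t)|) of a
    % sampled open curve (T x dim), normalised to unit L2 norm so that the
    % representation is invariant to scale.
    dbeta = finite_diff(beta);
    speed = sqrt(sum(dbeta.^2, 2));
    speed(speed < eps) = eps;                       % avoid division by zero
    q = bsxfun(@rdivide, dbeta, sqrt(speed));
    q = q / sqrt(sum(q(:).^2));
end

function d = srv_distance(b1, b2)
    % Geodesic distance between two curves on the unit sphere of SRV
    % functions (simplified: no reparameterisation or rotation alignment).
    q1 = srv(b1);
    q2 = srv(b2);
    ip = sum(q1(:) .* q2(:));
    d  = acos(max(min(ip, 1), -1));                 % clamp for numerical safety
end

function d = finite_diff(x)
    % Central finite differences along the first dimension (forward/backward
    % at the ends); the constant 1/dt factor cancels after normalisation.
    d = [x(2,:) - x(1,:); (x(3:end,:) - x(1:end-2,:)) / 2; x(end,:) - x(end-1,:)];
end

In a full pipeline of the kind described above, one would expect the prototypes to be replaced by the trajectories collected during the learning step (part a) and the distance computation to run on frame windows streamed from the motion-capture system rather than on precomputed arrays.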
Language:
English
Audience:
International
Popular science:
No