Understanding Gesture Articulations Variability
Document type:
Conference paper with proceedings
Permanent URL:
Title:
Understanding Gesture Articulations Variability
Author(s):
Erazo, Orlando [Author]
Universidad Técnica Estatal de Quevedo [UTEQ]
Rekik, Yosra [Author]
Methods and tools for gestural interactions [MINT]
Grisoni, Laurent [Author]
Methods and tools for gestural interactions [MINT]
Pino, José [Author]
Universidad de Chile = University of Chile [Santiago] [UCHILE]
Scientific editor(s):
Regina Bernhaupt
Girish Dalvi
Anirudha Joshi
Devanuj K. Balkrishan
Jacki O'Neill
Marco Winckler
Conference title:
16th IFIP Conference on Human-Computer Interaction (INTERACT)
City:
Bombay
Country:
India
Conference start date:
2017-09-25
Book title:
Lecture Notes in Computer Science
Journal title:
Human-Computer Interaction - INTERACT 2017
Publisher:
Springer International Publishing
Publication date:
2017
Keyword(s) in English:
Mid-air gestures
Whole body gestures
Gesture articulation
Gesture variability
Gesture taxonomy
HAL discipline(s):
Computer Science [cs]
Abstract in English: [en]
Interfaces based on mid-air gestures often use a one-to-one mapping between gestures and commands, but most remain very basic. In practice, people exhibit inherent variations in their gesture articulations, because gestures depend both on the person producing them and on the specific social or cultural context in which they are produced. We advocate that allowing applications to map many gestures to one command is a key step toward greater flexibility, fewer penalizations of users, and better interaction experiences. Accordingly, this paper presents our results on mid-air gesture variability. We are mainly concerned with understanding variability in mid-air gesture articulations from a purely user-centric perspective. We describe a comprehensive investigation of how users vary the production of gestures under unconstrained articulation conditions. The user study consisted of two tasks. The first provides a model of how users conceive and produce gestures; from this study we also derive an embodied taxonomy of gestures. This taxonomy serves as the basis for the second experiment, in which we perform a fine-grained quantitative analysis of gesture articulation variability. Based on these results, we discuss implications for gesture interface design.
Language:
English
Peer-reviewed:
Yes
Audience:
International
Popular science:
No
Comment:
Part 4: Information on Demand, on the Move, and Gesture Interaction
Collections:
Source:
Deposit date:
2022-06-12T02:12:44Z
Files
- https://hal.inria.fr/hal-01678464/document
- Open access
- Access the document