FEETICHE: FEET Input for Contactless Hand gEsture Interaction
Document type :
Conference paper with proceedings
DOI :
Title :
FEETICHE: FEET Input for Contactless Hand gEsture Interaction
Author(s) :
Lopes, Daniel Simões [Auteur]
Relvas, Filipe [Auteur]
Paulo, Soraia Figueiredo [Auteur]
Rekik, Yosra [Auteur]
Laboratoire d'Automatique, de Mécanique et d'Informatique industrielles et Humaines - UMR 8201 [LAMIH]
Grisoni, Laurent [Auteur]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Jorge, Joaquim [Auteur]
Conference title :
17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry, VRCAI 2019
City :
Brisbane, QLD
Country :
Australia
Start date of the conference :
2019-11-14
Publisher :
ACM
English keyword(s) :
3D manipulation
Foot interaction
Hand gestures
Heel rotation
Large screens
Selection
Tapping
HAL domain(s) :
Informatique [cs]
English abstract : [en]
Foot input has been proposed to support hand gestures in many interactive contexts; however, little attention has been given to contactless 3D object manipulation. This is important since many applications, namely sterile surgical theaters, require contactless operation. However, relying solely on hand gestures makes it difficult to specify precise interactions, since hand movements are difficult to segment into command and interaction modes. The unfortunate results range from unintended activations to noisy interactions and misrecognized commands. In this paper, we present FEETICHE, a novel set of multi-modal interactions combining hand and foot input, driven by foot tapping and heel rotation, to support contactless 3D manipulation tasks while standing in front of large displays. We use depth-sensing cameras to capture both hand and foot gestures, and we developed a simple yet robust motion-capture method to track dominant-foot input. Through two experiments, we assess how well foot gestures support mode switching and how this frees the hands to perform accurate manipulation tasks. Results indicate that users effectively rely on foot gestures to improve mode switching and reveal improved accuracy on both rotation and translation tasks.
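To make the abstract's mode-switching idea concrete, below is a minimal sketch (not the authors' code) of how foot taps and heel rotations could be detected from per-frame 3D positions of the dominant foot's heel and toe, as a depth sensor might report them. All names and thresholds (FootFrame, TAP_LIFT, ROT_DEG) are illustrative assumptions, not values from the paper.

# Hypothetical sketch: foot-gesture events from tracked heel/toe points.
# Assumes a depth sensor yields metric 3D positions per frame, with y = height.
import math
from dataclasses import dataclass

@dataclass
class FootFrame:
    heel: tuple[float, float, float]  # (x, y, z) in metres
    toe: tuple[float, float, float]

TAP_LIFT = 0.03   # illustrative: toe counts as "lifted" above 3 cm
ROT_DEG = 15.0    # illustrative: yaw change that counts as a heel rotation

def foot_yaw(frame: FootFrame) -> float:
    """Yaw of the heel-to-toe axis in the ground (x, z) plane, in degrees."""
    dx = frame.toe[0] - frame.heel[0]
    dz = frame.toe[2] - frame.heel[2]
    return math.degrees(math.atan2(dz, dx))

def detect_tap(prev: FootFrame, curr: FootFrame) -> bool:
    """Tap: the toe was lifted and has now come back near the floor."""
    return prev.toe[1] > TAP_LIFT and curr.toe[1] <= TAP_LIFT * 0.5

def detect_heel_rotation(prev: FootFrame, curr: FootFrame) -> int:
    """Heel rotation: the foot pivots about the heel past ROT_DEG.
    Returns +1 or -1 for the direction, 0 for no event."""
    delta = foot_yaw(curr) - foot_yaw(prev)
    delta = (delta + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    if abs(delta) >= ROT_DEG:
        return 1 if delta > 0 else -1
    return 0

# Example: the toe drops from 5 cm to 1 cm, which registers as a tap.
prev = FootFrame(heel=(0.0, 0.0, 0.0), toe=(0.25, 0.05, 0.0))
curr = FootFrame(heel=(0.0, 0.0, 0.0), toe=(0.25, 0.01, 0.0))
assert detect_tap(prev, curr)

In a real pipeline, such per-frame events would feed a debouncing or state-machine layer so that one physical tap triggers exactly one mode switch; the paper's own tracking method is not detailed in this record.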
Language :
English
Peer reviewed article :
Yes
Audience :
International
Popular science :
No
Collections :
Source :