Document type :
Communication dans un congrès avec actes
DOI :
Title :
Studying the Visual Representation of Microgestures
Author(s) :
Lambert, Vincent [Auteur]
Laboratoire d'Informatique de Grenoble [LIG]
Chaffangeon Caillet, Adrien [Auteur]
Laboratoire d'Informatique de Grenoble [LIG]
Goguey, Alix [Auteur]
Laboratoire d'Informatique de Grenoble [LIG]
Malacria, Sylvain [Auteur]
Technology and knowledge for interaction [LOKI]
Nigay, Laurence [Auteur]
Laboratoire d'Informatique de Grenoble [LIG]
Conference title :
ACM International Conference on Mobile Human-Computer Interaction (MobileHCI 2023)
City :
Athens
Country :
Greece
Start date of the conference :
2023-09-25
Book title :
Proceedings of the ACM International Conference on Mobile Human-Computer Interaction (MobileHCI 2023)
English keyword(s) :
Microgesture
Microgesture representations
AR
Discoverability
Human-Computer Interaction
HAL domain(s) :
Informatique [cs]
English abstract : [en]
The representations of microgestures are essential for researchers presenting their results in academic papers and for system designers proposing tutorials to novice users. However, these representations remain disparate and inconsistent. As a first attempt to investigate how best to graphically represent microgestures, we created 21 designs, each depicting static and dynamic versions of 4 commonly used microgestures (tap, swipe, flex and hold). We first studied these designs in a quantitative online experiment with 45 participants. We then conducted a qualitative laboratory experiment in Augmented Reality with 16 participants. Based on the results, we provide design guidelines on which elements of a microgesture should be represented and how. In particular, it is recommended to represent the actuator and the trajectory of a microgesture. Also, although preferred by users, dynamic representations are not considered better than their static counterparts for depicting a microgesture and do not necessarily result in better user recognition.
Language :
English
Peer reviewed article :
Yes
Audience :
International
Popular science :
No
ANR Project :
Collections :
Source :
Files :
- microgestureshal.pdf (Open access)