Studying the Simultaneous Visual Representation of Microgestures
Conference paper, 2024

Abstract

Hand microgestures are promising for mobile interaction with wearable devices. However, they will not be adopted if practitioners cannot communicate to users the microgestures associated with the commands of their applications. This requires unambiguous representations that simultaneously show the multiple microgestures available to control an application. Using a systematic approach, we evaluate how these representations should be designed, contrasting 4 conditions that vary in the microgestures (tap-swipe and tap-hold) and fingers (index and index-middle) considered. Based on the results, we design a simultaneous representation of microgestures for a given set of 14 application commands. We then evaluate the usability of this representation for novice users and its suitability for small screens, compared with a baseline. Finally, we formulate 8 recommendations based on the results of all the experiments. In particular, redundant graphical and textual representations of microgestures should only be displayed for novice users.
Main file
Lambert24_MobileHCI.pdf (9.26 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04672513, version 1 (19-08-2024)

Cite

Vincent Lambert, Alix Goguey, Sylvain Malacria, Laurence Nigay. Studying the Simultaneous Visual Representation of Microgestures. 2024 ACM International Conference on Mobile Human-Computer Interaction (MobileHCI 2024), Sep 2024, Melbourne, Australia. ⟨10.1145/3676523⟩. ⟨hal-04672513⟩
129 views
147 downloads
