Comparing Input Modalities for Peripheral Interaction: A Case Study on Peripheral Music Control
Abstract
In graphical user interfaces, every application typically demands the user's full attention during interaction. Even marginal side activities often force the user to switch windows, which results in attention shifts and increased cognitive load. Peripheral interaction addresses this problem by providing input facilities in the periphery of the user's attention, relying on divided attention and human capabilities such as proprioception and spatial memory. Recent work shows promising results when tasks are shifted to the periphery for parallel task execution. Up to now, most of these interfaces rely on tag-based objects, tokens, or wearable devices, which need to be grasped and manipulated, e.g., by turning, moving, or pressing the device. To explore this design space further, we implemented three modalities for peripheral interaction with a desktop audio player application: graspable interaction, touch, and freehand gestures. In an eight-week in-situ deployment, we compared the three modalities to each other and to media keys (as the state-of-the-art approach). We found that all modalities can be successfully used in the (visual and attentional) periphery and reduce cognitive load when interacting with an audio player. With this work we intend to (1) illustrate the variety of possible modalities beyond graspable interfaces, (2) give insights on manual peripheral interaction in general and the respective modalities in particular, and (3) elaborate on paper-based prototypes for the evaluation of peripheral interaction.
Domains
Computer Science [cs]

Origin: Files produced by the author(s)