Conference Papers, Year: 2023

Multisensory neurofeedback design for KMI embodiment

Abstract

This study describes key steps in a design-based research process for developing a multimodal brain-computer interface intended to support kinesthetic motor imagery (KMI) learning and rehabilitation after stroke. We highlight some of the challenges of designing a sensory interface that successfully affords and embodies imagined hand movement. The result is a multisensory neurofeedback solution offering three modalities (visual, kinesthetic, and vibrotactile) to best meet future users' needs and realities.

CCS Concepts: • Human-centered computing → User centered design; Haptic devices.
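For illustration only (this sketch is not from the paper): one way a neurofeedback loop like the one described could route a single KMI detection confidence to the three feedback modalities. All names, thresholds, and mappings below are assumptions made for the example, not the authors' implementation.

```python
# Illustrative sketch only: map a hypothetical KMI classifier confidence
# (0..1) to three feedback channels (visual, kinesthetic, vibrotactile).
# Class names, fields, and the 0.5 assistance threshold are assumptions.

from dataclasses import dataclass


@dataclass
class FeedbackOutputs:
    visual_gauge: float        # fill level of an on-screen gauge (0..1)
    kinesthetic_torque: float  # normalized assistance level of a hand orthosis
    vibro_amplitude: float     # normalized vibrotactile motor amplitude


def dispatch_feedback(kmi_confidence: float) -> FeedbackOutputs:
    """Map one KMI detection confidence value to the three modalities."""
    c = max(0.0, min(1.0, kmi_confidence))  # clamp to [0, 1]
    return FeedbackOutputs(
        visual_gauge=c,
        kinesthetic_torque=c if c > 0.5 else 0.0,  # assist only above a threshold
        vibro_amplitude=c,
    )


if __name__ == "__main__":
    for conf in (0.2, 0.6, 0.9):
        print(conf, dispatch_feedback(conf))
```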
Main file: SensoryX23_Herreraetal.pdf (2.76 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04228939, version 1 (04-10-2023)

Identifiers

HAL Id: hal-04228939
DOI: 10.1145/3604321.3604352
Cite

Gabriela Herrera Altamira, Nathalie Skiba, Anatole Lécuyer, Laurent Bougrain, Stéphanie Fleck. Multisensory neurofeedback design for KMI embodiment. IMX 2023 - ACM International Conference on Interactive Media Experiences, Jun 2023, Nantes, France. pp.1-3, ⟨10.1145/3604321.3604352⟩. ⟨hal-04228939⟩