Towards curating personalized art exhibitions in Virtual Reality with multimodal Brain-Computer Interfaces
Abstract
Today, we live in an age of 'Like', where appreciation of digital content is constantly expressed by interacting with feedback icons. In contrast, Brain-Computer Interfaces (BCIs) can decode cognitive states from neural signals without explicit user feedback that interrupts aesthetic experiences (AEs). This recently started project will elucidate the neuro-cognitive mechanisms behind art appreciation and implement an Electroencephalography (EEG)-based BCI that detects physiological correlates of artwork preference in order to curate personalized art exhibitions in Virtual Reality. Most EEG studies in visual neuroaesthetics have focused on Event-Related Potentials, often using paradigms with unnatural viewing conditions. Moreover, the neural dynamics during visual art appreciation remain obscure, and previous studies have reported conflicting results. Furthermore, the liking of visual artworks has mostly been investigated from the perspective of beauty or pleasantness, concepts that do not apply to all aesthetic pleasures. We hypothesize instead that art preferences in general depend on rewarding AEs. Therefore, we will develop novel algorithms to decode and discriminate EEG neuromarkers of hedonic AEs. As a first step, we have conceptualized neuro-cognitive components of AE, such as attention, emotion and intrinsic reward, as well as their established EEG neuromarkers. In the future, we will record EEG and other physiological measures, e.g. eye-tracking and heart rate, in naturalistic single-trial experiments, use advanced machine learning to detect artwork preference, and recommend further objects based on this multimodal information. Finally, we embrace open science and will make subject data and BCI algorithms publicly accessible.
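The abstract describes a plan to fuse EEG with eye-tracking and heart-rate measures and to apply machine learning for per-trial preference detection, but does not specify the algorithms. As a minimal illustrative sketch only, the following Python snippet shows one common baseline shape for such a pipeline: early fusion of per-trial features from each modality, followed by a cross-validated linear classifier on synthetic data. All feature names, dimensions, and the choice of scikit-learn are assumptions, not the project's actual method.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical per-trial features (names and dimensions are illustrative):
# - EEG band power per channel/frequency band
# - eye-tracking summary (e.g. mean fixation duration, dwell time on artwork)
# - heart-rate summary (e.g. mean inter-beat interval)
n_trials, n_eeg_features = 200, 32
eeg_power = rng.normal(size=(n_trials, n_eeg_features))
gaze = rng.normal(size=(n_trials, 2))
heart = rng.normal(size=(n_trials, 1))

# Binary label: whether the participant reported liking the artwork
liked = rng.integers(0, 2, size=n_trials)

# Early fusion: concatenate all modalities into one feature vector per trial
X = np.hstack([eeg_power, gaze, heart])

# Simple linear classifier as a baseline single-trial preference decoder
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, liked, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

In practice, alternatives such as late fusion (training one decoder per modality and combining their outputs) or Riemannian-geometry EEG features are common; the sketch only fixes the overall structure of a multimodal preference decoder.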