Conference Papers, Year: 2024

HandyNotes: using the hands to create semantic representations of contextually aware real-world objects

Abstract

This paper uses Mixed Reality (MR) technologies to seamlessly integrate digital information into physical environments through human-made annotations. Creating digital annotations of physical objects raises many challenges, even for (simple) tasks such as adding digital notes and connecting them to real-world objects. To address this, we developed an MR system for the Microsoft HoloLens 2 that creates semantic representations of contextually-aware real-world objects while the user interacts with holographic virtual objects. User interaction is enhanced by using the fingers as placeholders for menu items. We demonstrate our approach through two real-world scenarios and discuss the challenges of using MR technologies.
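
To make the idea of a semantic representation more concrete, the sketch below shows one possible way to model an annotation that links a digital note to a contextually-aware real-world object. It is a minimal Python sketch under our own assumptions; the names (SemanticAnnotation, object_label, anchor_id, context, ...) are illustrative and are not taken from the paper or the HandyNotes implementation.

    # Hypothetical sketch of a semantic annotation record; all names are
    # illustrative and not taken from the paper.
    from dataclasses import dataclass, field
    from typing import Dict, Tuple

    @dataclass
    class SemanticAnnotation:
        """A digital note anchored to a physical object in the MR scene."""
        object_label: str                     # semantic label, e.g. "printer"
        anchor_id: str                        # spatial anchor of the real-world object
        note_text: str                        # the user's note content
        position: Tuple[float, float, float]  # anchor-relative placement of the hologram
        context: Dict[str, str] = field(default_factory=dict)  # contextual metadata (room, task, ...)

    # Example usage with made-up values: annotating a printer in an office scenario.
    note = SemanticAnnotation(
        object_label="printer",
        anchor_id="anchor-42",
        note_text="Out of toner, replacement ordered",
        position=(0.0, 0.15, 0.0),
        context={"room": "open space", "author": "facility staff"},
    )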
Main file: 2024015132.pdf (3.34 MB)
Origin: Files produced by the author(s)
Licence: Copyright

Dates and versions

hal-04425616, version 1 (30-01-2024)


Identifiers

  • HAL Id: hal-04425616, version 1

Cite

Clément Quere, Aline Menin, Raphaël Julien, Hui-Yin Wu, Marco Winckler. HandyNotes: using the hands to create semantic representations of contextually aware real-world objects. IEEE VR 2024 - The 31st IEEE Conference on Virtual Reality and 3D User Interfaces, Mar 2024, Orlando, Florida, United States. ⟨hal-04425616⟩