Distributed Accelerometers for Gesture Recognition and Visualization
Abstract
Acceleration information captured from inertial sensors on the hand can provide valuable information about its 3D angular pose. From this information we can recognize hand gestures and visualize them. Applications of this technology range from touchless human-machine interfaces to aiding gesture communication for people who are not familiar with sign language. Advances in silicon chip manufacturing allow these sensors to fit on a fingernail or be implanted under the skin while still communicating wirelessly with a processing unit. Our work demonstrates that gesture recognition is possible with a clutter-free system, by wearing very small devices and connecting them to a nearby processing unit. This work focuses on the processing of the acceleration information. Methods are presented to estimate the hand pose and the positions of the finger joints, and from these to recognize gestures. A visualization of the angular pose of the hand is also presented. This visualization can show a single render of the pose of a recognized gesture, or it can provide a simple real-time (low-latency) rendering of the hand pose. The processing of the acceleration information uses the gravitational acceleration as a vertical reference.
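As a minimal illustration of how gravity can serve as a vertical reference (a sketch under assumed axis and sign conventions, not the authors' actual pipeline), the snippet below estimates roll and pitch from a single quasi-static 3-axis accelerometer sample; yaw is not observable from gravity alone.

```python
import numpy as np

def tilt_from_accel(a):
    """Estimate roll and pitch (radians) from one accelerometer sample,
    assuming the sensor is quasi-static so the measured acceleration is
    dominated by gravity. Axis/sign conventions here are illustrative."""
    ax, ay, az = np.asarray(a, dtype=float)
    roll = np.arctan2(ay, az)                   # rotation about the x-axis
    pitch = np.arctan2(-ax, np.hypot(ay, az))   # rotation about the y-axis
    return roll, pitch

# Example: a sensor at rest, pitched by 30 degrees (readings in units of g).
roll, pitch = tilt_from_accel([-0.5, 0.0, np.cos(np.radians(30.0))])
print(np.degrees(roll), np.degrees(pitch))      # ~0.0, ~30.0
```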
Domains
Computer Science [cs]