Conference paper. Year: 2007

Visually-guided grasping while walking on a humanoid robot

Abstract

In this paper, we apply a general framework for building complex whole-body control for highly redundant robots, and we implement it for visually-guided grasping while walking on a humanoid robot. The key idea is to divide the control into several sensor-based control tasks that are executed simultaneously by a general structure called the stack of tasks. This structure allows simple task sequencing and can be used for task-level control. The framework is applied to a visual servoing task: the robot walks along a planned path, keeping a specified object in the middle of its field of view, and finally, when it is close enough, grasps the object while walking.
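The stack of tasks mentioned above executes several sensor-based tasks at once by resolving each lower-priority task in the null space of the tasks above it. The following is a minimal numerical sketch of that prioritized-resolution idea, not the authors' implementation: the task names, dimensions, and damping value are illustrative assumptions.

```python
# Minimal sketch of a prioritized stack-of-tasks velocity controller.
# Each task i is given by a Jacobian J_i and a desired error rate e_dot_i;
# lower-priority tasks are solved in the null space of higher-priority ones.
import numpy as np

def stack_of_tasks_velocity(tasks, n_dof, damping=1e-6):
    """Compute joint velocities for a prioritized list of tasks.

    tasks: list of (J, e_dot) pairs, highest priority first,
           with J of shape (m_i, n_dof) and e_dot of shape (m_i,).
    Returns the joint-velocity command q_dot of shape (n_dof,).
    """
    q_dot = np.zeros(n_dof)
    P = np.eye(n_dof)  # projector onto the null space of all tasks handled so far
    for J, e_dot in tasks:
        JP = J @ P
        # Damped pseudo-inverse for robustness near singular configurations
        JP_pinv = JP.T @ np.linalg.inv(JP @ JP.T + damping * np.eye(J.shape[0]))
        q_dot = q_dot + JP_pinv @ (e_dot - J @ q_dot)
        P = P - JP_pinv @ JP  # shrink the remaining null space
    return q_dot

# Hypothetical usage: a gaze (visual servoing) task stacked above a grasp task
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 30  # illustrative number of joints
    J_gaze, e_gaze = rng.standard_normal((2, n)), np.array([0.1, -0.05])
    J_grasp, e_grasp = rng.standard_normal((6, n)), 0.05 * np.ones(6)
    q_dot = stack_of_tasks_velocity([(J_gaze, e_gaze), (J_grasp, e_grasp)], n)
    print(q_dot.shape)  # (30,)
```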
Main file: 2007_icra_mansard.pdf (1.11 MB)
Origin: Files produced by the author(s)

Dates and versions

inria-00350654, version 1 (07-01-2009)

Identifiers

  • HAL Id: inria-00350654, version 1

Cite

Nicolas Mansard, Olivier Stasse, François Chaumette, K. Yokoi. Visually-guided grasping while walking on a humanoid robot. IEEE Int. Conf. on Robotics and Automation, ICRA'07, 2007, Roma, Italy. pp. 3041-3047. ⟨inria-00350654⟩
208 views
311 downloads
