Robust model-based tracking for robot vision
Abstract
This paper proposes a real-time, robust and efficient 3D model-based tracking algorithm for visual servoing. A virtual visual servoing approach is used for monocular 3D tracking. This method is similar to more classical non-linear pose computation techniques. A concise method for deriving efficient distance-to-contour interaction matrices is described. An oriented edge detector is used to provide real-time tracking of points normal to the object contours. Robustness is obtained by integrating an M-estimator into the virtual visual control law via an iteratively re-weighted least-squares implementation. The method presented in this paper has been validated on several 2 1/2 D visual servoing experiments considering various objects. Results show the method to be robust to occlusion, changes in illumination and mis-tracking.
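To make the robustness mechanism concrete, the sketch below illustrates (in a hedged, schematic way, not the authors' implementation) how an M-estimator can be folded into an iteratively re-weighted least-squares update of a virtual camera velocity. The Tukey weighting, the gain `lam`, and the random interaction matrix `L` and error vector `e` are illustrative placeholders; in the paper these quantities come from distance-to-contour features of the tracked 3D model.

```python
import numpy as np

def tukey_weights(residuals, c=4.6851):
    """Tukey biweight: weights decay with residual size and vanish for outliers."""
    # Robust scale estimate via the median absolute deviation (MAD).
    sigma = 1.4826 * np.median(np.abs(residuals - np.median(residuals))) + 1e-12
    u = residuals / (c * sigma)
    w = (1.0 - u**2) ** 2
    w[np.abs(u) > 1.0] = 0.0
    return w

def robust_velocity_step(L, e, lam=0.5):
    """One re-weighted step of the virtual visual servoing control law:
    v = -lam * (W L)^+ W e, with W a diagonal matrix of M-estimator weights."""
    W = np.diag(tukey_weights(e))
    return -lam * np.linalg.pinv(W @ L) @ (W @ e)

# Toy usage with synthetic data standing in for distance-to-contour features.
rng = np.random.default_rng(0)
L = rng.normal(size=(50, 6))          # 50 features, 6-DOF camera velocity
e = rng.normal(scale=0.01, size=50)   # small feature errors ...
e[:3] += 5.0                          # ... plus a few simulated mis-tracked points
v = robust_velocity_step(L, e)
print("pose velocity update:", v)
```

In this sketch the mis-tracked features receive near-zero weights, so they barely influence the pose update, which is the effect the abstract attributes to the M-estimator under occlusion, illumination changes and mis-tracking.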
Domains
Robotics [cs.RO]