Human Motion Tracking with a Kinematic Parametrization of Extremal Contours
Abstract
This report addresses the problem of human motion tracking from image sequences. The human body is described by an articulated mechanical chain, and body parts are described by volumetric primitives with curved surfaces. An extremal contour appears in an image whenever a curved surface turns smoothly away from the viewer. We describe a method that relies on a kinematic parameterization of such extremal contours. The apparent motion of these contours in the image plane is a function of both the rigid motion of the surface and the relative position and orientation of the viewer with respect to that surface. First, we describe a parameterization of an extremal-contour point, and of its associated image velocity, as a function of the motion parameters of the kinematic chain associated with the human body. Second, we introduce the zero-reference kinematic model and show how it may be used for human-motion modelling. Third, we show how the chamfer distance may be used to measure the discrepancy between predicted extremal contours and observed image contours; moreover, we show how the chamfer distance can be treated as a differentiable multi-valued function and how a tracker based on this distance can be cast in an optimization framework. Fourth, we describe a practical human-body tracker that may use an arbitrary number of cameras. A key methodological and practical advantage of our method is that it relies neither on model-to-image nor on image-to-image point matches. In practice, we model people with 5 kinematic chains, 19 volumetric primitives, and 54 degrees of freedom; we observe silhouettes in images gathered with several synchronized and calibrated cameras. The tracker has been successfully applied to several complex motions captured at 30 frames per second.
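To make the optimization view of the chamfer-distance cost concrete, the sketch below (Python) illustrates one way such a cost could be assembled: the observed edge image is converted into a distance map, the predicted extremal-contour points are looked up in that map with bilinear interpolation (so the value varies smoothly with the pose parameters), and the squared distances are summed over all cameras. This is an illustrative sketch under stated assumptions, not the report's implementation; in particular, `predict_contour_points` is a hypothetical placeholder standing in for the kinematic projection of extremal contours described in the report.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt


def chamfer_map(edge_image):
    """Distance transform of a binary edge/silhouette image: each pixel
    stores the distance to the nearest observed contour pixel."""
    return distance_transform_edt(~edge_image.astype(bool))


def chamfer_cost(theta, cameras, edge_images, predict_contour_points):
    """Sum of squared, bilinearly interpolated chamfer distances at the
    predicted extremal-contour points.  Interpolation keeps the cost a
    smooth function of the pose parameters `theta`, so it can be fed to
    a gradient-based optimizer.

    `predict_contour_points(theta, cam)` is an assumed placeholder that
    returns the (N, 2) pixel coordinates of the model's extremal
    contours projected into camera `cam` for pose `theta`.
    """
    cost = 0.0
    for cam, edges in zip(cameras, edge_images):
        dmap = chamfer_map(edges)
        h, w = dmap.shape
        pts = predict_contour_points(theta, cam)  # (N, 2) pixel coords
        x, y = pts[:, 0], pts[:, 1]
        # Clamp to the image so the 2x2 interpolation stencil stays valid.
        x0 = np.clip(np.floor(x).astype(int), 0, w - 2)
        y0 = np.clip(np.floor(y).astype(int), 0, h - 2)
        ax, ay = x - x0, y - y0
        # Bilinear interpolation of the distance map at sub-pixel positions.
        d = ((1 - ax) * (1 - ay) * dmap[y0, x0]
             + ax * (1 - ay) * dmap[y0, x0 + 1]
             + (1 - ax) * ay * dmap[y0 + 1, x0]
             + ax * ay * dmap[y0 + 1, x0 + 1])
        cost += np.sum(d ** 2)
    return cost
```

In this view, tracking amounts to minimizing `chamfer_cost` over the kinematic-chain parameters at each frame, warm-started from the previous frame's estimate; the multi-camera sum is what removes the need for explicit model-to-image or image-to-image point matches.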