Can Synthetic Data Handle Unconstrained Gaze Estimation? - Inria - Institut national de recherche en sciences et technologies du numérique
Conference paper - Year: 2017

Can Synthetic Data Handle Unconstrained Gaze Estimation?

Abstract

In this article, we aim to solve the unconstrained gaze estimation problem using an appearance-based approach. Unlike previous methods, which work in relatively constrained environments, we propose an approach that allows free head motion and significant user-sensor distances using an RGB-D sensor. Our paper presents the following contributions: (i) a direct estimation that infers gaze information from RGB eye and depth face appearances; (ii) a channel selection strategy during learning to evaluate the contribution of each channel to the final prediction; (iii) the adaptation of a 3D morphable face model, integrating a parametric gaze model, to render a large synthetic RGB-D training set. We also collect real labeled samples with a Kinect sensor, which allows us to evaluate the potential of synthetic learning in handling real configurations and to establish an objective comparison with learning from real data. Results on several users demonstrate the great potential of our approach.
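As a rough illustration of the appearance-based setting described in the abstract (not the authors' actual model), the sketch below trains a generic regressor that maps concatenated RGB eye-patch and depth face-patch features to a 2D gaze direction (yaw, pitch). The patch sizes, the random-forest regressor, and the synthetic placeholder data are all assumptions chosen for the example.

```python
# Minimal sketch of appearance-based gaze regression from RGB-D input.
# Assumptions (not from the paper): 36x60 RGB eye patches, 64x64 depth face
# patches, gaze encoded as (yaw, pitch) in radians, and a random-forest
# regressor standing in for the paper's learning machinery.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

def make_feature(rgb_eye, depth_face):
    """Concatenate flattened RGB eye and depth face appearances into one vector."""
    return np.concatenate([rgb_eye.ravel(), depth_face.ravel()])

# Placeholder data standing in for rendered RGB-D samples with known gaze labels.
rng = np.random.default_rng(0)
n_samples = 500
rgb_eyes = rng.random((n_samples, 36, 60, 3))       # RGB eye patches
depth_faces = rng.random((n_samples, 64, 64))       # depth face patches
gaze = rng.uniform(-0.5, 0.5, size=(n_samples, 2))  # (yaw, pitch) labels

X = np.stack([make_feature(e, d) for e, d in zip(rgb_eyes, depth_faces)])
X_train, X_test, y_train, y_test = train_test_split(X, gaze, random_state=0)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
mean_err = np.degrees(np.abs(pred - y_test)).mean()
print(f"Mean absolute angular error per axis: {mean_err:.2f} degrees")
```

A crude analogue of the paper's channel selection idea in this setting would be to group `model.feature_importances_` by channel (RGB eye vs. depth face) and compare the aggregated importances, keeping in mind that the paper's strategy operates during learning rather than post hoc.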
Main file
APIA_2017_paper_3.pdf (1.98 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01561526, version 1 (12-07-2017)

Identifiers

  • HAL Id: hal-01561526, version 1

Cite

Amine Kacete, Renaud Séguier, Michel Collobert, Jérôme Royan. Can Synthetic Data Handle Unconstrained Gaze Estimation?. Conférence Nationale sur les Applications Pratiques de l’Intelligence Artificielle, Jul 2017, Caen, France. ⟨hal-01561526⟩
142 views
177 downloads
