Conference Paper, Year: 2022

A New Dense Hybrid Stereo Visual Odometry Approach

Abstract

Visual odometry is an important part of the perception module of autonomous robots. Recent advances in deep learning have given rise to hybrid visual odometry approaches that combine deep networks with traditional pose estimation methods. One limitation of deep learning approaches is the need for ground truth data to train the neural networks. For example, it is extremely difficult, if not impossible, to obtain a ground truth dense depth map of the environment to be used for stereo visual odometry. Although unsupervised training of networks has been investigated, supervised training remains more reliable and robust. In this paper, we propose a new hybrid dense stereo visual odometry approach in which a dense depth map is obtained with a network that is supervised using ground truth poses, which can be obtained more easily than ground truth depth maps. The depth map obtained from the neural network is used to warp the current image into the reference frame, and the optimal pose is obtained by minimizing a cost function that encodes the similarity between the warped image and the reference image. The experimental results show that the proposed approach not only improves on state-of-the-art depth map estimation networks on standard benchmark datasets, but also outperforms state-of-the-art visual odometry methods.
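The pose estimation step described in the abstract can be illustrated with a minimal sketch (not the authors' implementation): a candidate pose is scored by warping the current image into the reference frame using the predicted depth map and measuring the intensity discrepancy. All names (K, T_cur_ref, the sum-of-squared-differences cost) are illustrative assumptions.

```python
# Minimal sketch of dense photometric alignment with a predicted depth map.
# This is an illustration of the general technique, not the paper's exact cost.
import numpy as np

def warp_photometric_cost(ref_img, cur_img, ref_depth, K, T_cur_ref):
    """Sum-of-squared-differences cost between the reference image and the
    current image warped into the reference frame.

    ref_img, cur_img : (H, W) grayscale images
    ref_depth        : (H, W) depth per reference pixel (e.g. from the network)
    K                : (3, 3) camera intrinsics
    T_cur_ref        : (4, 4) candidate pose mapping reference-frame points
                       into the current camera frame
    """
    H, W = ref_img.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))

    # Back-project every reference pixel to 3D using the predicted depth.
    K_inv = np.linalg.inv(K)
    pix = np.stack([u.ravel(), v.ravel(), np.ones(H * W)])       # (3, HW)
    pts_ref = (K_inv @ pix) * ref_depth.ravel()                   # (3, HW)

    # Transform the points into the current frame and project to pixels.
    pts_cur = T_cur_ref[:3, :3] @ pts_ref + T_cur_ref[:3, 3:4]
    proj = K @ pts_cur
    u_c = proj[0] / proj[2]
    v_c = proj[1] / proj[2]

    # Keep only points with positive depth that land inside the current image.
    valid = (proj[2] > 0) & (u_c >= 0) & (u_c < W - 1) & (v_c >= 0) & (v_c < H - 1)

    # Nearest-neighbour sampling for simplicity (bilinear in practice).
    warped = cur_img.astype(np.float64)[np.round(v_c[valid]).astype(int),
                                        np.round(u_c[valid]).astype(int)]
    residual = warped - ref_img.astype(np.float64).ravel()[valid]
    return np.mean(residual ** 2)
```

In practice, the optimal pose is not found by evaluating isolated candidates: a cost of this kind is typically minimized over SE(3) with an iterative scheme such as Gauss-Newton or Levenberg-Marquardt on the warp residuals.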
Main file: _paper_iros22_v2 (2).pdf (1.71 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03935158, version 1 (11-01-2023)

Identifiers

HAL Id: hal-03935158
DOI: 10.1109/IROS47612.2022.9981814

Cite

Ziming Liu, Ezio Malis, Philippe Martinet. A New Dense Hybrid Stereo Visual Odometry Approach. IROS 2022 - 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems, Oct 2022, Kyoto, Japan. pp. 6998-7003, ⟨10.1109/IROS47612.2022.9981814⟩. ⟨hal-03935158⟩