Multi-sensor data fusion using a Dynamic Bayesian Network for robotised vehicle geo-localisation
Abstract
This paper presents an outdoor geo-localisation method that integrates several information sources: measurements from a GPS receiver, incremental encoders and a gyroscope, 2D images provided by an on-board camera, and a virtual 3D city model. A 3D cartographical observation of the vehicle pose is constructed by matching the acquired 2D images against the virtual 3D city model. This observation is especially useful for correcting the drift of dead-reckoning-only localisation during long GPS outages, or when GPS quality is degraded by multipath, satellite masking and similar effects that are common in urban environments. The sensor measurements are fused within the Dynamic Bayesian Network formalism in order to provide a continuous estimate of the pose.
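To give an intuition of how such a fusion operates, the sketch below illustrates a linear-Gaussian special case of the described architecture, in which the Dynamic Bayesian Network over the vehicle pose reduces to a Kalman-style predict/update cycle: dead reckoning (encoders and gyroscope) drives the prediction, while GPS fixes and the cartographical pose observation drive corrections. This is not the authors' implementation; all function names, observation models and noise values are illustrative assumptions.

```python
# Minimal sketch (assumptions only): a linear-Gaussian DBN over the pose collapses
# to a Kalman-style predict/update loop. Noise values and observation models are
# hypothetical, not taken from the paper.
import numpy as np

def predict(x, P, v, omega, dt, Q):
    """Dead-reckoning prediction from encoder speed v and gyroscope rate omega."""
    theta = x[2]
    x_pred = x + np.array([v * dt * np.cos(theta),
                           v * dt * np.sin(theta),
                           omega * dt])
    # Jacobian of the motion model with respect to the pose
    F = np.array([[1.0, 0.0, -v * dt * np.sin(theta)],
                  [0.0, 1.0,  v * dt * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def update(x, P, z, H, R):
    """Generic correction step, usable for a GPS fix or for the pose observation
    obtained by matching camera images against the 3D city model."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

# --- toy usage -------------------------------------------------------------
x = np.zeros(3)                        # pose [x, y, heading]
P = np.eye(3) * 0.1
Q = np.diag([0.05, 0.05, 0.01])        # process noise (assumed)

# One prediction step from dead reckoning
x, P = predict(x, P, v=1.0, omega=0.02, dt=0.1, Q=Q)

# GPS observes position only; during outages this update is simply skipped
H_gps = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
R_gps = np.diag([2.0, 2.0])            # GPS noise in metres (assumed)
x, P = update(x, P, z=np.array([0.12, 0.01]), H=H_gps, R=R_gps)

# Hypothetical full-pose observation from the image / 3D-city-model matching
H_map = np.eye(3)
R_map = np.diag([0.5, 0.5, 0.05])
x, P = update(x, P, z=np.array([0.10, 0.00, 0.00]), H=H_map, R=R_map)
```

In this reduced form, the benefit of the cartographical observation is visible in the covariance: when the GPS update is skipped for many cycles, only the map-based correction bounds the growth of the dead-reckoning uncertainty.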