Conference Papers, Year: 2011

Fusion of Telemetric and Visual Data from Road Scenes with a Lexus Experimental Platform

Abstract

Fusion of telemetric and visual data from traffic scenes helps to exploit the synergies between the different on-board sensors that monitor the environment around the ego-vehicle. This paper outlines our approach to sensor data fusion and to the detection and tracking of objects in a dynamic environment. The approach uses a Bayesian Occupancy Filter to obtain a spatio-temporal grid representation of the traffic scene. We have implemented the approach on our experimental platform based on a Lexus car. The data are obtained in traffic scenes typical of urban driving, with multiple road participants. The data fusion yields a model of the dynamic environment of the ego-vehicle, which serves as the basis for subsequent analysis and interpretation of the traffic scene and enables collision risk estimation to improve driving safety.
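
For illustration only, the sketch below shows how per-cell occupancy evidence from a telemetric sensor (e.g. a lidar) and a visual sensor could be fused into a probabilistic grid with log-odds Bayesian updates. It is not the authors' Bayesian Occupancy Filter, which additionally estimates per-cell velocity distributions to track the dynamic scene; the grid size, function names, and sensor-model probabilities p_hit and p_miss are hypothetical placeholders.

import numpy as np

# Minimal illustrative grid fusion: per-cell log-odds Bayesian updates from
# two occupancy observations (e.g. one lidar-based, one vision-based).
# Grid size and the inverse sensor-model probabilities p_hit / p_miss are
# hypothetical placeholders, not values from the paper.

GRID_SHAPE = (200, 200)              # grid cells
log_odds = np.zeros(GRID_SHAPE)      # prior P(occupied) = 0.5 in every cell

def to_log_odds(p):
    """Convert a probability to log-odds."""
    return np.log(p / (1.0 - p))

def fuse_observation(log_odds, evidence, p_hit=0.7, p_miss=0.4):
    """Fuse one sensor's per-cell evidence into the grid.

    evidence: array of the grid's shape with +1 for cells observed occupied,
    -1 for cells observed free, and 0 for cells not observed.
    """
    updated = log_odds.copy()
    updated[evidence > 0] += to_log_odds(p_hit)   # occupied observation
    updated[evidence < 0] += to_log_odds(p_miss)  # free-space observation
    return updated

def occupancy_probability(log_odds):
    """Recover P(occupied) from log-odds (logistic function)."""
    return 1.0 / (1.0 + np.exp(-log_odds))

# Example: both sensors report the same cell as occupied.
lidar_evidence = np.zeros(GRID_SHAPE)
lidar_evidence[100, 120] = 1
vision_evidence = np.zeros(GRID_SHAPE)
vision_evidence[100, 120] = 1

log_odds = fuse_observation(log_odds, lidar_evidence)
log_odds = fuse_observation(log_odds, vision_evidence)
print(occupancy_probability(log_odds)[100, 120])  # > 0.5 after two agreeing hits

Two agreeing observations push a cell's occupancy probability well above the 0.5 prior, while conflicting observations largely cancel out, which is the basic behaviour a grid-based fusion of telemetric and visual data relies on.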
Main file: main.pdf (294.95 Ko)
Origin: files produced by the author(s)

Dates and versions

inria-00635779, version 1 (25-10-2011)

Identifiers

  • HAL Id: inria-00635779, version 1

Cite

Igor Paromtchik, Mathias Perrollaz, Christian Laugier. Fusion of Telemetric and Visual Data from Road Scenes with a Lexus Experimental Platform. IEEE International Symposium on Intelligent Vehicles, Jun 2011, Baden-Baden, Germany. ⟨inria-00635779⟩