Paper: Environment-Aware Sensor Fusion using Deep Learning

August 20th, 2020

A reliable perception pipeline is crucial to the operation of a safe and efficient autonomous vehicle. Fusing information from multiple sensors has become a common practice to increase robustness, given that different types of sensors have distinct sensing characteristics. Further, sensors can present diverse performance according to the operating environment.

Most systems rely on a rigid sensor fusion strategy that considers the sensor inputs only (e.g., signals and their covariances), without accounting for the influence of the environment, which often causes poor performance in mixed scenarios.

In our approach, we adjust the sensor fusion strategy according to a classification of the scene around the vehicle. A convolutional neural network (CNN) classifies the environment, and this classification is used to select the best-suited sensor configuration.
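The idea can be illustrated with a minimal sketch. The scene labels, sensors, and per-environment noise scalings below are hypothetical placeholders (the paper describes the actual configurations); the sketch only shows the mechanism of a CNN-predicted scene label selecting how much each sensor is trusted in an inverse-variance fusion step.

```python
# Hypothetical per-environment configurations: a multiplicative scaling
# applied to each sensor's measurement variance before fusing.
# Lower scaling = more trusted in that environment.
SENSOR_CONFIGS = {
    "open_field":    {"gps": 0.1, "lidar_odom": 1.0, "wheel_odom": 0.5},
    "dense_foliage": {"gps": 5.0, "lidar_odom": 0.3, "wheel_odom": 0.5},
}

def fuse(estimates, variances, scene_label):
    """Inverse-variance fusion with environment-dependent noise scaling.

    `scene_label` would come from the CNN's classification of the
    surroundings; here it simply selects a weighting configuration.
    """
    cfg = SENSOR_CONFIGS[scene_label]
    weights = {s: 1.0 / (variances[s] * cfg[s]) for s in estimates}
    total = sum(weights.values())
    return sum(weights[s] * estimates[s] for s in estimates) / total

# Same measurements, different scene -> different fused estimate:
# in an open field GPS dominates; under dense foliage lidar odometry does.
est = {"gps": 10.0, "lidar_odom": 10.4, "wheel_odom": 10.2}
var = {"gps": 1.0, "lidar_odom": 1.0, "wheel_odom": 1.0}
open_val = fuse(est, var, "open_field")
foliage_val = fuse(est, var, "dense_foliage")
```

In a full pipeline the scaled covariances would feed a probabilistic estimator (e.g., a Kalman filter) rather than this one-shot weighted average; the sketch only conveys the environment-aware selection step.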

We present experiments with a full-size autonomous vehicle operating in a heterogeneous environment. The results illustrate the applicability of the method with enhanced odometry estimation when compared to a rigid sensor fusion scheme.

Fig. 1: Satellite view illustrating a heterogeneous operation space. The red path was used to train the CNN to classify the environment, while the white path was used to validate its performance. Image from Google Maps.

Silva, C.; Borges, P. and Castanho, J. (2019). Environment-aware Sensor Fusion using Deep Learning. In Proceedings of the 16th International Conference on Informatics in Control, Automation and Robotics (ICINCO), Volume 2, ISBN 978-989-758-380-3, pages 88-96. DOI: 10.5220/0007841900880096

Download the full paper here.

For more information, contact us.
