Paper: Environment-Aware Sensor Fusion using Deep Learning
A reliable perception pipeline is crucial to the safe and efficient operation of an autonomous vehicle. Fusing information from multiple sensors has become common practice to increase robustness, since different types of sensors have distinct sensing characteristics. Moreover, sensor performance can vary with the operating environment.
Most systems rely on a rigid sensor fusion strategy that considers only the sensor inputs (e.g., signals and corresponding covariances), without incorporating the influence of the environment, which often leads to poor performance in mixed scenarios.
In our approach, the sensor fusion strategy is adjusted according to a classification of the scene around the vehicle. A convolutional neural network classifies the environment, and this classification is used to select the best sensor configuration accordingly.
We present experiments with a full-size autonomous vehicle operating in a heterogeneous environment. The results demonstrate the applicability of the method, showing improved odometry estimation compared to a rigid sensor fusion scheme.
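To make the idea concrete, the sketch below shows one minimal way such an environment-aware scheme could be wired together. It is not the paper's implementation: the environment labels, the noise values, the threshold-based stand-in for the CNN classifier, and the inverse-variance fusion rule are all illustrative assumptions.

```python
import numpy as np

# Hypothetical environment classes mapped to per-sensor measurement noise
# variances for two odometry sources. All values are illustrative only.
SENSOR_CONFIGS = {
    "open_field": {"wheel": 0.05, "visual": 0.50},        # trust wheel odometry more
    "built_up": {"wheel": 0.20, "visual": 0.15},
    "dense_vegetation": {"wheel": 0.40, "visual": 0.10},  # trust visual odometry more
}

def classify_environment(image):
    """Stand-in for the CNN classifier: returns an environment label.
    A brightness threshold replaces the network purely for illustration."""
    mean = float(np.mean(image))
    if mean > 0.66:
        return "open_field"
    if mean > 0.33:
        return "built_up"
    return "dense_vegetation"

def fuse_odometry(wheel_est, visual_est, env_label):
    """Inverse-variance weighted fusion using the environment-selected
    noise values (a simple stand-in for reconfiguring a filter)."""
    r = SENSOR_CONFIGS[env_label]
    w_wheel = 1.0 / r["wheel"]
    w_visual = 1.0 / r["visual"]
    return (w_wheel * wheel_est + w_visual * visual_est) / (w_wheel + w_visual)

# A bright scene is labeled "open_field", so the fused estimate is
# pulled toward the wheel odometry value.
image = np.full((64, 64), 0.9)
env = classify_environment(image)
fused = fuse_odometry(wheel_est=1.0, visual_est=2.0, env_label=env)
```

The key design choice this illustrates is that the classifier output only switches between pre-defined sensor configurations; the fusion step itself stays a standard weighted estimator.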
Silva, C.; Borges, P. and Castanho, J. (2019). Environment-aware Sensor Fusion using Deep Learning. In Proceedings of the 16th International Conference on Informatics in Control, Automation and Robotics – Volume 2: ICINCO, ISBN 978-989-758-380-3, pages 88-96. DOI: 10.5220/0007841900880096
Download the full paper here.
For more information, contact us.