Low Latency and Low-Level Sensor Fusion for Automotive Use-Cases

Matthias Pollach, Felix Schiegg, Alois Knoll

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Scopus citations

Abstract

This work proposes a probabilistic low-level automotive sensor fusion approach using LiDAR, RADAR, and camera data. The method is stateless and operates directly on associated data from all sensor modalities. Tracking is deliberately omitted in order to reduce object detection latency and to create existence hypotheses per frame. The probabilistic fusion uses input from 3D and 2D space. An association method combining overlap and distance metrics is proposed, avoiding the need for sensor synchronization. A Bayesian network executes the sensor fusion. The proposed approach is compared with a state-of-the-art fusion system that uses multiple sensors of the same modality and relies on tracking for object detection. Evaluation was performed on low-level sensor data recorded in an urban environment. The test results show that the low-level sensor fusion reduces object detection latency.
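The abstract describes fusing evidence from LiDAR, RADAR, and camera into a per-frame existence hypothesis via a Bayesian network. The paper's actual network structure and sensor models are not reproduced here; as a minimal sketch, the fragment below assumes conditionally independent sensors given object existence (a naive-Bayes simplification) and uses purely illustrative likelihood values.

```python
# Hedged sketch: per-frame existence hypothesis by fusing three sensor
# observations with Bayes' rule, assuming the sensors are conditionally
# independent given object existence. All numeric values are illustrative
# assumptions, not values from the paper.

def fuse_existence(prior, likelihoods):
    """Posterior P(object exists | sensor evidence).

    likelihoods: list of (P(detection | object), P(detection | no object))
    tuples, one per sensor modality.
    """
    p_exist = prior          # joint probability mass for "object exists"
    p_not = 1.0 - prior      # joint probability mass for "no object"
    for p_det_given_exist, p_det_given_not in likelihoods:
        p_exist *= p_det_given_exist
        p_not *= p_det_given_not
    # Normalize the two hypotheses to obtain the posterior.
    return p_exist / (p_exist + p_not)

# Illustrative per-sensor detection likelihoods (hypothetical values).
evidence = [
    (0.9, 0.1),  # LiDAR
    (0.8, 0.2),  # RADAR
    (0.7, 0.3),  # camera
]
posterior = fuse_existence(0.5, evidence)
print(round(posterior, 4))  # → 0.9882
```

Because each frame is evaluated independently, no track state is carried over, which is the mechanism the abstract credits for the reduced detection latency.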

Original language: English
Title of host publication: 2020 IEEE International Conference on Robotics and Automation, ICRA 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 6780-6786
Number of pages: 7
ISBN (Electronic): 9781728173955
DOIs
State: Published - May 2020
Event: 2020 IEEE International Conference on Robotics and Automation, ICRA 2020 - Paris, France
Duration: 31 May 2020 - 31 Aug 2020

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
ISSN (Print): 1050-4729

Conference

Conference: 2020 IEEE International Conference on Robotics and Automation, ICRA 2020
Country/Territory: France
City: Paris
Period: 31/05/20 - 31/08/20

Keywords

  • Bayesian networks
  • object detection
  • sensor fusion
