Multi-Modal Sensor Fusion and Object Tracking for Autonomous Racing

Phillip Karle, Felix Fent, Sebastian Huch, Florian Sauerbeck, Markus Lienkamp

Research output: Contribution to journal › Article › peer-review


Abstract

Reliable detection and tracking of surrounding objects are indispensable for comprehensive motion prediction and planning of autonomous vehicles. Due to the limitations of individual sensors, the fusion of multiple sensor modalities is required to improve the overall detection capabilities. Additionally, robust motion tracking is essential for reducing the effect of sensor noise and improving state estimation accuracy. The reliability of the autonomous vehicle software becomes even more relevant in complex, adversarial high-speed scenarios at the vehicle handling limits in autonomous racing. In this paper, we present a modular multi-modal sensor fusion and tracking method for high-speed applications. The method is based on the Extended Kalman Filter (EKF) and is capable of fusing heterogeneous detection inputs to track surrounding objects consistently. A novel delay compensation approach makes it possible to reduce the influence of the perception software latency and to output an updated object list. It is the first fusion and tracking method validated in high-speed real-world scenarios at the Indy Autonomous Challenge 2021 and the Autonomous Challenge at CES (AC@CES) 2022, proving its robustness and computational efficiency on embedded systems. It does not require any labeled data and achieves position tracking residuals below 0.1 m.
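
To illustrate the kind of EKF-based fusion and delay compensation the abstract describes, the following minimal Python sketch tracks a single object with an assumed constant-velocity motion model, fuses position-only detections with sensor-specific noise, and forward-predicts the state by the perception latency before output. All class and variable names, dimensions, and noise values are illustrative assumptions, not the paper's actual implementation.

import numpy as np

class ConstantVelocityEKF:
    """Sketch of a single-object tracker; state is [px, py, vx, vy]."""

    def __init__(self, x0, P0, q=1.0):
        self.x = np.asarray(x0, dtype=float)   # state estimate
        self.P = np.asarray(P0, dtype=float)   # state covariance
        self.q = q                             # process noise intensity (assumed)

    def predict(self, dt):
        # Constant-velocity transition; the EKF reduces to a linear KF here.
        F = np.array([[1, 0, dt, 0],
                      [0, 1, 0, dt],
                      [0, 0, 1,  0],
                      [0, 0, 0,  1]], dtype=float)
        Q = self.q * np.diag([dt**3 / 3, dt**3 / 3, dt, dt])
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z, R):
        # Fuse a position detection z = [px, py] with sensor-specific noise R,
        # so heterogeneous inputs (e.g. LiDAR, radar, camera) share one filter.
        H = np.array([[1, 0, 0, 0],
                      [0, 1, 0, 0]], dtype=float)
        y = z - H @ self.x                      # innovation
        S = H @ self.P @ H.T + R                # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P

    def compensated_state(self, latency):
        # Delay compensation in the spirit of the paper: forward-predict the
        # tracked state by the perception software latency before publishing.
        F = np.array([[1, 0, latency, 0],
                      [0, 1, 0, latency],
                      [0, 0, 1,       0],
                      [0, 0, 0,       1]], dtype=float)
        return F @ self.x

# Example usage: predict, fuse one detection, then output a latency-compensated state.
ekf = ConstantVelocityEKF(x0=[0.0, 0.0, 60.0, 0.0], P0=np.eye(4))
ekf.predict(dt=0.05)
ekf.update(z=np.array([3.1, 0.2]), R=np.diag([0.05, 0.05]))
print(ekf.compensated_state(latency=0.08))

The published method additionally handles data association across multiple objects and modalities; this sketch only shows the per-object predict/update cycle and the output-time forward prediction.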

Original language: English
Pages (from-to): 3871-3883
Number of pages: 13
Journal: IEEE Transactions on Intelligent Vehicles
Volume: 8
Issue number: 7
DOIs
State: Published - 1 Jul 2023

Keywords

  • Autonomous vehicles
  • data association
  • extended Kalman filter
  • multi-object tracking (MOT)
  • sensor fusion

