TY - JOUR
T1 - Event-Based Neuromorphic Vision for Autonomous Driving
T2 - A Paradigm Shift for Bio-Inspired Visual Sensing and Perception
AU - Chen, Guang
AU - Cao, Hu
AU - Conradt, Jörg
AU - Tang, Huajin
AU - Röhrbein, Florian
AU - Knoll, Alois
PY - 2020/7
Y1 - 2020/7
AB - As a bio-inspired, emerging sensor, the event-based neuromorphic vision sensor works on a different principle than standard frame-based cameras, which gives it promising properties: low energy consumption, low latency, high dynamic range (HDR), and high temporal resolution. It represents a paradigm shift in sensing and perceiving the environment, capturing local pixel-level light intensity changes and producing asynchronous event streams. Visual sensing technologies for autonomous vehicles have advanced from standard computer vision to event-based neuromorphic vision. In this tutorial-like article, a comprehensive review of this emerging technology is given. First, the development of the neuromorphic vision sensor, which is derived from the understanding of the biological retina, is traced. The signal processing techniques for event noise processing and event data representation are then discussed. Next, the signal processing algorithms and applications of event-based neuromorphic vision in autonomous driving and various assistance systems are reviewed. Finally, challenges and future research directions are pointed out. This article is expected to serve as a starting point for new researchers and engineers in the autonomous driving field and to provide a bird's-eye view to both the neuromorphic vision and autonomous driving research communities.
UR - http://www.scopus.com/inward/record.url?scp=85087826415&partnerID=8YFLogxK
U2 - 10.1109/MSP.2020.2985815
DO - 10.1109/MSP.2020.2985815
M3 - Article
AN - SCOPUS:85087826415
SN - 1053-5888
VL - 37
SP - 34
EP - 49
JO - IEEE Signal Processing Magazine
JF - IEEE Signal Processing Magazine
IS - 4
M1 - 9129849
ER -