TY - JOUR
T1 - Robust Autonomous Vehicle Pursuit Without Expert Steering Labels
AU - Pan, Jiaxin
AU - Zhou, Changyao
AU - Gladkova, Mariia
AU - Khan, Qadeer
AU - Cremers, Daniel
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2023/10/1
Y1 - 2023/10/1
N2 - In this work, we present a learning method for both lateral and longitudinal motion control of an ego-vehicle for the task of vehicle pursuit. The car being controlled does not have a pre-defined route; rather, it reactively adapts to follow a target vehicle while maintaining a safety distance. To train our model, we do not rely on steering labels recorded from an expert driver, but effectively leverage a classical controller as an offline label generation tool. In addition, we account for errors in the predicted control values, which can lead to a loss of tracking and catastrophic crashes of the controlled vehicle. To this end, we propose an effective data augmentation approach, which allows us to train a network that is capable of handling different views of the target vehicle. During the pursuit, the target vehicle is first localized using a Convolutional Neural Network. The network takes a single RGB image along with the vehicles' velocities and estimates the target vehicle's pose with respect to the ego-vehicle. This information is then fed to a Multi-Layer Perceptron, which regresses the control commands for the ego-vehicle, namely throttle and steering angle. We extensively validate our approach using the CARLA simulator on a wide range of terrains. Our method demonstrates real-time performance, robustness to different scenarios including unseen trajectories, and high route completion.
AB - In this work, we present a learning method for both lateral and longitudinal motion control of an ego-vehicle for the task of vehicle pursuit. The car being controlled does not have a pre-defined route; rather, it reactively adapts to follow a target vehicle while maintaining a safety distance. To train our model, we do not rely on steering labels recorded from an expert driver, but effectively leverage a classical controller as an offline label generation tool. In addition, we account for errors in the predicted control values, which can lead to a loss of tracking and catastrophic crashes of the controlled vehicle. To this end, we propose an effective data augmentation approach, which allows us to train a network that is capable of handling different views of the target vehicle. During the pursuit, the target vehicle is first localized using a Convolutional Neural Network. The network takes a single RGB image along with the vehicles' velocities and estimates the target vehicle's pose with respect to the ego-vehicle. This information is then fed to a Multi-Layer Perceptron, which regresses the control commands for the ego-vehicle, namely throttle and steering angle. We extensively validate our approach using the CARLA simulator on a wide range of terrains. Our method demonstrates real-time performance, robustness to different scenarios including unseen trajectories, and high route completion.
KW - Deep learning methods
KW - motion control
KW - visual tracking
UR - http://www.scopus.com/inward/record.url?scp=85168743086&partnerID=8YFLogxK
U2 - 10.1109/LRA.2023.3308060
DO - 10.1109/LRA.2023.3308060
M3 - Article
AN - SCOPUS:85168743086
SN - 2377-3766
VL - 8
SP - 6595
EP - 6602
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 10
ER -