Pruning CNNs for LiDAR-based Perception in Resource Constrained Environments

Manoj Rohit Vemparala, Anmol Singh, Ahmed Mzid, Nael Fasfous, Alexander Frickenstein, Florian Mirus, Hans Joerg Voegel, Naveen Shankar Nagaraja, Walter Stechele

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Scopus citations

Abstract

Deep neural networks provide high accuracy for perception tasks; however, they require high computational power. In particular, LiDAR-based object detection delivers good accuracy and real-time performance, but it demands high computation due to the expensive feature extraction from point-cloud data in the encoder and backbone networks. We investigate the model-complexity versus accuracy trade-off using reinforcement-learning-based pruning for PointPillars, a recent LiDAR-based 3D object detection network. We evaluate the model on the validation set of KITTI (80/20 split) using the mean average precision (mAP) for the car class. Pruning the original PointPillars model (mAP 89.84) yields a 65.8% reduction in floating-point operations (FLOPs) for a marginal accuracy loss. This compression corresponds to a 31.7% reduction in inference time and a 35% reduction in GPU memory on a GTX 1080 Ti.
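As context for the compression numbers above, the following is a minimal, illustrative sketch of structured channel pruning for a single convolutional layer together with a naive FLOP count. The L1-norm filter ranking, layer sizes, keep ratio, and feature-map resolution are assumptions chosen for demonstration; they do not reproduce the paper's reinforcement-learning pruning agent or the actual PointPillars backbone.

```python
# Illustrative sketch only: structured (channel) pruning of a Conv2d layer
# and the resulting FLOP reduction. Not the paper's RL-based method.
import torch
import torch.nn as nn


def prune_conv_channels(conv: nn.Conv2d, keep_ratio: float) -> nn.Conv2d:
    """Keep the `keep_ratio` fraction of output channels with the largest L1 norm."""
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    # Rank filters by the L1 norm of their weights (a common pruning heuristic).
    scores = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    keep_idx = torch.argsort(scores, descending=True)[:n_keep]

    pruned = nn.Conv2d(
        conv.in_channels, n_keep, conv.kernel_size,
        stride=conv.stride, padding=conv.padding, bias=conv.bias is not None,
    )
    pruned.weight.data = conv.weight.data[keep_idx].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep_idx].clone()
    return pruned


def conv_flops(conv: nn.Conv2d, out_h: int, out_w: int) -> int:
    """Multiply-accumulate count for one Conv2d forward pass (bias ignored)."""
    kh, kw = conv.kernel_size
    return conv.out_channels * conv.in_channels * kh * kw * out_h * out_w


if __name__ == "__main__":
    conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)   # assumed layer sizes
    pruned = prune_conv_channels(conv, keep_ratio=0.5)     # assumed keep ratio

    h = w = 200  # assumed pseudo-image resolution, not the paper's setting
    before, after = conv_flops(conv, h, w), conv_flops(pruned, h, w)
    print(f"FLOPs reduced by {100 * (1 - after / before):.1f}%")
```

Note that removing a layer's output channels also requires shrinking the input channels of any layer that consumes its output; pruning pipelines track these dependencies across the whole network.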

Original language: English
Title of host publication: 2021 IEEE Intelligent Vehicles Symposium Workshops, IV Workshops 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 228-235
Number of pages: 8
ISBN (Electronic): 9781665479219
DOIs
State: Published - 2021
Event: 32nd IEEE Intelligent Vehicles Symposium Workshops, IV Workshops 2021 - Nagoya, Japan
Duration: 11 Jul 2021 - 17 Jul 2021

Publication series

Name: IEEE Intelligent Vehicles Symposium, Proceedings

Conference

Conference: 32nd IEEE Intelligent Vehicles Symposium Workshops, IV Workshops 2021
Country/Territory: Japan
City: Nagoya
Period: 11/07/21 - 17/07/21
