SqueezePoseNet: Image Based Pose Regression with Small Convolutional Neural Networks for Real Time UAS Navigation

M. S. Müller, S. Urban, B. Jutzi

Research output: Contribution to journal › Conference article › peer-review


Abstract

The number of unmanned aerial vehicles (UAVs) is increasing since low-cost airborne systems are available to a wide range of users. The outdoor navigation of such vehicles is mostly based on global navigation satellite system (GNSS) methods to obtain the vehicle's trajectory. The drawbacks of satellite-based navigation are failures caused by occlusions and multi-path interference. Besides this, local image-based solutions such as Simultaneous Localization and Mapping (SLAM) and Visual Odometry (VO) can, for example, be used to support the GNSS solution by closing trajectory gaps, but they are computationally expensive. However, if the trajectory estimation is interrupted or unavailable, a re-localization is mandatory. In this paper we provide a novel method for GNSS-free and fast image-based pose regression in a known area by utilizing a small convolutional neural network (CNN). With on-board processing in mind, we employ a lightweight CNN called SqueezeNet and use transfer learning to adapt the network to pose regression. Our experiments show promising results for GNSS-free and fast localization.
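The abstract describes adapting a pretrained SqueezeNet to camera pose regression via transfer learning, but does not spell out the regression head or loss. The following is a minimal sketch of that idea, assuming a PoseNet-style 7-dimensional output (3-D translation plus unit quaternion) and a weighted loss with an illustrative factor beta; the function names build_squeezeposenet and pose_loss, the head layout, and all hyper-parameters are assumptions for illustration, not the authors' exact configuration.

```python
# Hedged sketch: turn torchvision's SqueezeNet into a pose regressor by
# swapping its 1000-class classifier for a small regression head.
import torch
import torch.nn as nn
import torchvision


def build_squeezeposenet(pose_dim: int = 7) -> nn.Module:
    """SqueezeNet backbone with its classifier replaced by a pose regression head."""
    # Pretrained ImageNet weights are downloaded on first use (transfer learning).
    model = torchvision.models.squeezenet1_1(weights="DEFAULT")
    # Replace the classifier with a 1x1 conv producing pose_dim outputs; the
    # original head's ReLU is dropped so negative coordinates remain possible.
    model.classifier = nn.Sequential(
        nn.Dropout(p=0.5),
        nn.Conv2d(512, pose_dim, kernel_size=1),
        nn.AdaptiveAvgPool2d((1, 1)),
    )
    return model


def pose_loss(pred: torch.Tensor, target: torch.Tensor, beta: float = 250.0) -> torch.Tensor:
    """PoseNet-style loss: translation error plus beta-weighted rotation error."""
    t_pred, q_pred = pred[:, :3], pred[:, 3:]
    t_true, q_true = target[:, :3], target[:, 3:]
    q_pred = q_pred / q_pred.norm(dim=1, keepdim=True)  # normalize predicted quaternion
    return (t_pred - t_true).norm(dim=1).mean() + beta * (q_pred - q_true).norm(dim=1).mean()


if __name__ == "__main__":
    net = build_squeezeposenet()
    images = torch.randn(2, 3, 224, 224)   # dummy batch of RGB frames
    poses = net(images)                     # shape (2, 7): x, y, z, qw, qx, qy, qz
    gt = torch.randn(2, 7)                  # dummy ground-truth poses
    print(poses.shape, pose_loss(poses, gt).item())
```

Keeping the SqueezeNet backbone and only replacing the final layers is what makes the network small enough for on-board, near real-time inference, which is the trade-off the paper targets.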

Original language: English
Pages (from-to): 49-57
Number of pages: 9
Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume: 4
Issue number: 2W3
DOIs
State: Published - 18 Aug 2017
Externally published: Yes
Event: 4th ISPRS International Conference on Unmanned Aerial Vehicles in Geomatics, UAV-g 2017 - Bonn, Germany
Duration: 4 Sep 2017 – 7 Sep 2017

Keywords

  • Convolutional Neural Networks
  • Image-Based
  • Navigation
  • Pose Estimation
  • UAS
  • UAV
