DeepMoVIPS: Visual indoor positioning using transfer learning

Martin Werner, Carsten Hahn, Lorenz Schauer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

15 Scopus citations

Abstract

Finding the location of a mobile user is a classical and important problem in pervasive computing, because location reveals much about a person's situation, which adaptive computer systems can exploit. While location can be inferred outside buildings with GPS or similar satellite systems, these signals are unavailable indoors. A large number of methods have been proposed to overcome this limitation and provide indoor location to mobile devices such as smartphones. In this paper, we propose DeepMoVIPS, a novel visual indoor positioning system that exploits the image classification power of deep convolutional neural networks for symbolic indoor geolocation. We further show how to transfer visual features from deep-learned networks to the application domain, and we report encouraging results of more than 95% classification accuracy on datasets modelling work environments with 16 rooms, evaluated over a time frame of four weeks.
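The abstract does not detail the pipeline, but the general transfer-learning pattern it describes (a pretrained deep convolutional network used as a fixed visual feature extractor, with the transferred features feeding a room classifier) can be sketched as follows. This is a hypothetical illustration under stated assumptions, not the authors' implementation: the choice of AlexNet via torchvision, penultimate-layer features, and a linear SVM are all assumptions made for the sketch.

```python
# Hypothetical sketch: CNN transfer learning for symbolic indoor positioning.
# A network pretrained on ImageNet supplies visual features; a lightweight
# classifier maps those features to symbolic room labels.
import torch
import torchvision.models as models
import torchvision.transforms as transforms
from PIL import Image
from sklearn.svm import LinearSVC

# Pretrained ImageNet network, used purely as a feature extractor
# (requires torchvision >= 0.13 for the weights API).
cnn = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
cnn.eval()

# Standard ImageNet preprocessing.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def extract_features(image_path: str) -> torch.Tensor:
    """Return penultimate-layer activations (4096-dim) for one image."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        feats = cnn.features(x)
        feats = cnn.avgpool(feats).flatten(1)
        # Drop the final 1000-way ImageNet layer; keep the transferred
        # fully-connected features.
        feats = cnn.classifier[:-1](feats)
    return feats.squeeze(0)

# Train a room classifier on the transferred features.
# room_images is hypothetical data: a list of (image_path, room_label) pairs.
# X = torch.stack([extract_features(p) for p, _ in room_images]).numpy()
# y = [label for _, label in room_images]
# clf = LinearSVC().fit(X, y)          # predict rooms with clf.predict(...)
```

With features frozen, only the small classifier needs training, which is why such a system can reach high accuracy from a modest number of labelled images per room.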

Original language: English
Title of host publication: 2016 International Conference on Indoor Positioning and Indoor Navigation, IPIN 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781509024254
DOIs
State: Published - 14 Nov 2016
Externally published: Yes
Event: 2016 International Conference on Indoor Positioning and Indoor Navigation, IPIN 2016 - Madrid, Spain
Duration: 4 Oct 2016 - 7 Oct 2016

Publication series

Name: 2016 International Conference on Indoor Positioning and Indoor Navigation, IPIN 2016

Conference

Conference: 2016 International Conference on Indoor Positioning and Indoor Navigation, IPIN 2016
Country/Territory: Spain
City: Madrid
Period: 4/10/16 - 7/10/16
