Semantic Texture for Robust Dense Tracking

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

14 Scopus citations

Abstract

We argue that robust dense SLAM systems can make valuable use of the layers of features coming from a standard CNN as a pyramid of 'semantic texture' which is suitable for dense alignment while being much more robust to nuisance factors such as lighting than raw RGB values. We use a straightforward Lucas-Kanade formulation of image alignment, with a schedule of iterations over the coarse-to-fine levels of a pyramid, and simply replace the usual image pyramid with the hierarchy of convolutional feature maps from a pre-trained CNN. The resulting dense alignment is much more robust to lighting and other variations, as we show by camera rotation tracking experiments on time-lapse sequences captured over many hours. Looking towards the future of scene representation for real-time visual SLAM, we further demonstrate that selecting, by simple criteria, a small subset of the features output by a CNN gives tracking that is just as accurate but much more efficient.
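The alignment scheme the abstract describes (coarse-to-fine Lucas-Kanade iterations over a pyramid of multi-channel feature maps) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it estimates a 2-D translation only, the synthetic multi-channel maps merely stand in for the convolutional feature maps of a pre-trained CNN, and all function names (`warp`, `pyramid`, `lk_level`, `track`) are hypothetical.

```python
import numpy as np

def warp(F, dx, dy):
    """Bilinearly sample a feature map F (H, W, C) at coordinates shifted by (dx, dy)."""
    H, W, _ = F.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(float)
    xs = np.clip(xs + dx, 0, W - 1)
    ys = np.clip(ys + dy, 0, H - 1)
    x0 = np.floor(xs).astype(int); y0 = np.floor(ys).astype(int)
    x1 = np.minimum(x0 + 1, W - 1); y1 = np.minimum(y0 + 1, H - 1)
    wx = (xs - x0)[..., None]; wy = (ys - y0)[..., None]
    return ((1 - wy) * ((1 - wx) * F[y0, x0] + wx * F[y0, x1])
            + wy * ((1 - wx) * F[y1, x0] + wx * F[y1, x1]))

def pyramid(F, levels):
    """2x average-pooling pyramid, coarsest level first (stand-in for CNN feature maps)."""
    pyr = [F]
    for _ in range(levels - 1):
        F = 0.25 * (F[0::2, 0::2] + F[1::2, 0::2] + F[0::2, 1::2] + F[1::2, 1::2])
        pyr.append(F)
    return pyr[::-1]

def lk_level(F_ref, F_cur, p, iters=30, margin=4):
    """Gauss-Newton Lucas-Kanade refinement of a translation p = (dx, dy) on one level."""
    sl = np.s_[margin:-margin, margin:-margin]      # ignore border pixels
    gy, gx = np.gradient(F_ref, axis=(0, 1))        # reference-map gradients
    J = np.stack([gx[sl].ravel(), gy[sl].ravel()], axis=1)
    Hmat = J.T @ J                                  # 2x2 normal-equations matrix
    for _ in range(iters):
        r = (warp(F_cur, *p) - F_ref)[sl].ravel()   # residual over pixels and channels
        p = p - np.linalg.solve(Hmat, J.T @ r)      # Gauss-Newton update
    return p

def track(F_ref, F_cur, levels=3):
    """Coarse-to-fine schedule: solve at the coarsest level, upscale, refine."""
    p = np.zeros(2)
    for i, (R, C) in enumerate(zip(pyramid(F_ref, levels), pyramid(F_cur, levels))):
        if i:
            p = 2.0 * p                             # translation doubles per finer level
        p = lk_level(R, C, p)
    return p

# Synthetic 4-channel "feature maps" and a known sub-pixel shift to recover.
ys, xs = np.mgrid[0:64, 0:64]
F_ref = np.stack([np.sin(xs / 7.0 + c) * np.cos(ys / 5.0 - c) for c in range(4)], axis=-1)
true_shift = np.array([3.4, -2.1])                  # (dx, dy) in pixels
F_cur = warp(F_ref, -true_shift[0], -true_shift[1])
p_est = track(F_ref, F_cur)
print(p_est)
```

Because every channel contributes rows to the same least-squares system, richer feature maps simply add constraints to the same 2-parameter solve; the paper's feature-selection result corresponds to keeping only a subset of channels.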

Original language: English
Title of host publication: Proceedings - 2017 IEEE International Conference on Computer Vision Workshops, ICCVW 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 851-859
Number of pages: 9
ISBN (Electronic): 9781538610343
DOIs
State: Published - 19 Jan 2018
Externally published: Yes
Event: 16th IEEE International Conference on Computer Vision Workshops, ICCVW 2017 - Venice, Italy
Duration: 22 Oct 2017 - 29 Oct 2017

Publication series

Name: Proceedings - 2017 IEEE International Conference on Computer Vision Workshops, ICCVW 2017
Volume: 2018-January

Conference

Conference: 16th IEEE International Conference on Computer Vision Workshops, ICCVW 2017
Country/Territory: Italy
City: Venice
Period: 22/10/17 - 29/10/17
