1000 Pupil Segmentations in a Second using Haar Like Features and Statistical Learning

Wolfgang Fuhl, Johannes Schneider, Enkelejda Kasneci

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

In this paper, we present a new approach for pupil segmentation. It can be computed and trained very efficiently, making it ideal both for online use with high-speed eye trackers and for energy-saving pupil detection in mobile eye tracking. The approach is inspired by the BORE and CBF algorithms and generalizes their binary pixel comparisons to Haar-like features. Since these features are intrinsically susceptible to noise and fluctuating lighting conditions, we combine them with conditional pupil shape probabilities. In addition, we rank each feature according to its importance in determining the pupil shape. Another advantage of our method is the use of statistical learning, which is very efficient and can even be applied online. https://atreus.informatik.uni-tuebingen.de/seafile/d/8e2ab8c3fdd444e1a135/?p=%2FStatsPupilmode=list.
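For readers unfamiliar with the feature type, the Python sketch below (not taken from the paper) illustrates how a center-surround Haar-like response can be evaluated in constant time per position using an integral image, the property that makes such features cheap enough for kilohertz-rate pupil detection. All function names, window sizes, and the brute-force scan are illustrative assumptions; the published method further weights and ranks the features and combines them with conditional pupil shape probabilities, which are not reproduced here.

import numpy as np

def integral_image(img):
    # Summed-area table; any axis-aligned rectangle sum then costs O(1).
    return np.cumsum(np.cumsum(img.astype(np.int64), axis=0), axis=1)

def rect_sum(ii, r0, c0, r1, c1):
    # Sum of img[r0:r1, c0:c1] (end-exclusive) read from the integral image ii.
    s = ii[r1 - 1, c1 - 1]
    if r0 > 0:
        s -= ii[r0 - 1, c1 - 1]
    if c0 > 0:
        s -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0:
        s += ii[r0 - 1, c0 - 1]
    return s

def center_surround_response(ii, r, c, inner=8, outer=16):
    # Haar-like center-surround feature: mean intensity of the ring around
    # (r, c) minus the mean of the inner box. Large values indicate a dark
    # blob (e.g. a pupil) surrounded by brighter iris/sclera.
    h, w = ii.shape
    r0o, r1o = max(r - outer, 0), min(r + outer, h)
    c0o, c1o = max(c - outer, 0), min(c + outer, w)
    r0i, r1i = max(r - inner, 0), min(r + inner, h)
    c0i, c1i = max(c - inner, 0), min(c + inner, w)
    inner_sum = rect_sum(ii, r0i, c0i, r1i, c1i)
    outer_sum = rect_sum(ii, r0o, c0o, r1o, c1o)
    inner_area = (r1i - r0i) * (c1i - c0i)
    ring_area = (r1o - r0o) * (c1o - c0o) - inner_area
    return (outer_sum - inner_sum) / ring_area - inner_sum / inner_area

# Usage sketch: brute-force scan for the strongest dark-blob response.
eye = (np.random.rand(120, 160) * 255).astype(np.uint8)  # placeholder eye image
ii = integral_image(eye)
candidates = [(r, c) for r in range(16, 104, 4) for c in range(16, 144, 4)]
best = max(candidates, key=lambda rc: center_surround_response(ii, *rc))
print("estimated pupil centre (row, col):", best)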

Original language: English
Title of host publication: Proceedings - 2021 IEEE/CVF International Conference on Computer Vision Workshops, ICCVW 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3459-3469
Number of pages: 11
ISBN (Electronic): 9781665401913
DOIs
State: Published - 2021
Externally published: Yes
Event: 18th IEEE/CVF International Conference on Computer Vision Workshops, ICCVW 2021 - Virtual, Online, Canada
Duration: 11 Oct 2021 - 17 Oct 2021

Publication series

Name: Proceedings of the IEEE International Conference on Computer Vision
Volume: 2021-October
ISSN (Print): 1550-5499

Conference

Conference: 18th IEEE/CVF International Conference on Computer Vision Workshops, ICCVW 2021
Country/Territory: Canada
City: Virtual, Online
Period: 11/10/21 - 17/10/21
