U-HAR

Johannes Meyer, Adrian Frank, Thomas Schlebusch, Enkelejda Kasneci

Publication: Contribution to journal › Article › Peer-reviewed

6 citations (Scopus)

Abstract

After the success of smartphones and smartwatches, smart glasses are expected to be the next smart wearable. While novel display technologies allow seamless embedding of content into the field of view (FOV), interaction methods with smart glasses still require active user input, limiting the user experience. One way to improve this and drive immersive augmentation is to reduce user interactions to the necessary minimum by adding context awareness to smart glasses. For this, we propose an approach based on human activity recognition that incorporates features derived from the user's head and eye movements. Towards this goal, we combine a commercial eye tracker and an IMU to capture eye- and head-movement features of 7 activities performed by 20 participants. From a methodological perspective, we introduce U-HAR, a convolutional network optimized for activity recognition. By applying few-shot learning, our model reaches a macro-F1-score of 86.59%, allowing us to derive contextual information.
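
The sketch below is illustrative only: this record does not describe the U-HAR architecture, window length, or feature set, so the channel counts and layer sizes are assumptions. It merely demonstrates the general idea named in the abstract, a 1D-convolutional classifier over multichannel windows of head- (IMU) and eye-movement signals, evaluated with a macro-F1-score.

# Illustrative sketch only -- not the authors' implementation of U-HAR.
# Channel counts, window length, and layer sizes below are assumed.
import torch
import torch.nn as nn
from sklearn.metrics import f1_score

NUM_CHANNELS = 8   # assumed: 6 IMU axes (accel + gyro) + 2 gaze coordinates
NUM_CLASSES = 7    # 7 activities, as stated in the abstract
WINDOW_LEN = 128   # assumed number of samples per sliding window

class SimpleHARNet(nn.Module):
    """Minimal 1D-convolutional classifier over multichannel time windows."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(NUM_CHANNELS, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # global average pooling over time
        )
        self.classifier = nn.Linear(64, NUM_CLASSES)

    def forward(self, x):              # x: (batch, channels, time)
        h = self.features(x).squeeze(-1)
        return self.classifier(h)

# Forward pass on a random batch and macro-F1 against random labels,
# mirroring the evaluation metric reported in the abstract.
model = SimpleHARNet()
windows = torch.randn(16, NUM_CHANNELS, WINDOW_LEN)
labels = torch.randint(0, NUM_CLASSES, (16,))
preds = model(windows).argmax(dim=1)
print(f1_score(labels.numpy(), preds.numpy(), average="macro"))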

Original language: English
Article number: 143
Journal: Proceedings of the ACM on Human-Computer Interaction
Volume: 6
Issue number: ETRA
DOIs
Publication status: Published - May 2022
Externally published: Yes
