U-HAR

Johannes Meyer, Adrian Frank, Thomas Schlebusch, Enkelejda Kasneci

Research output: Contribution to journal › Article › peer-review

6 Scopus citations

Abstract

After the success of smartphones and smartwatches, smart glasses are expected to be the next smart wearable. While novel display technology allows content to be seamlessly embedded into the field of view (FOV), current interaction methods for smart glasses require active user input, which limits the user experience. One way to improve this and drive immersive augmentation is to reduce user interactions to a necessary minimum by adding context awareness to smart glasses. For this, we propose an approach based on human activity recognition that incorporates features derived from the user's head and eye movements. Towards this goal, we combine a commercial eye tracker and an IMU to capture eye- and head-movement features of 7 activities performed by 20 participants. From a methodological perspective, we introduce U-HAR, a convolutional network optimized for activity recognition. By applying few-shot learning, our model reaches a macro-F1 score of 86.59%, allowing us to derive contextual information.
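The abstract does not describe the U-HAR architecture itself, so the sketch below is only a rough illustration of the general setup it outlines: a small 1D convolutional classifier over windowed head- and eye-movement feature channels, evaluated with a macro-F1 score. All names, channel counts, window lengths, and layer sizes (SimpleHARNet, n_channels, etc.) are illustrative assumptions, not the authors' implementation, and the few-shot adaptation mentioned in the abstract is not shown.

    import torch
    import torch.nn as nn
    from sklearn.metrics import f1_score

    class SimpleHARNet(nn.Module):
        # Hypothetical stand-in for U-HAR: the abstract only states that U-HAR is a
        # convolutional network for activity recognition. Channel counts, kernel
        # sizes, and the window length below are illustrative assumptions.
        def __init__(self, n_channels: int = 8, n_classes: int = 7):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.MaxPool1d(2),
                nn.Conv1d(32, 64, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),  # global average pooling over time
            )
            self.classifier = nn.Linear(64, n_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, n_channels, window), e.g. stacked IMU and gaze feature channels
            z = self.features(x).squeeze(-1)   # (batch, 64)
            return self.classifier(z)          # unnormalized scores, one per activity

    model = SimpleHARNet()
    windows = torch.randn(16, 8, 128)          # 16 windows, 8 feature channels, 128 samples each
    pred = model(windows).argmax(dim=1)        # predicted activity per window
    true = torch.randint(0, 7, (16,))          # placeholder ground-truth labels
    print(f1_score(true.numpy(), pred.numpy(), average="macro"))  # macro-F1, the metric reported in the paper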

Original language: English
Article number: 143
Journal: Proceedings of the ACM on Human-Computer Interaction
Volume: 6
Issue number: ETRA
DOIs
State: Published - May 2022
Externally published: Yes

Keywords

  • context awareness
  • head and eye movements
  • human activity recognition
  • smart glasses
  • ubiquitous computing
