Uncertainty-based Human Motion Tracking with Stable Gaussian Process State Space Models

Lukas Pöhler, Jonas Umlauft, Sandra Hirche

Research output: Contribution to journal › Article › peer-review

14 Scopus citations

Abstract

Data-driven approaches are well suited to represent human motion because they can capture arbitrarily complex trajectories. Gaussian process state space models allow human motion to be encoded while quantifying the uncertainty due to missing data. Such human motion models are relevant for many application domains, such as learning by demonstration and motion prediction in human-robot collaboration. For goal-directed tasks, it is essential to impose stability constraints on the model representing the human motion. Motivated by learning by demonstration applications, this paper proposes an uncertainty-based control Lyapunov function approach for goal-directed path tracking. We exploit the model fidelity, which is related to the location of the training and test data: our approach actively steers trajectories into regions with more demonstration data and thus higher model certainty. This achieves accurate reproduction of the human motion independently of the initial condition, and we show that the generated trajectories are uniformly globally asymptotically stable. The approach is validated in a nonlinear learning by demonstration task where human-demonstrated motions are reproduced by the learned dynamical system, achieving higher precision than competing state-of-the-art methods.

Original language: English
Pages (from-to): 8-14
Number of pages: 7
Journal: IFAC Proceedings Volumes (IFAC-PapersOnline)
Volume: 51
Issue number: 34
DOIs
State: Published - 1 Jan 2019

Keywords

  • Control under uncertainty
  • Human centered automation
  • Lyapunov methods
  • Modeling of human performance
  • Nonlinear system identification
  • Path tracking
