Integratives Konzept zur prototypischen Implementierung multimodaler Benutzerschnittstellen

Translated title of the contribution: Integrative rapid-prototyping for multimodal user interfaces

B. W. Schuller, M. K. Lang

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

The aim of the presented work is to build a generic construction kit for human-computer interfaces in potential future applications. The concept permits rapid prototyping, or even final implementations, with either a basic set of interaction concepts or adapted solutions. Multimodal control is an integrative factor. The basic input modalities are natural speech and haptic control via touch gestures or conventional devices. Output is given primarily on a two-dimensional display plus speech. Future modalities can be connected via a defined communication protocol. An interface description language called IDL helps to build an interface model, an intention model, and an output model for an interface. A special feature is the ability to profile the user and to integrate contextual system knowledge and external knowledge online. The acquired data is used to constrain the hypothesis space of the recognition instances and yields more robust recognition results. The first interfaces have been produced at our institute and have proven robust. The high acceptance among the first test persons calls for further research in this area.
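The abstract's idea of constraining a recognizer's hypothesis space with contextual and user-profile knowledge can be sketched as a simple rescoring step. This is a minimal illustration under assumed names (`Hypothesis`, `rescore_with_context`, the context prior); the paper's actual mechanism is not specified here.

```python
# Hypothetical sketch: re-weighting recognizer hypotheses with a context
# prior derived from system state and user profile, then renormalizing.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    text: str       # recognized command candidate
    score: float    # recognizer confidence (speech or gesture)

def rescore_with_context(hypotheses, context_prior):
    """Multiply each recognizer score by a contextual prior (unknown
    commands get a small default weight), renormalize, and sort."""
    rescored = [
        Hypothesis(h.text, h.score * context_prior.get(h.text, 0.1))
        for h in hypotheses
    ]
    total = sum(h.score for h in rescored) or 1.0
    rescored = [Hypothesis(h.text, h.score / total) for h in rescored]
    return sorted(rescored, key=lambda h: h.score, reverse=True)

# Example: in a media-player context, "play" is far more plausible
# than the acoustically similar "pay", so context flips the ranking.
hyps = [Hypothesis("pay", 0.55), Hypothesis("play", 0.45)]
prior = {"play": 0.8, "pay": 0.05}
best = rescore_with_context(hyps, prior)[0]
```

Here the acoustically preferred "pay" loses to "play" once the contextual prior is applied, which is the kind of robustness gain the abstract attributes to online contextual knowledge.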

Original language: German
Pages (from-to): 279-284
Number of pages: 6
Journal: VDI Berichte
Issue number: 1678
State: Published - 2002
