Abstract
The aim of the presented work is to build a generic human-computer interface construction kit for potential future applications. The concept permits rapid prototyping, or even final implementations, with either a basic set of interaction concepts or adapted solutions. Multimodal control is an integrative factor. The basic input modalities are natural speech and haptic control via touch gestures or conventional devices. Output is primarily provided on a two-dimensional display, supplemented by speech. Future modalities can be connected via a defined communication protocol. An interface description language (IDL) helps to build an interface model, an intention model, and an output model for each interface. A special feature is the ability to profile the user and to integrate contextual system knowledge and external knowledge online. The acquired data is used to constrain the hypothesis space of the recognition engines, yielding more robust recognition results. First interfaces have been built at our institute and have proven robust. The high acceptance among initial test subjects calls for further research in this area.
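The paper itself does not specify how the contextual and user-profile knowledge constrains the recognizers, so the following is only a minimal illustrative sketch of that core idea in Python: re-weighting a speech recognizer's hypothesis list with the currently active intents (from the intention model) and the profiled user vocabulary. All names (`Hypothesis`, `rescore`, `active_intents`, `user_vocabulary`) are hypothetical and not taken from the paper.

```python
# Hypothetical sketch: constraining a recognizer's hypothesis space
# with online contextual knowledge, as described in the abstract.
# Names and structure are illustrative only, not from the paper.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    text: str        # recognized utterance candidate
    score: float     # recognizer confidence in [0, 1]

def rescore(hypotheses, active_intents, user_vocabulary, boost=0.3):
    """Re-weight recognition hypotheses with contextual knowledge.

    Candidates matching the currently possible intents or the user's
    profiled vocabulary are boosted; all others keep their raw score.
    """
    rescored = []
    for h in hypotheses:
        score = h.score
        if any(intent in h.text for intent in active_intents):
            score += boost            # fits the dialogue context
        if any(word in user_vocabulary for word in h.text.split()):
            score += boost / 2        # matches known user phrasing
        rescored.append(Hypothesis(h.text, min(score, 1.0)))
    return sorted(rescored, key=lambda h: h.score, reverse=True)

# Usage: the top hypothesis after rescoring drives the dialogue.
hyps = [Hypothesis("open the menu", 0.55),
        Hypothesis("hope and then you", 0.60)]
best = rescore(hyps, active_intents={"open", "close"},
               user_vocabulary={"menu"})[0]
print(best.text)  # -> "open the menu"
```

In this sketch the contextually plausible candidate overtakes an acoustically higher-scored but implausible one, which is one plausible reading of how constraining the hypothesis space yields more robust recognition.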
| Translated title of the contribution | Integrative rapid-prototyping for multimodal user interfaces |
|---|---|
| Original language | German |
| Pages (from-to) | 279-284 |
| Number of pages | 6 |
| Journal | VDI Berichte |
| Issue number | 1678 |
| State | Published - 2002 |