Human-Computer Interaction

Overview

Multimodal interfaces (MMIs) are a promising alternative human-computer interaction paradigm. They are feasible for a wide range of environments, yet they are especially well suited if interactions are spatially and temporally grounded in an environment in which the user is (physically) situated, such as virtual reality, mixed reality, human-robot interaction, and computer games. We use a concurrent Augmented Transition Network (cATN) to implement multimodal interfaces for a broad spectrum of demonstrations and research. It is a successor of the temporal Augmented Transition Network and is implemented in Simulator X.
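The idea behind an ATN-style multimodal parser can be illustrated with a minimal sketch: input tokens from different modalities drive transitions through a small network of states, and a deictic word ("that") is resolved by fusing it with a temporally close pointing gesture. All names, tokens, and the time window below are illustrative assumptions; the actual cATN is implemented in Scala on top of Simulator X.

```python
# Input tokens: (modality, value, timestamp in seconds). Illustrative data.
tokens = [
    ("speech", "paint", 0.0),
    ("speech", "that", 0.4),
    ("gesture", "point:object-42", 0.5),  # deictic gesture resolving "that"
    ("speech", "yellow", 0.9),
]

def parse_paint_command(tokens):
    """Walk a tiny transition network:
    START -(verb)-> OBJECT -(deictic word + gesture)-> COLOR -(color)-> DONE."""
    state = "START"
    command = {}
    pending_deictic = None  # timestamp of an unresolved deictic word
    for modality, value, t in tokens:
        if state == "START" and modality == "speech" and value == "paint":
            command["action"] = "paint"
            state = "OBJECT"
        elif state == "OBJECT" and modality == "speech" and value == "that":
            pending_deictic = t  # wait for a gesture to resolve the referent
        elif state == "OBJECT" and modality == "gesture" and pending_deictic is not None:
            # Temporal fusion: the gesture must occur close in time
            # to the deictic word (1-second window assumed here).
            if abs(t - pending_deictic) < 1.0:
                command["target"] = value.split(":")[1]
                state = "COLOR"
        elif state == "COLOR" and modality == "speech":
            command["color"] = value
            state = "DONE"
    return command if state == "DONE" else None

print(parse_paint_command(tokens))
# → {'action': 'paint', 'target': 'object-42', 'color': 'yellow'}
```

A real cATN additionally runs such networks concurrently, handles probabilistic recognizer output, and backtracks on failed transitions; this sketch only shows the core state-transition and temporal-fusion idea.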

Demos

Robot Museum - A Multimodal Virtual Reality Game
We are happy to present a demonstration of David Heidrich's scientific internship, implemented during the summer semester of 2018.
Space Tentacle - A Multimodal Adventure Game
We are happy to present a demonstration by Chris Zimmerer and Dr. Martin Fischbach.
Big Bang - A Multimodal VR Universe Builder
We are happy to present a proof of concept demonstration of Chris Zimmerer's master thesis.
Quest V2 Prototype Finished
The Quest V2 prototype is a mixed reality tabletop role-playing game with a novel combination of interaction styles and gameplay mechanics.

News

Best Paper Runner-Up at ICMI '19
This year's joint contribution of the HCI chair and the Psychological Ergonomics chair 'Paint that object yellow: Multimodal Interaction to Enhance Creativity During Design Tasks in VR' was awarded as Best Paper Runner-Up. Congratulations to Erik, Sara, Chris, Jean-Luc, and Marc!
Multimodal Interaction Course Results of SS 2019
The semester comes to an end and we are happy to present the results of the Multimodal Interaction course.
HCI Group published in the Multimodal Technologies and Interaction Journal of the MDPI
Chris Zimmerer, Dr. Martin Fischbach and Prof. Dr. Marc Erich Latoschik published their recent work on the development of multimodal interfaces.
Multimodal Interaction Course Results of SS 2018
The semester comes to an end and we are happy to present the results of the Multimodal Interaction course.
Module and Project Results of WS 2017/18
The semester comes to an end and we are happy to present some great results of lecture modules and projects.
HCI Group Presents 1 Journal Paper, 4 Conference Papers, 1 SEARIS Paper, and 4 Posters at IEEE VR 2018 in Reutlingen
This year our group is strongly represented at IEEE VR 2018, the research community's largest VR conference.
German-Japanese Spring School on Human Factors 2018
Here you can find Japanese descriptions of our research demos for the German-Japanese Spring School on Human Factors 2018, a collaboration of the Psychological Ergonomics and HCI chairs.
Module and Project Results of SS 2017
The semester comes to an end and we are happy to present some great results of lecture modules and projects.
Public PhD Thesis Defense by Martin Fischbach
Martin Fischbach publicly defends his thesis on 4 August 2017 at 10:00 in Z6, SE 1.010.
Module and Project Results of WS 2016/17
The semester comes to an end and we are happy to present some great results of lecture modules and projects.
Module, Project and Thesis Results of SS 2016
The semester comes to an end and we are happy to present some great results of lecture modules, projects, and theses.
Module, Project and Thesis Results of WS 2015/16
The semester comes to an end and we are happy to present some great results of lecture modules, projects, and theses.
XRoads Demo at the Mobile Media Day
Berit Barkschat and Sascha Link present the latest results of the ongoing project XRoads at the Mobile Media Day in Würzburg.
Presentation at the ICMI '15 in Seattle
Martin Fischbach will present an extended exposé of his PhD thesis on software techniques for multimodal input processing in Realtime Interactive Systems at the International Conference on Multimodal Interaction in Seattle.

Publications

Thesis and projects

Open

Conception and Development of a Multimodal VR Adventure Game
The aim of this project is to design and implement a multimodal VR adventure game.
Application Programming Interface (API) for designing Multimodal Interfaces
The goal of this project is to create a suitable user interface for developers, i.e., an API, to create natural multimodal interfaces for virtual-, augmented-, and mixed reality.
Mixed Reality Boardgames
XRoads explores novel and multimodal interaction techniques for tabletop games. It focuses on Mixed Reality platforms that combine touch, speech, and gestures as input modalities for turn-based and real-time strategy games.
Speech Input and Output for Virtual Reality (VR) Using the Example of the Unreal Engine
The aim of this thesis is to connect the Microsoft Speech SDK to the system stage. This should enable both the input of text for annotations and the speech output of texts.

Assigned

Closed

Jonas Müller Master Thesis

Adaptive Distributional Word Models for Robust Semantic Information Systems
This HCI master thesis focuses on distributional word models for robust semantic information systems.

Claudia Mehn Master Thesis

Gesture Recognition and Feature Selection
Tracking systems usually extract the positions of body joints and pass them on to a final decision unit, which classifies the data as different gestures. These units can be manually specified templates.
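Template-based classification of this kind can be sketched very compactly: each gesture template is a flat vector of joint coordinates, and the classifier picks the nearest template by Euclidean distance. The templates, joint layout, and values below are purely illustrative assumptions, not data from the thesis.

```python
import math

# Hypothetical gesture templates: flat vectors of joint coordinates,
# e.g. (x, y) positions of two tracked joints. Values are made up.
TEMPLATES = {
    "wave":  [0.0, 1.8, 0.3, 1.9],
    "point": [0.5, 1.4, 0.9, 1.4],
}

def classify(joints):
    """Return the name of the gesture template closest to the observed joints."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda name: dist(TEMPLATES[name], joints))

print(classify([0.05, 1.75, 0.3, 1.85]))  # → wave (closest template)
```

Feature selection, as studied in the thesis, then amounts to choosing which joints (vector components) enter this distance computation so that the templates separate well.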

Sascha Link Master Thesis

Comparison of multimodal fusion methods for real-time interactive systems - towards a general testbed
This thesis proposes a testbed for comparative evaluations of fusion engines and compares two different fusion approaches as proof of concept: temporal augmented transition networks vs. unification.

For Developers

If you are interested in using the cATN for research or teaching, then contact us. The latest version of our cATN is hosted on our institute's GitLab server together with additional material that we use for teaching, such as how-tos and practical exercises. Getting access is as easy as sending your type of interest as well as a mail address for registration to one of the primary contact persons linked at the bottom of this page.

Team

Legal Information