Human-Computer Interaction

Research Projects

ViTraS - Virtual Reality Therapy by Stimulation of Modulated Body Perception
ViTraS develops therapy methods based on controlled modulation of body perception and behaviour patterns with the help of current Virtual and Augmented Reality (VR/AR) technologies.
GAL - Generating asemic languages using deep learning
Winner of the Hochschulwettbewerb in the German Science Year 2019 - Artificial Intelligence. The project is dedicated to the generation of languages without semantic meaning.
ViLeArn
The ViLeArn project develops and researches situated virtual learning environments based on presence and social interaction.
ILAST - Immersive Leg Coordination And Strength Therapy
ILAST is a prototype of an immersive Virtual Reality (VR) training system for post-operative therapy after knee injuries. The training system uses basic, game-like experiences simulated in fully immersive virtual environments that motivate patients to perform dedicated movement tasks to mobilize and strengthen their lower limbs.
VIRTUALTIMES
The sense of time co-constitutes our subjective experience and embodied self-consciousness. It comprises the dimensions of the passage of time (time passing by) and the structure of time (the serial order of events). The project systematically enriches virtual sceneries both physically (objects) and socially (interaction partners); in the realm of time-based interventions, this allows the passage and structure of time to be manipulated.
Interactive OPERA
Interactive virtual environments for Objective evaluation of Psychophysical Effects based on bRain Activity.
Digitalisierungszentrum Präzisions- und Telemedizin (DZ.PTM)
This project establishes the DZ.PTM Würzburg–Bad Kissingen as a virtual centre with three locations that develops, tests, and implements digitisation projects across Bavaria to support patient care and research.
VR Gait
This project investigates how Virtual Reality can support the motor rehabilitation of patients suffering from the consequences of stroke.
Homecoming
An Immersive and Gamified Virtual Reality Rehabilitation System to Increase Motivation During Gait Rehabilitation
Embodiment Lab
Embodied multimodal interfaces and digital humans are of increasing interest for novel human-computer interface paradigms. The EmbodimentLab establishes a Bavarian competence center for the creative development of related application use cases and the necessary technology.
Holopark - Large Scale VR Museum
Investigation of the opportunities of large-scale VR systems with high numbers of users.
Breaking Bad Behaviors
A New Tool for Learning Classroom Management Using Virtual Reality
Interactive Memories (InterMem)
InterMem explores the usefulness of multimodal and multimedia interfaces with increased perceptual coupling to strengthen the positive effects of biography work with patients suffering from dementia.
Interpersonal Synchronization in Virtual Realities
The project explores aspects of social interactions, behavioral patterns, and hybrid avatar-agent systems in Virtual Realities.
HistStadt4D
HistStadt4D ('Multimodal access to historic image repositories to support the research and communication of city and architectural history') is a BMBF-funded junior research group. The group investigates and develops methodical and technological approaches to merge, structure, and annotate images in media repositories together with additional information related to their place and time.
Dimensions of Virtual Body Ownership
The project investigates how users perceive virtual bodies in immersive and semi-immersive virtual environments.
Augmenting Social Behavior
The project explores the plasticity of communicative social behavior and its augmentation. We aim to find out how future interactions in social virtual and augmented realities can be supported by technology.
GEtiT
GEtiT (Gamified Training Environment for Affine Transformations) provides interactive, gamified 3D training in affine transformations: users have to apply their knowledge of affine transformations to solve challenging puzzles presented in an immersive and intuitive 3D environment.
XRoads
XRoads explores novel and multimodal interaction techniques for tabletop games. It focuses on Mixed Reality platforms that combine touch, speech, and gestures as input modalities for turn-based and real-time strategy games.
GIB MIR
Multimodal interfaces (MMIs) are a promising alternative human-computer interaction paradigm. They are feasible for a wide range of environments and are especially well suited when interactions are spatially and temporally grounded in an environment in which the user is (physically) situated, such as virtual reality, mixed reality, human-robot interaction, and computer games.
CaveUDK
CaveUDK is a high-level VR middleware based on one of the most successful commercial game engines: the Unreal® Engine 3.0 (UE3). It is a VR framework implemented as an extension to the Unreal® Development Kit (UDK) supporting CAVE-like installations.
Simulator X
Simulator X is a research testbed for novel software techniques and architectures for Real-Time Interactive Systems in VR, AR, MR, and computer games. It uses the central concept of semantic reflection based on a highly concurrent actor model to build intelligent multimodal graphics systems.
SEARIS
SEARIS (Software Techniques and Architectures for Real-Time Interactive Systems) is an international research collaboration founded in 2007. Its goal is to advance the field of RIS software engineering.
SIRIS
SIRIS (Semantic Reflection for Intelligent Realtime Interactive Systems) is a research project exploring novel software architectures for Virtual, Augmented, and Mixed Reality, computer games, and similar domains.
Games and Interactive Media
A research and education collaboration from the time at the HTW Berlin. The project now continues in several new activities.
SCIVE
SCIVE (Simulation Core for Intelligent Virtual Environments) explores software techniques combining Artificial Intelligence (AI) methods with Virtual and Augmented Reality (VR/AR).
PASION
PASION (Psychological Augmented Social Interaction Over Networks) explores communication and collaboration in social groups using immersive and mobile displays augmented by implicit communication signals (e.g., from biosensors).
AI & VR Lab
The AI & VR Lab of Bielefeld University founded by Prof. Wachsmuth and headed by Prof. Latoschik hosted several novel projects in the area of intelligent graphics and intelligent Virtual Environments.
Virtuelle Werkstatt (Virtual Workshop)
The project's goal is the development of a demonstration platform for Virtual-Reality-based prototyping using multimodal (gesture and speech) interaction metaphors.
MAX
We have contributed to the MAX project (Multimodal Assembly eXpert) by porting it to an immersive environment and implementing multimodal speech and gesture input.
DEIKON
DEIKON (DEixis In KonstruktionsDialogen, deixis in construction dialogues), part of the SFB 360, explores the use of deictic expressions in gesture and speech as input methods for construction scenarios.
Virtual Constructor
We have contributed to the Virtual Constructor project by developing an immersive renderer supporting multimodal input.
SGIM
SGIM (Speech and Gesture Interfaces for Multimedia) develops techniques for communicating with multimedia systems through the detection and interpretation of a user's verbal (speech) and coarse gestural input.
VIENA
VIENA (Virtual Environments and Agents) explored the use of a multi-agent system to capture natural language and coarse pointing gestures for interacting with an interior design application.