Human-Computer Interaction

Prof. Dr. Marc Erich Latoschik

Head of Chair

Contact Details

Prof. Dr. Marc Erich Latoschik
Human-Computer Interaction
Universität Würzburg
Am Hubland
D-97074 Würzburg

✆	+49 (0) 931 31 85871
⌂	Room 1.021, Building M1, Hubland South

Consultation Hours
by appointment
PGP Key Fingerprint
08 4E 7A 2B 8E 9C FE 5E 78 A5 EA 01 67 6C 36 23

About me

Marc heads the chair for Human-Computer Interaction (HCI) at the University of Würzburg. The chair spans a broad area of HCI topics and research groups, from immersive and interactive systems, games engineering, and media informatics to digital media processing.

Research Questions

How will we interact with current and future computing systems?

How do design choices, implemented metaphors, and technological finesse affect users, both individually and socially?

In short: How do we build good interactive systems, and how do we define good from this perspective?

Details

Marc has worked on these questions throughout his career, aiming to create technology that is helpful and enjoyable for people. He studied mathematics and computer science at the University of Paderborn, the New York Institute of Technology, and Bielefeld University. Marc has an extensive background in the computer industry, where he worked on various, at the time entirely novel, IT topics, e.g., interfaces for optical storage media and multimedia applications, for a variety of large corporations and key players in the IT business. In 1996 he decided to devote all his time to his research questions and joined academia. He headed the AI & VR Lab at Bielefeld University until 2007, became a professor for media informatics at the University of Applied Sciences (HTW) in Berlin, and founded the Intelligent Graphics Group at Bayreuth University in 2009 before finally taking over the HCI chair at Würzburg University in 2011.

Marc’s work has a strong interdisciplinary background. He combines methods and approaches from artificial intelligence, real-time interactive systems, 3D graphics, cognitive sciences, and psychology on top of a strong engineering foundation in computer science. Since his seminal work on multimodal–gesture and speech–interaction in Virtual Reality in the late 1990s, he has been interested in highly interactive and immersive interfaces of Virtual, Augmented, and Mixed Reality (VR, AR, MR). Current research topics focus on embodiment, e.g., of avatars and agents, body ownership and the Proteus effect, time perception, social VR, multimodal input/output, gamification, technical characteristics of, and engineering solutions for, VR/AR, and application areas for therapy (e.g., of neurological deficits or obesity disorders), training, learning, and entertainment.

Marc is an active member of the scientific community. He is a co-founder of, and was the elected spokesman for, the GI special interest group on VR/AR, is a member of the ACM, IEEE, and GI, and serves on numerous program and organizing committees for prestigious conferences and journals, as well as on reviewing panels for various scientific selection processes. He has published more than 160 research articles (see below) in high-ranked journals (e.g., TVCG, Frontiers) and conferences (e.g., IEEE VR, VRST, ISMAR, CHI, ICMI, SUI). Several of his works have won prestigious awards, and his research has repeatedly been funded by research agencies, e.g., the Federal State of Bavaria, the German Federal Ministry of Education and Research (BMBF), the German Research Foundation (DFG), and the European Commission.

In his ever-decreasing spare time, Marc loves all kinds of sports and enjoys reading, e.g., books by Stephenson, Stross, Boyle, Murakami, Voltaire, Melville, and many more. He likes to explore new tech, and very much enjoys a good conversation on exciting and enlightening topics over a glass of wine or a cold beer. Oh, one more thing: dogs and rock’n’roll!



Projects

ViTraS - Virtual Reality Therapy by Stimulation of Modulated Body Perception
ViTraS develops therapy methods based on controlled modulation of body perception and behaviour patterns with the help of current Virtual and Augmented Reality (VR/AR) technologies.
ViLeArn
The ViLeArn project develops and researches situated virtual learning environments based on presence and social interaction.
VIRTUALTIMES
The sense of time co-constitutes our subjective experience and embodied self-consciousness. It refers to the dimensions of passage of time (time passing by) and structure of time (serial order of events). We will systematically enrich the scenery both physically (objects) and socially (interaction partners). In the realm of time-based interventions, this will make it possible to manipulate the passage of time and the structure of time.
Interactive OPERA
Interactive virtual environments for Objective evaluation of Psychophysical Effects based on bRain Activity.
Digitalisierungszentrum Präzisions- und Telemedizin (DZ.PTM)
This project serves to establish the structure of the DZ.PTM Würzburg–Bad Kissingen as a virtual centre with three locations designated to develop, test and implement digitisation projects across Bavaria to support patient care and research.
VR Gait
This project investigates how Virtual Reality can support the motor rehabilitation of patients suffering from the consequences of stroke.
Homecoming
An Immersive and Gamified Virtual Reality Rehabilitation System to Increase Motivation During Gait Rehabilitation
Embodiment Lab
Embodied multimodal interfaces and digital humans are becoming increasingly interesting for novel human-computer interface paradigms. The Embodiment Lab establishes a Bavarian competence center for the creative development of related application use cases and the necessary technology involved.
Holopark - Large Scale VR Museum
Investigation of the opportunities of large-scale VR systems with high numbers of users.
Breaking Bad Behaviors
A New Tool for Learning Classroom Management Using Virtual Reality
Dimensions of Virtual Body Ownership
Users experience the perception of virtual bodies in immersive and semi-immersive virtual environments.
Augmenting Social Behavior
The project explores the plasticity of communicative social behavior and its augmentation. We aim at finding out how future interactions in social virtual- and augmented realities can be supported by technology.
GEtiT
GEtiT (Gamified Training Environment for Affine Transformations) achieves an interactive gamified 3D-training of affine transformations by requiring users to apply their affine transformation knowledge in order to solve challenging puzzles presented in an immersive and intuitive 3D environment.
XRoads
XRoads explores novel and multimodal interaction techniques for tabletop games. It focuses on Mixed Reality platforms that combine touch, speech, and gestures as input modalities for turn-based and real-time strategy games.
GIB MIR
Multimodal interfaces (MMIs) are a promising alternative human-computer interaction paradigm. They are feasible for a wide range of environments, yet they are especially suited if interactions are spatially and temporally grounded in an environment in which the user is (physically) situated, such as virtual reality, mixed reality, human-robot interaction, and computer games.
SIRIS
SIRIS (Semantic Reflection for Intelligent Realtime Interactive Systems) is a research project that explores novel software architectures for Virtual, Augmented, and Mixed Reality, computer games, and similar domains.
Simulator X
Simulator X is a research testbed for novel software techniques and architectures for Real-Time Interactive Systems in VR, AR, MR, and computer games. It uses the central concept of semantic reflection based on a highly concurrent actor model to build intelligent multimodal graphics systems.
SEARIS
SEARIS (Software Techniques and Architectures for Real-Time Interactive Systems) is an international research collaboration founded in 2007. Its goal is to advance the field of RIS software engineering.
PASION
PASION (Psychological Augmented Social Interaction Over Networks) explores communication and collaboration for social groups using immersive and mobile displays augmented by implicit communication signals (e.g., from biosensors).
SCIVE
SCIVE (Simulation Core for Intelligent Virtual Environments) explores software techniques combining Artificial Intelligence (AI) methods with Virtual and Augmented Reality (VR/AR).
Virtuelle Werkstatt (Virtual Workplace)
The project's goal is the development of a demonstration platform for Virtual-Reality-based prototyping using multimodal (gesture and speech) interaction metaphors.
DEIKON
DEIKON (DEixis In KonstruktionsDialogen), part of the SFB 360, explores the utilization of deictic expressions in gesture and speech as input methods for construction scenarios.
MAX
We have contributed to the MAX project (Multimodal Assembly eXpert) by a port to an immersive environment and the implementation of multimodal speech and gesture input.
Virtual Constructor
We have contributed to the Virtual Constructor project by developing an immersive renderer supporting multimodal input.
SGIM
SGIM (Speech and Gesture Interfaces for Multimedia) develops techniques for communicating with multimedia systems through the detection and interpretation of a user's verbal (speech) and coarse gestural input.
AI & VR Lab
The AI & VR Lab of Bielefeld University founded by Prof. Wachsmuth and headed by Prof. Latoschik hosted several novel projects in the area of intelligent graphics and intelligent Virtual Environments.
VIENA
VIENA (Virtual Environments and Agents) explored the use of a multi-agent system to capture natural language and coarse pointing gestures to interact with an interior design application.


Publications
