Human-Computer Interaction

Interaction SuitCase


Description

Background

The Behavioral Framework for immersive Technologies (BehaveFiT) is a guide for creating immersive interventions. It presents four main factors that can be manipulated in VR environments to promote attitudinal and behavioral change (Wienrich et al., 2020). One of these manipulable factors is the virtual objects themselves: their appearance, interactivity, and so on. Object integration in VR is almost limitless and lends itself beautifully to foreign language learning. In an AR application, Yang and Liao enabled students to translate, rotate, scale, and modify virtual objects in 3D using intuitive hand gestures, which allowed cultural content to be visualized quickly and easily, and thus paraphrased (Yang and Liao, 2014). However, when a large number of objects are presented at the same time, e.g. in a suitcase, some objects automatically end up on top and fall more readily into the field of vision.
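The translate, rotate, and scale manipulations mentioned above are conventionally expressed as 4x4 homogeneous transform matrices. The following is a minimal sketch of that idea in plain NumPy; the function names and the composition order are our own illustrative assumptions, not part of Yang and Liao's system:

```python
import numpy as np

def translation(tx, ty, tz):
    """4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [tx, ty, tz]
    return m

def rotation_z(theta):
    """4x4 rotation about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    m = np.eye(4)
    m[0, 0], m[0, 1] = c, -s
    m[1, 0], m[1, 1] = s, c
    return m

def scaling(sx, sy, sz):
    """4x4 non-uniform scaling matrix."""
    return np.diag([sx, sy, sz, 1.0])

# Compose scale, then rotation, then translation (applied right-to-left).
model = translation(1, 0, 0) @ rotation_z(np.pi / 2) @ scaling(2, 2, 2)

# Apply the composed transform to one vertex in homogeneous coordinates:
# (1,0,0) is scaled to (2,0,0), rotated to (0,2,0), then moved to (1,2,0).
v = np.array([1.0, 0.0, 0.0, 1.0])
print(model @ v)
```

Engines such as Unity expose the same operations through their scene-graph transforms; the matrix form is simply what runs underneath a gesture-driven manipulation.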

In this project, we want to test different ways of presenting and interacting with virtual objects.

Tasks

Prerequisites

Required:


Contact Persons at the University Würzburg

Rebecca Hein (Primary Contact Person)
Mensch-Computer-Interaktion, Universität Würzburg
rebecca.hein@uni-wuerzburg.de

Carolin Wienrich
Mensch-Computer-Interaktion, Universität Würzburg
carolin.wienrich@uni-wuerzburg.de

Marc Erich Latoschik
Mensch-Computer-Interaktion, Universität Würzburg
marc.latoschik@uni-wuerzburg.de

References

Wienrich, C., Döllinger, N. I., and Hein, R. (2020). Mind the Gap: A Framework (Behave-FIT) Guiding the Use of Immersive Technologies in Behavior Change Processes. arXiv preprint arXiv:2012.10912.

Yang, M. T., and Liao, W. C. (2014). Computer-assisted culture learning in an online augmented reality environment based on free-hand gesture interaction. IEEE Transactions on Learning Technologies, 7(2), 107-117.

Legal Information