2026
Franziska Westermeier,
Effects of Incongruencies Across the Reality-Virtuality Continuum. PhD thesis, 2026.
@phdthesis{westermeier2026effects,
title = {Effects of Incongruencies Across the Reality-Virtuality Continuum},
author = {Westermeier, Franziska},
year = {2026},
url = {https://doi.org/10.25972/OPUS-43370},
doi = {10.25972/OPUS-43370}
}
Abstract: This dissertation examines the perceptual and cognitive effects of incongruencies in eXtended Reality (XR) experiences along the Reality-Virtuality (RV) continuum, with a particular focus on Virtual Reality (VR) and Video See-Through (VST) Augmented Reality (AR). VST AR integrates video images from front-facing cameras on the Head-Mounted Display (HMD) with virtual content, and XR HMDs capable of VST AR often also include a VR mode. While VR has been extensively studied, VST AR remains underexplored despite rapid advances in camera resolution and rendering techniques. The blending of virtual and real-world elements in VST AR frequently gives rise to perceptual mismatches, such as conflicting depth cues, misaligned virtual objects, and latency discrepancies, which challenge established XR frameworks and may adversely affect user experience.
This dissertation, incorporating five key publications and five empirical experiments, investigates the effects of incongruencies in VR and VST AR by examining both subjective reports and objective behavioral measures. While users may not always consciously detect these mismatches, the empirical findings reveal their significant impact on depth perception, spatial judgments, and performance. A central focus of this work is the application and refinement of the Congruence and Plausibility (CaP) model, which describes how incongruencies operate at different processing levels, from low-level sensory distortions to higher-order cognitive inconsistencies. The results indicate that AR-inherent perceptual incongruencies influence the experience at a subconscious level, challenging existing theoretical frameworks that have primarily focused on visually coherent VR experiences. To further support this understanding, the dissertation introduces a methodological framework for analyzing and predicting the effects of incongruencies, contributing to the development of coherent and immersive XR applications.
The research affirms both the complexity and the promise of VST AR technologies. By revealing how subconscious factors interact with users’ conscious perceptions, this dissertation enriches theoretical understanding and provides strategies for advancing XR research.
2025
Franziska Westermeier, Chandni Murmu, Kristopher Kohm, Christopher Pagano, Carolin Wienrich, Sabarish V. Babu, Marc Erich Latoschik,
Interpupillary to Inter-Camera Distance of Video See-Through AR and its Impact on Depth Perception. In Proceedings of the 32nd IEEE Virtual Reality conference (VR '25), pp. 537-547. 2025.
@inproceedings{westermeier2025interpupillary,
title = {Interpupillary to Inter-Camera Distance of Video See-Through AR and its Impact on Depth Perception},
author = {Westermeier, Franziska and Murmu, Chandni and Kohm, Kristopher and Pagano, Christopher and Wienrich, Carolin and Babu, Sabarish V. and Latoschik, Marc Erich},
booktitle = {Proceedings of the 32nd IEEE Virtual Reality conference (VR '25)},
year = {2025},
pages = {537-547},
url = {https://downloads.hci.informatik.uni-wuerzburg.de/2025-ieeevr-ipd-icd.pdf},
doi = {10.1109/VR59515.2025.00077}
}
Abstract: Interpupillary distance (IPD) is a crucial characteristic of head-mounted displays (HMDs) because it defines an important property for generating stereoscopic parallax, which is essential for correct depth perception. This is why contemporary HMDs offer adjustable lenses to adapt to users' individual IPDs. However, today's Video See-Through Augmented Reality (VST AR) HMDs use fixed camera placements to reconstruct the stereoscopic view of a user's environment. This leads to a potential mismatch between individual IPD settings and the fixed Inter-Camera Distance (ICD), which in turn can lead to perceptual incongruencies, limiting the usability and potentially the applicability of VST AR in depth-sensitive use cases. To investigate this incongruency between IPD and ICD, we conducted an empirical evaluation with a 2x3 mixed-factor design, using a near-field, open-loop reaching task to compare distance judgments in Virtual Reality (VR) and VST AR. We also explored improvements in reaching performance via perceptual calibration by incorporating a feedback phase between pre- and post-phase conditions, with a particular focus on the influence of IPD-ICD differences. Our Linear Mixed Model (LMM) analysis showed a significant difference between VR and VST AR, a significant effect of IPD-ICD mismatch, and a combined effect of both factors. This novel insight and its consequences are discussed specifically for depth perception tasks in AR, eXtended Reality (XR), and potential use cases.
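As geometric background for the IPD-ICD mismatch described above, a minimal sketch under idealized pinhole-stereo assumptions (an illustration, not the model or analysis used in the paper): if a scene is captured by cameras separated by the ICD but viewed by eyes separated by the IPD, the disparity of a point at depth d is triangulated back to a depth of d · IPD / ICD.

```python
def perceived_depth(true_depth_m: float, ipd_m: float, icd_m: float) -> float:
    """Depth an observer with the given IPD would triangulate from
    imagery captured by cameras with the given ICD (all in meters).

    Idealized pinhole-stereo toy model: disparity of a point at depth d
    is proportional to ICD / d, and eyes separated by the IPD triangulate
    that same disparity back to d * IPD / ICD.
    """
    return true_depth_m * ipd_m / icd_m

# A 63 mm IPD viewed through cameras 90 mm apart compresses depth:
# an object 1 m away is triangulated at about 0.70 m.
print(perceived_depth(1.0, 0.063, 0.090))
```

Under this toy geometry, an ICD larger than the IPD compresses near-field distances (and a smaller ICD expands them), which gives one intuition for why a fixed camera baseline can bias reaching judgments; the perceptual effects actually measured in the study are more nuanced.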
Joanna Grause, Larissa Brübach, Franziska Westermeier, Carolin Wienrich, Marc Erich Latoschik,
The Stability of Plausibility and Presence in Claustrophobic Virtual Reality Exposure Therapy. In Proceedings of Mensch und Computer 2025, pp. 181–192. New York, NY, USA: Association for Computing Machinery, 2025.
@inproceedings{grause2025stability,
title = {The Stability of Plausibility and Presence in Claustrophobic Virtual Reality Exposure Therapy},
author = {Grause, Joanna and Brübach, Larissa and Westermeier, Franziska and Wienrich, Carolin and Latoschik, Marc Erich},
booktitle = {Proceedings of Mensch und Computer 2025},
year = {2025},
pages = {181–192},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3743049.3743068},
doi = {10.1145/3743049.3743068}
}
Larissa Brübach, Deniz Celikhan, Lennard Rüffert, Franziska Westermeier, Marc Erich Latoschik, Carolin Wienrich,
When Fear Overshadows Perceived Plausibility: The Influence of Incongruencies on Acrophobia in VR. In Proceedings of the 32nd IEEE Virtual Reality conference (VR '25). IEEE Computer Society, 2025.
@inproceedings{brubach2025overshadows,
title = {When Fear Overshadows Perceived Plausibility: The Influence of Incongruencies on Acrophobia in VR},
author = {Brübach, Larissa and Celikhan, Deniz and Rüffert, Lennard and Westermeier, Franziska and Latoschik, Marc Erich and Wienrich, Carolin},
booktitle = {Proceedings of the 32nd IEEE Virtual Reality conference (VR '25)},
year = {2025},
publisher = {IEEE Computer Society},
url = {https://downloads.hci.informatik.uni-wuerzburg.de/2025-ieeevr-bruebach-height-and-plausibility-preprint.pdf},
doi = {10.1109/VR59515.2025.00089}
}
Abstract: Virtual Reality Exposure Therapy (VRET) has become an effective, customizable, and affordable treatment for various psychological and physiological disorders. It has been used for decades to treat specific anxiety disorders such as acrophobia or arachnophobia. However, to ensure a positive outcome for patients, we must understand and control the effects potentially caused by the technology and medium of Virtual Reality (VR) itself. This article specifically investigates the impact of the Plausibility illusion (Psi), one of the two theorized presence components, on the fear of heights. In two experiments, 30 participants each experienced two different heights with congruent and incongruent object behaviors in a 2 x 2 within-subject design. Results show that the strength of the congruence manipulation plays a significant role: only when incongruencies are strong enough are they recognized by users, specifically in high-fear conditions triggered by exposure to increased heights. If incongruencies are too subtle, they seem to be overshadowed by the stronger fear reactions. Our evidence contributes to recent theories of VR effects and emphasizes the importance of understanding and controlling factors often assumed to be incidental, particularly in VRET designs. Incongruencies should be controlled so that they do not have an unwanted influence on the patient's fear response.
2024
Franziska Westermeier, Larissa Brübach, Carolin Wienrich, Marc Erich Latoschik,
Assessing Depth Perception in VR and Video See-Through AR: A Comparison on Distance Judgment, Performance, and Preference. In IEEE Transactions on Visualization and Computer Graphics, Vol. 30 (5), pp. 2140-2150. 2024.
@article{westermeier2024assessing,
title = {Assessing Depth Perception in VR and Video See-Through AR: A Comparison on Distance Judgment, Performance, and Preference},
author = {Westermeier, Franziska and Brübach, Larissa and Wienrich, Carolin and Latoschik, Marc Erich},
journal = {IEEE Transactions on Visualization and Computer Graphics},
year = {2024},
volume = {30},
number = {5},
pages = {2140-2150},
url = {https://ieeexplore.ieee.org/document/10458408},
doi = {10.1109/TVCG.2024.3372061}
}
Abstract: Spatial User Interfaces along the Reality-Virtuality continuum heavily depend on accurate depth perception. However, current display technologies still exhibit shortcomings in the simulation of accurate depth cues, and these shortcomings vary between Virtual and Augmented Reality (VR, AR; collectively eXtended Reality, XR). This article compares depth perception between VR and Video See-Through (VST) AR. We developed a digital twin of an existing office room in which users had to perform five depth-dependent tasks in VR and VST AR. Thirty-two participants took part in a user study using a 1×4 within-subjects design. Our results reveal higher misjudgment rates in VST AR due to conflicting depth cues between virtual and physical content. Increased head movements observed in participants were interpreted as a compensatory response to these conflicting cues. Furthermore, a longer task completion time in the VST AR condition indicates lower task performance in VST AR. Interestingly, although participants rated the VR condition as easier, and despite the increased misjudgments and lower performance with the VST AR display, a majority still expressed a preference for the VST AR experience. We discuss and explain these findings with the high visual dominance and referential power of the physical content in the VST AR condition, leading to higher spatial presence and plausibility.
Larissa Brübach, Mona Röhm, Franziska Westermeier, Marc Erich Latoschik, Carolin Wienrich,
Manipulating Immersion: The Impact of Perceptual Incongruence on Perceived Plausibility in VR. In 23rd IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE Computer Society, 2024.
IEEE ISMAR Best Paper Nominee 🏆
@inproceedings{brubach2024manipulating,
title = {Manipulating Immersion: The Impact of Perceptual Incongruence on Perceived Plausibility in VR},
author = {Brübach, Larissa and Röhm, Mona and Westermeier, Franziska and Latoschik, Marc Erich and Wienrich, Carolin},
booktitle = {23rd IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
year = {2024},
publisher = {IEEE Computer Society},
note = {IEEE ISMAR Best Paper Nominee 🏆},
url = {https://downloads.hci.informatik.uni-wuerzburg.de/2024-ismar-manipulating-immersion.pdf},
doi = {10.1109/ISMAR62088.2024.00124}
}
Abstract: This work presents a study in which we used incongruencies on the cognitive and the perceptual layer to investigate their effects on perceived plausibility and, thereby, presence and spatial presence. We used a 2x3 within-subject design with the factors familiar size (cognitive manipulation) and immersion (perceptual manipulation). For the different levels of immersion, we implemented three different tracking qualities: rotation-and-translation tracking, rotation-only tracking, and stereoscopic-view-only tracking. Participants scanned products in a virtual supermarket where the familiar size of these objects was manipulated. Simultaneously, they could either move their head normally or had to use the thumbsticks to navigate their view of the environment. Results show that both manipulations had a negative effect on perceived plausibility and, thereby, presence. In addition, the tracking manipulation also had a negative effect on spatial presence. These results are especially interesting in light of the ongoing discussion about the role of plausibility and congruence in evaluating XR environments. They can hardly be explained by traditional presence models, in which immersion should not be an influencing factor for perceived plausibility. However, they agree with the recently introduced Congruence and Plausibility (CaP) model and provide empirical evidence for the model's predicted pathways.
Larissa Brübach, Marius Röhm, Franziska Westermeier, Carolin Wienrich, Marc Erich Latoschik,
The Influence of a Low-Resolution Peripheral Display Extension on the Perceived Plausibility and Presence. In Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology (3). New York, NY, USA: Association for Computing Machinery, 2024.
@inproceedings{brubach2024influence,
title = {The Influence of a Low-Resolution Peripheral Display Extension on the Perceived Plausibility and Presence},
author = {Brübach, Larissa and Röhm, Marius and Westermeier, Franziska and Wienrich, Carolin and Latoschik, Marc Erich},
booktitle = {Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology},
year = {2024},
number = {3},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3641825.3687713},
doi = {10.1145/3641825.3687713}
}
Abstract: The Field of View (FoV) is a central technical display characteristic of Head-Mounted Displays (HMDs), which has been shown to have a notable impact on important aspects of the user experience. For example, an increased FoV has been shown to foster a sense of presence and improve peripheral information processing, but it also increases the risk of VR sickness. This article investigates the impact of a wider but inhomogeneous FoV on perceived plausibility, measuring its effects on presence, spatial presence, and VR sickness as a comparison to, and replication of, effects from prior work. We developed a low-resolution peripheral display extension to pragmatically increase the FoV, taking into account the lower peripheral acuity of the human eye. While this design results in inhomogeneous resolution at the display edges, it is also a low-complexity, low-cost extension. However, its effects on important VR qualities had to be identified. We conducted two experiments with 30 and 27 participants, respectively. In a randomized 2x3 within-subject design, participants played three rounds of bowling in VR, both with and without the display extension. Two rounds contained incongruencies to induce breaks in plausibility. In experiment 2, we enhanced one incongruency to make it more noticeable and addressed previously identified shortcomings of the display extension. However, neither study found a measurable effect of the low-resolution FoV extension on perceived plausibility, presence, spatial presence, or VR sickness. We found that one of the incongruencies could cause a break in plausibility without the extension, confirming the results of a previous study.
2023
Larissa Brübach, Franziska Westermeier, Carolin Wienrich, Marc Erich Latoschik,
A Systematic Evaluation of Incongruencies and Their Influence on Plausibility in Virtual Reality. In 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 894-901. IEEE Computer Society, 2023.
@inproceedings{brubach2023systematic,
title = {A Systematic Evaluation of Incongruencies and Their Influence on Plausibility in Virtual Reality},
author = {Brübach, Larissa and Westermeier, Franziska and Wienrich, Carolin and Latoschik, Marc Erich},
booktitle = {2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
year = {2023},
pages = {894-901},
publisher = {IEEE Computer Society},
url = {https://downloads.hci.informatik.uni-wuerzburg.de/2023-ismar-bruebach-a-systematic-evaluation-of-incongruencies-preprint.pdf},
doi = {10.1109/ISMAR59233.2023.00105}
}
Abstract: Currently, there is an ongoing debate about the influencing factors of one's extended reality (XR) experience. Plausibility, congruence, and their role have recently gained more and more attention. One of the latest models to describe XR experiences, the Congruence and Plausibility model (CaP), puts plausibility and congruence right at the center. However, it is unclear what influence they have on the overall XR experience and what influences our perceived plausibility rating. In this paper, we implemented four different incongruencies within a virtual reality scene, using breaks in plausibility as an analogy to breaks in presence. These manipulations were located on either the cognitive or the perceptual layer of the CaP model, and were either connected to the task at hand or not. We tested these manipulations in a virtual bowling environment to determine their influence. Our results show that manipulations connected to the task caused lower perceived plausibility. Additionally, cognitive manipulations seem to have a larger influence than perceptual ones. We were able to cause a break in plausibility with one of our incongruencies. These results give a first direction for how the influence of plausibility in XR can be systematically investigated in the future.
Franziska Westermeier, Larissa Brübach, Carolin Wienrich, Marc Erich Latoschik,
A Virtualized Augmented Reality Simulation for Exploring Perceptual Incongruencies. In Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology. New York, NY, USA: Association for Computing Machinery, 2023.
@inproceedings{westermeier2023virtualized,
title = {A Virtualized Augmented Reality Simulation for Exploring Perceptual Incongruencies},
author = {Westermeier, Franziska and Brübach, Larissa and Wienrich, Carolin and Latoschik, Marc Erich},
booktitle = {Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology},
year = {2023},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3611659.3617227},
doi = {10.1145/3611659.3617227}
}
Abstract: When blending virtual and physical content, certain incongruencies emerge from hardware limitations, inaccurate tracking, or the different appearances of virtual and physical content. They prevent us from perceiving virtual and physical content as one experience. Hence, it is crucial to investigate these issues to determine how they influence our experience. We present a virtualized augmented reality simulation that can systematically examine individual incongruencies or different configurations of them.
Franziska Westermeier, Larissa Brübach, Marc Erich Latoschik, Carolin Wienrich,
Exploring Plausibility and Presence in Mixed Reality Experiences. In IEEE Transactions on Visualization and Computer Graphics, Vol. 29 (5), pp. 2680-2689. 2023.
IEEE VR Best Paper Nominee 🏆
@article{westermeier2023exploring,
title = {Exploring Plausibility and Presence in Mixed Reality Experiences},
author = {Westermeier, Franziska and Brübach, Larissa and Latoschik, Marc Erich and Wienrich, Carolin},
journal = {IEEE Transactions on Visualization and Computer Graphics},
year = {2023},
volume = {29},
number = {5},
pages = {2680-2689},
note = {IEEE VR Best Paper Nominee 🏆},
url = {https://ieeexplore.ieee.org/document/10049710},
doi = {10.1109/TVCG.2023.3247046}
}
Abstract: Mixed Reality (MR) applications along Milgram's Reality-Virtuality (RV) continuum motivated a number of recent theories on potential constructs and factors describing MR experiences. This paper investigates the impact of incongruencies on the sensation/perception and cognition layers to provoke breaks in plausibility, and the effects these breaks have on spatial and overall presence as prominent constructs of Virtual Reality (VR). We developed a simulated maintenance application to test virtual electrical devices. Participants performed test operations on these devices in a counterbalanced, randomized 2x2 between-subject design, in either VR, as the congruent condition, or Augmented Reality (AR), as the incongruent condition on the sensation/perception layer. Cognitive incongruency was induced by the absence of traceable power outages, decoupling perceived cause and effect after activating potentially defective devices. Our results indicate significant differences in the plausibility ratings between the VR and AR conditions, hence between the congruent and incongruent conditions on the sensation/perception layer. In addition, spatial presence revealed a comparable interaction pattern across the VR and AR conditions. Both factors decreased for the AR condition (incongruent sensation/perception) compared to VR (congruent sensation/perception) in the congruent cognitive case but increased in the incongruent cognitive case. The results are discussed and put into perspective in the scope of recent theories of MR experiences.
2022
Larissa Brübach, Franziska Westermeier, Carolin Wienrich, Marc Erich Latoschik,
Breaking Plausibility Without Breaking Presence - Evidence For The Multi-Layer Nature Of Plausibility. In IEEE Transactions on Visualization and Computer Graphics, Vol. 28 (5), pp. 2267-2276. 2022.
IEEE VR Best Journal Paper Nominee 🏆
@article{brubach2022breaking,
title = {Breaking Plausibility Without Breaking Presence - Evidence For The Multi-Layer Nature Of Plausibility},
author = {Brübach, Larissa and Westermeier, Franziska and Wienrich, Carolin and Latoschik, Marc Erich},
journal = {IEEE Transactions on Visualization and Computer Graphics},
year = {2022},
volume = {28},
number = {5},
pages = {2267-2276},
note = {IEEE VR Best Journal Paper Nominee 🏆},
url = {https://downloads.hci.informatik.uni-wuerzburg.de/2022-ieeevr-breaking-plausibility-without-breaking-presence.pdf},
doi = {10.1109/TVCG.2022.3150496}
}
Abstract: A novel theoretical model recently introduced coherence and plausibility as the essential conditions of XR experiences, challenging contemporary presence-oriented concepts. This article reports on two experiments validating this model, which assumes coherence activation on three layers (cognition, perception, and sensation) as the potential sources leading to a condition of plausibility and from there to other XR qualia such as presence or body ownership. The experiments introduce and utilize breaks in plausibility (in analogy to breaks in presence): we induce incoherence on the perceptual and the cognitive layer simultaneously by simulating object behaviors that do not conform to the laws of physics, i.e., gravity. We show that this manipulation breaks plausibility, confirming that it results in the desired effects in the theorized condition space, but that the breaks in plausibility did not affect presence. In addition, we show that a cognitive manipulation by a storyline framing is too weak to successfully counteract the strong bottom-up inconsistencies. Both results are in line with the predictions of the recently introduced three-layer model of coherence and plausibility, which incorporates well-known top-down and bottom-up rivalries and theorizes an increased independence between plausibility and presence.
2020
Elisabeth Ganal, Andrea Bartl, Franziska Westermeier, Daniel Roth, Marc Erich Latoschik,
Developing a Study Design on the Effects of Different Motion Tracking Approaches on the User Embodiment in Virtual Reality. In Mensch und Computer 2020, C. Hansen, A. Nürnberger, B. Preim (Eds.). Gesellschaft für Informatik e.V., 2020.
@inproceedings{ganal2020developing,
title = {Developing a Study Design on the Effects of Different Motion Tracking Approaches on the User Embodiment in Virtual Reality},
author = {Ganal, Elisabeth and Bartl, Andrea and Westermeier, Franziska and Roth, Daniel and Latoschik, Marc Erich},
editor = {Hansen, C. and Nürnberger, A. and Preim, B.},
booktitle = {Mensch und Computer 2020},
year = {2020},
publisher = {Gesellschaft für Informatik e.V.},
url = {https://dl.gi.de/bitstream/handle/20.500.12116/33557/muc2020-ws-341.pdf?sequence=1&isAllowed=y},
doi = {10.18420/MUC2020-WS134-341}
}
Daniel Schlör, Albin Zehe, Konstantin Kobs, Blerta Veseli, Franziska Westermeier, Larissa Brübach, Daniel Roth, Marc Erich Latoschik, Andreas Hotho,
Improving Sentiment Analysis with Biofeedback Data. In Proceedings of the Workshop on peOple in laNguage, vIsiOn and the miNd (ONION). 2020.
@inproceedings{schlor2020improving,
title = {Improving Sentiment Analysis with Biofeedback Data},
author = {Schlör, Daniel and Zehe, Albin and Kobs, Konstantin and Veseli, Blerta and Westermeier, Franziska and Brübach, Larissa and Roth, Daniel and Latoschik, Marc Erich and Hotho, Andreas},
booktitle = {Proceedings of the Workshop on peOple in laNguage, vIsiOn and the miNd (ONION)},
year = {2020},
url = {https://downloads.hci.informatik.uni-wuerzburg.de/2018-ieeevr-lugrin-vr-teacher-training/2020-onion-sentiment-eeg-preprint.pdf}
}
Abstract: Humans are frequently able to read and interpret the emotions of others by directly taking into account verbal and non-verbal signals in human-to-human communication, or by inferring or even experiencing emotions from mediated stories. For computers, however, emotion recognition is a complex problem: thoughts and feelings are the roots of many behavioural responses, and they are deeply entangled with neurophysiological changes within humans. As such, emotions are very subjective, often expressed in a subtle manner, and highly dependent on context. For example, machine learning approaches for text-based sentiment analysis often rely on incorporating sentiment lexicons or language models to capture contextual meaning. This paper explores if and how we can further enhance sentiment analysis using biofeedback from humans who are experiencing emotions while reading texts. Specifically, we record the heart rate and brain waves of readers who are presented with short texts annotated with the emotions they induce. We use these physiological signals to improve the performance of a lexicon-based sentiment classifier. We find that the combination of several biosignals can improve the ability of a text-based classifier to detect the presence of a sentiment in a text on a per-sentence level.
2019
Daniel Roth, Larissa Brübach, Franziska Westermeier, Christian Schell, Tobias Feigl, Marc Erich Latoschik,
A Social Interaction Interface Supporting Affective Augmentation Based on Neuronal Data. In Symposium on Spatial User Interaction (SUI '19), October 19-20, 2019, New Orleans, LA, USA. 2019.
@inproceedings{roth2019toappearsocial,
title = {A Social Interaction Interface Supporting Affective Augmentation Based on Neuronal Data},
author = {Roth, Daniel and Brübach, Larissa and Westermeier, Franziska and Schell, Christian and Feigl, Tobias and Latoschik, Marc Erich},
booktitle = {Symposium on Spatial User Interaction (SUI '19), October 19--20, 2019, New Orleans, LA, USA},
year = {2019},
volume = {SUI '19},
doi = {10.1145/3357251.3360018}
}
Abstract: In this demonstration we present a prototype for an avatar-mediated social interaction interface that supports the replication of head and eye movements in distributed virtual environments. In addition to the retargeting of these natural behaviors, the system is capable of augmenting the interaction based on the visual presentation of affective states. We derive those states using neuronal data captured by electroencephalographic (EEG) sensing in combination with a machine-learning-driven classification of emotional states.
Daniel Roth, Franziska Westermeier, Larissa Brübach, Tobias Feigl, Christian Schell, Marc Erich Latoschik,
Brain 2 Communicate: EEG-based Affect Recognition to Augment Virtual Social Interactions. In Mensch und Computer 2019 - Workshopband. Bonn: Gesellschaft für Informatik e.V., 2019.
@conference{roth2019toappearbrain,
title = {Brain 2 Communicate: EEG-based Affect Recognition to Augment Virtual Social Interactions},
author = {Roth, Daniel and Westermeier, Franziska and Brübach, Larissa and Feigl, Tobias and Schell, Christian and Latoschik, Marc Erich},
booktitle = {Mensch und Computer 2019 - Workshopband},
year = {2019},
publisher = {Gesellschaft für Informatik e.V.},
address = {Bonn},
url = {https://dl.gi.de/handle/20.500.12116/25205},
doi = {10.18420/muc2019-ws-571}
}