Real-time Intersensory Discomfort Compensation

Funded by the DFG (Priority Program SPP 2199)
Duration: 2024-2027

Principal Investigators
Prof. Dr.-Ing. Katrin Wolf, Berlin University of Applied Sciences and Technology
Prof. Dr. Albrecht Schmidt, LMU Munich

Sensory illusions are the basis for believable mixed reality experiences. Visual illusions, such as animated images, have already been well researched. However, the multimodal illusions that are essential for a more realistic fusion of physical and virtual impressions often work poorly and cause discomfort in many people. Multimodal illusions occur when sensory modalities provide conflicting information and one modality overrides another so that the overall impression appears consistent. To date, there has been little research into how multimodal illusions can create a convincing and discomfort-free mixed reality experience. Several research projects, including our own previous work, have demonstrated the feasibility of engineering such illusions using various individual phenomena.

In this project, we systematically investigate multimodal integration in mixed reality. The central phenomenon is the discomfort users experience in mixed reality, known as cybersickness. When multisensory information is incoherent, people react to it, for example with discomfort. Motion sickness and cybersickness are phenomena in which the discrepancy between felt and seen movement cannot be integrated into a coherent percept. We aim to use physiological measurements to detect the onset of such a discrepancy before people feel unwell. If this is possible, the discrepancy could be corrected to restore a working illusion in mixed reality.

We build on research that has investigated the conditions under which intersensory integration can be technically realized. This serves as a basis for systematically deriving conceptual models of which combinations of sensory information can create which sensory illusions. We extend existing static models with physiological measures of perception to overcome the intra- and interpersonal differences inherent in cognitive models.

Our vision is to lay the scientific foundations for a new generation of mixed reality systems and applications that can detect and counteract the breakdown of an illusion. If we can measure in real time when a user of an interactive system can no longer integrate multisensory information, the system can adjust its multisensory output and avoid discomfort; for example, it could adapt the visual scene as soon as it detects that an illusion no longer works. This would make MR technologies usable for more people and applications and establish a novel interaction paradigm: real-time intersensory discomfort compensation.
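To make the envisioned closed loop concrete, the following minimal Python sketch shows one possible structure: a physiological signal is monitored against a rolling baseline, and when it deviates strongly, the visual motion gain is damped. All names (read_eda_sample, DiscomfortMonitor, compensate), the choice of electrodermal activity as the signal, and the window and threshold values are hypothetical illustrations; which signals and detection models actually predict the onset of cybersickness is precisely the research question of this project. The point of the sketch is the loop structure (sense, detect, compensate at frame rate), not the specific detector.

import random
from collections import deque


def read_eda_sample() -> float:
    """Hypothetical sensor stub: returns one electrodermal-activity
    reading in microsiemens. A real system would stream data from a
    wearable sensor instead."""
    return random.gauss(2.0, 0.15)


class DiscomfortMonitor:
    """Rolling z-score detector over a physiological signal. Window
    size and threshold are illustrative placeholders, not values
    validated by this project."""

    def __init__(self, window: int = 120, threshold: float = 3.0):
        self.samples = deque(maxlen=window)
        self.threshold = threshold

    def update(self, value: float) -> bool:
        """Return True when the new sample deviates strongly from the
        recent baseline, taken here as a proxy for the onset of a
        sensory conflict."""
        if len(self.samples) >= 10:
            mean = sum(self.samples) / len(self.samples)
            var = sum((s - mean) ** 2 for s in self.samples) / len(self.samples)
            std = var ** 0.5 or 1e-9  # guard against a zero baseline
            deviates = abs(value - mean) / std > self.threshold
        else:
            deviates = False  # not enough baseline data yet
        self.samples.append(value)
        return deviates


def compensate(visual_motion_gain: float) -> float:
    """Illustrative countermeasure: damp the virtual camera motion so
    that seen and felt movement diverge less."""
    return max(0.5, visual_motion_gain * 0.9)


monitor = DiscomfortMonitor()
gain = 1.0
for _ in range(600):  # roughly 10 s of a hypothetical 60 Hz render loop
    if monitor.update(read_eda_sample()):
        gain = compensate(gain)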