Compensation for intersensory discomfort in real time
Principal Investigators
Prof. Dr. Albrecht Schmidt, Ludwig Maximilian University of Munich
Prof. Dr.-Ing. Katrin Wolf, Berlin University of Applied Sciences and Technology (BHT)
Sensory illusions are the basis for believable mixed reality (MR) experiences. Visual illusions, such as the perception of motion in animated images, are already well researched. The illusions essential for a more convincing fusion of physical and virtual impressions, however, often work poorly and cause discomfort in many people. Multimodal illusions arise when sensory modalities deliver conflicting information and one modality overrides another, so that the overall impression appears consistent. To date, there has been little research into how multimodal illusions can create a convincing, discomfort-free mixed reality experience. Several research projects, including our own previous work, have demonstrated the feasibility of engineering such illusions from various individual phenomena.

In this project, we want to investigate multimodal integration in mixed reality systematically. The central aspect is the discomfort many people experience in MR, known as cybersickness. When multisensory information is incoherent, people react to it physically, for example by feeling unwell. Motion sickness and cybersickness are phenomena in which the discrepancy between felt and seen movement cannot be integrated into a coherent percept. Our aim is to use physiological measurements to detect the onset of such a discrepancy before people feel unwell; if this succeeds, we can correct the discrepancy and sustain a working illusion in mixed reality. Here we build on research into the conditions under which intersensory integration can be realized technically, as a basis for systematic conceptual models of which combinations of sensory information produce which sensory illusions. We extend existing static models with physiological measurements to overcome the intra- and interpersonal differences inherent in purely cognitive models (a sketch of such a static model follows below).

Our vision is to lay the scientific foundations for a new generation of MR systems and applications that detect and counteract the breakdown of an illusion. If we can measure in real time when a user of an interactive system can no longer integrate multisensory information, the system can adjust its multisensory output and avoid discomfort, for example by adapting the visual scene as soon as it detects that an illusion no longer works. This would make MR technologies usable for more people and applications, thereby establishing a novel interaction paradigm: real-time compensation of intersensory discomfort.
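For illustration, one established family of static integration models that such work could extend is maximum-likelihood cue combination (e.g., Ernst and Banks, 2002), in which each modality's estimate is weighted by its reliability, i.e., the inverse of its noise variance. The following is a minimal sketch of that textbook model, not code from this project; the function name and the example numbers are our own illustrative assumptions.

```python
import numpy as np

def mle_integrate(estimates, variances):
    """Maximum-likelihood cue combination.

    Each sensory estimate is weighted by its reliability (inverse
    variance); the fused estimate then has lower variance than any
    single cue. Inputs are parallel sequences of per-modality
    estimates and noise variances.
    """
    estimates = np.asarray(estimates, dtype=float)
    reliabilities = 1.0 / np.asarray(variances, dtype=float)
    weights = reliabilities / reliabilities.sum()
    fused = float(np.dot(weights, estimates))
    fused_variance = 1.0 / reliabilities.sum()
    return fused, fused_variance

# Example: vision reports 10 deg of self-rotation with low noise, the
# vestibular sense reports 2 deg with high noise; the fused percept is
# pulled toward the more reliable visual cue -- the kind of visual
# dominance that underlies many multimodal illusions.
percept, var = mle_integrate(estimates=[10.0, 2.0], variances=[1.0, 16.0])
print(percept, var)  # ~9.53 deg, variance ~0.94
```

Such models are static: the variances are fixed per modality and person, which is exactly the limitation the project's physiological extension targets.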
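The envisioned real-time compensation can be read as a closed control loop: monitor a physiological signal for deviations from a per-user baseline and damp the offending output channel before discomfort becomes conscious. The sketch below is purely hypothetical; the choice of electrodermal activity as the signal, the z-score threshold, and the gain policy are illustrative assumptions, not results or methods of this project.

```python
from collections import deque

class DiscomfortCompensator:
    """Hypothetical closed-loop sketch: watch a streamed physiological
    signal (here assumed to be electrodermal activity) for deviations
    from a rolling per-user baseline, and scale back the visual motion
    gain when a deviation suggests the onset of sensory conflict.
    """

    def __init__(self, baseline_window=120, z_threshold=2.5):
        self.samples = deque(maxlen=baseline_window)
        self.z_threshold = z_threshold
        self.visual_gain = 1.0  # 1.0 = render the nominal scene motion

    def update(self, sample):
        """Feed one physiological sample; return the visual gain to apply."""
        self.samples.append(sample)
        if len(self.samples) < self.samples.maxlen:
            return self.visual_gain  # still collecting a baseline
        mean = sum(self.samples) / len(self.samples)
        var = sum((s - mean) ** 2 for s in self.samples) / len(self.samples)
        std = var ** 0.5 or 1e-9  # guard against a flat baseline
        z = (sample - mean) / std
        if z > self.z_threshold:
            # Suspected onset of sensory conflict: damp virtual motion.
            self.visual_gain = max(0.5, self.visual_gain - 0.05)
        else:
            # Signal near baseline: slowly restore full motion.
            self.visual_gain = min(1.0, self.visual_gain + 0.01)
        return self.visual_gain

# Usage: a stable baseline followed by a sudden spike lowers the gain.
comp = DiscomfortCompensator(baseline_window=5, z_threshold=1.5)
for eda in [0.30, 0.31, 0.29, 0.30, 0.31, 0.80]:
    gain = comp.update(eda)
print(gain)  # drops below 1.0 once the spike is detected
```

In a real system, the simple z-score would be replaced by the project's physiologically grounded models, and the adjusted output could be any modality, not only the visual scene.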