Project Details
Interactive Locomotion User Interfaces for Real Walking through Virtual Worlds - From Perception to Application
Subject Area
Human Factors, Ergonomics, Human-Machine Systems
Term
from 2009 to 2023
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 137297816
The main objective of the iLUI project is to allow users to explore an immersive virtual environment (IVE) without limits by walking in a confined physical space, so that they receive the same benefits for spatial perception known from natural walking in the real world. The project builds directly upon results from our previous LOCUI project. From the limitations of the techniques we developed during that project, we identified important challenges and enormous potential for incorporating novel perceptually inspired methods to achieve the goal of natural walking through virtual worlds. iLUI puts the user at the center of the approach, concentrating on their ability to adapt to the IVE as a naturalistic situation, their strategies for visual exploration using eye movements, and the view of their body as part of the virtual experience. In particular, to achieve the goals and contributions of the iLUI project, we will:

1. Analyze adaptation to increase the range of gain manipulations for redirected walking (RDW) by reducing users' sensitivity to the manipulations these techniques introduce. We will determine the time course and strength of adaptation, investigate whether users can be adapted to a specific gain range, and test for retention of adaptation effects across sessions.

2. Develop gaze-based RDW techniques by coupling peripheral visual filters to the user's gaze, and evaluate novel gaze-based manipulations such as saccadic suppression, motion blur, and depth of field. These will further extend the detection thresholds up to which humans can be guided imperceptibly along physical trajectories that produce different sensory and motor signals than the visual stimuli suggest.

3. Develop feet-based RDW techniques to mask visual mismatches that may result from RDW techniques in which the virtual camera is manipulated but the user's body is not. With the rise of novel large-field-of-view HMDs, such mismatches between floor and feet become visible much more often than in setups with smaller fields of view. We will evaluate approaches to mask conflicting body feedback by manipulating the virtual position and/or the display around critical body parts. We are confident that with these novel approaches we will be able to significantly decrease the space requirements of RDW, so that virtual omni-directional walking through arbitrary VEs becomes possible even when only a small physical walking space is available, such as in a CAVE or a smaller lab environment.

4. Implement adaptive controllers and algorithms for RDW that incorporate the novel approaches addressed in the iLUI project, and supply a simulator software environment to support testing and comparing different algorithms. These extensions will be integrated into the V-R-GO library.
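The gain manipulations of item 1 can be illustrated with a minimal sketch. This is not the project's implementation: the function name and the threshold values are assumptions for illustration only (published RDW studies report rotation-gain detection thresholds roughly in this range, but the project's own measured thresholds may differ).

```python
# Illustrative sketch of a rotation-gain manipulation for redirected
# walking (RDW). GAIN_MIN/GAIN_MAX are placeholder detection thresholds,
# not values measured in this project.
GAIN_MIN, GAIN_MAX = 0.67, 1.24  # hypothetical imperceptibility range

def redirected_yaw(real_yaw_delta_deg, gain):
    """Map a real head rotation to a virtual one via a rotation gain.

    With gain > 1 the virtual scene rotates faster than the head, so the
    user physically under-rotates to reach a target virtual heading;
    with gain < 1 they physically over-rotate. Keeping the gain inside
    the detection-threshold range keeps the manipulation imperceptible.
    """
    gain = max(GAIN_MIN, min(GAIN_MAX, gain))  # clamp to the assumed range
    return real_yaw_delta_deg * gain

# A 90-degree physical head turn with gain 1.2 yields a larger virtual
# turn, so less physical rotation is needed per virtual rotation.
virtual_turn = redirected_yaw(90.0, 1.2)
```

Adaptation experiments (item 1) would then probe how the usable `GAIN_MIN`/`GAIN_MAX` window widens after exposure.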
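The simulator environment of item 4 could, at its core, integrate real and virtual trajectories under candidate gain settings. The following is a minimal sketch under simple kinematic assumptions; all names are hypothetical and do not reflect the V-R-GO API.

```python
import math

# Minimal sketch of a per-frame RDW simulation step: the user takes
# fixed-length steps while turning a constant amount per step, and the
# controller applies rotation and translation gains to the virtual pose.
def simulate_walk(steps, step_len, real_turn_deg, rot_gain, trans_gain):
    """Integrate a walk; return final (real, virtual) 2D positions."""
    real, virt = [0.0, 0.0], [0.0, 0.0]
    real_heading, virt_heading = 0.0, 0.0
    for _ in range(steps):
        # Physical rotation is scaled by rot_gain in the virtual world.
        real_heading += math.radians(real_turn_deg)
        virt_heading += math.radians(real_turn_deg) * rot_gain
        # Physical step length is scaled by trans_gain in the virtual world.
        real[0] += step_len * math.cos(real_heading)
        real[1] += step_len * math.sin(real_heading)
        virt[0] += step_len * trans_gain * math.cos(virt_heading)
        virt[1] += step_len * trans_gain * math.sin(virt_heading)
    return real, virt
```

Running such a step function over many frames lets different controllers be compared by, e.g., the physical-space footprint they require for the same virtual trajectory: with `rot_gain > 1` the virtual path curves faster than the physical one, which is the basic mechanism by which RDW compresses the physical walking area.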
DFG Programme
Research Grants
International Connection
China (Hong Kong), France, USA
Cooperation Partners
Professor Eric Hodgson, Ph.D.; Professor Dr. Victoria Interrante; Dr. Anatole Lecuyer; Professor Dr. Li Li; Professor Dr. Tabitha Peck; Professor Mary Whitton
Co-Investigators
Professor Dr. Gerd Bruder; Professor Dr. Volker Franz; Dr. Harald Frenz