
Bernstein Virtual Reality Facility

Although the perception of space and time is driven by multiple sensory modalities, its neural representation and processing are traditionally investigated separately for each sensory system. This is mainly because, until recently, multimodal stimulation was difficult to accomplish. New developments that make use of computer-generated environments help to overcome this limitation: in such virtual realities it is possible to present, for example, acoustic and visual stimuli simultaneously. Moreover, virtual reality systems overcome the spatial restrictions of typical laboratory rooms and hence enable experiments with more comprehensive and natural stimulus environments.

Funded by the BMBF, a facility for experiments with virtual reality (VR) was installed at the Bernstein Center Munich. The Bernstein Virtual Reality Facility consists of two independent setups, each customized for the generation and presentation of virtual environments to either humans or rodents. Each setup allows the presentation of a visual as well as an acoustic virtual environment that can be linked to the actual movement of the individual inside the virtual environment. An identical software layout for both setups ensures that every audio-visual virtual environment implemented for one setup can also be used in the other.
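
A minimal sketch of this shared-layout idea is given below in Python. All class and function names are illustrative assumptions, not the facility's actual software; the point is simply that the environment description is kept separate from the setup-specific rendering back-ends, so that one description can drive either setup.

# Illustrative sketch only: names are assumptions, not the facility's software.
from dataclasses import dataclass, field

@dataclass
class SoundSource:
    position: tuple   # (x, y, z) position in the VE, in metres
    wav_file: str     # sound sample emitted from that position

@dataclass
class VirtualEnvironment:
    mesh_file: str                                # visual scene geometry
    sources: list = field(default_factory=list)   # acoustic scene

def present(env: VirtualEnvironment, setup: str):
    """Dispatch one VE description to a setup-specific back-end."""
    if setup == "human":
        pass  # render to stereoscopic HMD + ambisonic loudspeaker array
    elif setup == "rodent":
        pass  # render to the rodent setup's display and audio hardware

# The same environment description runs on either setup:
corridor = VirtualEnvironment("corridor.obj",
                              [SoundSource((0.0, 5.0, 1.0), "call.wav")])
present(corridor, "human")
present(corridor, "rodent")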

Human Virtual Reality

In the human virtual reality setup, the visual virtual environment (VE) is presented via a high-resolution stereoscopic head-mounted display, while the acoustic VE is realized as an ambisonic free-field sound scene played back via a loudspeaker array. The test subjects are seated in a motorized swivel chair, which enables angular vestibular stimulation. They interact with the VE through motion tracking and a joystick mounted on the swivel chair. Additionally, a 128-channel EEG recording unit is available for use in combination with the behavioral/psychophysical experiments. All modules of the VR setup (visual, acoustic, vestibular) can be applied independently, allowing a multitude of psychophysical experiments.
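
As an illustration of how the acoustic VE can be linked to the subject's movement, the following Python sketch counter-rotates a first-order ambisonic (B-format) sound field by the tracked yaw angle, so that virtual sound sources stay fixed in the world while the listener turns on the swivel chair. The channel layout and function name are assumptions chosen for illustration, not the facility's actual audio pipeline.

import numpy as np

def rotate_bformat_yaw(frame, yaw):
    """Rotate a first-order B-format frame [W, X, Y, Z] about the vertical axis.

    yaw is the listener's tracked heading in radians; the sound field is
    rotated by -yaw so that virtual sources stay fixed in world coordinates.
    """
    w, x, y, z = frame
    c, s = np.cos(-yaw), np.sin(-yaw)
    return np.array([w, c * x - s * y, s * x + c * y, z])

# Example: the subject has turned 90 degrees to the left on the swivel chair.
# A source straight ahead in the world (pure +X) should now appear on the
# listener's right (negative Y in B-format conventions).
frame = np.array([1.0, 1.0, 0.0, 0.0])        # W, X, Y, Z
print(rotate_bformat_yaw(frame, np.pi / 2))   # -> approx. [1. 0. -1. 0.]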


Bernstein projects that used the Human VR setup

Project B3 – Auditory invariance against space-induced reverberation. Lutz Wiegrebe (LMU), Benedikt Grothe (LMU), and Christian Leibold (LMU)

Project B5 – Temporal aspects of spatial memory consolidation. Steffen Gais (LMU), Stefan Glasauer (LMU), and Christian Leibold (LMU)

Project B-T3 – Proprioceptive stabilization of the auditory world. Paul MacNeilage (LMU), Uwe Firzlaff (TUM), and Lutz Wiegrebe (LMU)

Publications

Andrea Lingner, Benedikt Grothe, Lutz Wiegrebe, Stephan D. Ewert: Binaural Glimpses at the Cocktail Party? JARO (2016). doi:10.1007/s10162-016-0575-7

Daria Genzel, Uwe Firzlaff, Lutz Wiegrebe, Paul MacNeilage: Dependence of auditory spatial updating on vestibular, proprioceptive, and efference copy signals.

Ludwig Wallmeier, Lutz Wiegrebe: Ranging in Human Sonar: Effects of Additional Early Reflections and Exploratory Head Movements. PLoS ONE 9(12): e115363. doi:10.1371/journal.pone.0115363

Ludwig Wallmeier, Lutz Wiegrebe: Self-motion facilitates echo-acoustic orientation in humans

Rodent Virtual Reality

Bernstein projects that used the Rodent VR setup

Publications

Thurley K and Ayaz A (2016) Virtual reality systems for rodents. Current Zoology 

Kautzky M and Thurley K (2016) Estimation of self-motion duration and distance in rodents. R. Soc. open sci. 3: 160118. 

Garbers C, Henke J, Leibold C, Wachtler T, Thurley K (2015) Contextual processing of brightness and color in Mongolian gerbils. Journal of Vision 15(1): 13. doi:10.1167/15.1.13

Thurley K, Henke J, Hermann J, Ludwig B, Tatarau C, Wätzig A, Herz A, Grothe B, Leibold C (2014) Mongolian gerbils learn to navigate in complex virtual spaces. Behav Brain Res 266: 161-168