Multimodal Interactions
Combining vision, audition, olfaction, and touch allows the brain to form a unified representation of the outside world that goes beyond a simple sum of the individual sensory modalities. Because signals from the different senses reach the brain with different external and internal delays, the nervous system must bring them into spatio-temporal register. The subtheme Multimodal Interactions studies a whole range of such interactions in both biological and man-made systems. Comparing the strategies used to combine information from different modalities (including strategies for dealing with the respective uncertainties and potential cross-modal inconsistencies) helps to reveal general principles underlying neuronal representations of space-time. These investigations profit substantially from the newly established Bernstein Virtual Reality Facility, with its two parallel setups for rodent neurophysiology and human psychophysics.
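A standard textbook model for combining modalities according to their respective uncertainties is reliability-weighted (maximum-likelihood) cue combination, in which each unimodal estimate is weighted by its inverse variance. The Python sketch below is only an illustration of that general principle; the function name and the numbers are hypothetical and do not describe any specific model used in the subtheme's projects.

```python
import numpy as np

def fuse_cues(estimates, variances):
    """Reliability-weighted (maximum-likelihood) fusion of unimodal estimates.

    Each cue is weighted by its inverse variance; the fused estimate has a
    lower variance than any single cue. Purely illustrative.
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = (1.0 / variances) / np.sum(1.0 / variances)
    fused_estimate = np.sum(weights * estimates)
    fused_variance = 1.0 / np.sum(1.0 / variances)
    return fused_estimate, fused_variance

# Hypothetical example: visual and haptic estimates of an object's
# position (arbitrary units), with the visual cue being more reliable.
pos, var = fuse_cues(estimates=[10.0, 12.0], variances=[1.0, 4.0])
# pos = 10.4 lies closer to the more reliable (visual) cue, and
# var = 0.8 is smaller than either unimodal variance.
```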
Related Projects