Sensory information must be combined across different spatial reference frames and temporal scales. Spatially, each sense encodes information in its own reference frame: visual information is coded in retinal coordinates, auditory information in head-centered coordinates, and touch tells us only where objects contact the skin. Likewise, temporal integration is not a unitary process: the brain integrates sensory information at multiple time scales, which can be measured with behavioral tasks and by examining neural correlates of integration. The goal of this project is to investigate how the brain combines information over time and across the senses, using psychophysics, EEG/MEG, and/or fMRI, in order to better understand our subjective experience of space and time.
Candidates should have demonstrated expertise in programming experiments and in analyzing data. Advanced experience with multisensory stimulation, neuroimaging (EEG/MEG or fMRI), and/or eye tracking is preferred.