
Multisensory perception and learning

Principal Supervisor: Professor Uta Noppeney - School of Psychology

Co-supervisor: Peter Tino - Computer Science

PhD project title: Multisensory perception and learning

University of Registration: University of Birmingham

Project outline:

The goal of the project is to unravel the neural mechanisms and computational operations that enable the human brain to interact effectively with its dynamic multisensory environment.

Imagine you are at a funfair. Your senses are constantly bombarded with many different signals: buzzing sounds, flashing lights, the smell of chestnuts. How does the brain make sense of this cacophony? How does it solve the puzzle of the senses?

To form a coherent and reliable percept of the world, the brain needs to integrate noisy sensory inputs coming from a common source, but segregate those from different sources. Further, intrinsic prior knowledge and knowledge gained from past experience help to interpret the incoming sensory signals. Within the cortical hierarchy, multisensory perception emerges in an interactive process in which top-down prior information constrains the interpretation of the incoming sensory signals.
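The integration-versus-segregation arbitration described above is commonly formalized as Bayesian Causal Inference. As a minimal numerical sketch of the standard model for a single audio-visual trial, the following computes the posterior probability that two location signals share a common cause, and the resulting model-averaged auditory estimate. The parameter values and function names here are illustrative assumptions, not the project's actual model code:

```python
import math

def gauss(x, mu, var):
    """Gaussian density N(x; mu, var)."""
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def causal_inference(x_v, x_a, sig_v=1.0, sig_a=2.0, sig_p=10.0, p_common=0.5):
    """One audio-visual trial of Bayesian Causal Inference (illustrative sketch).

    x_v, x_a   : noisy visual and auditory location signals
    sig_v/sig_a: sensory noise; sig_p: width of a zero-mean spatial prior
    Returns (posterior probability of a common cause,
             model-averaged estimate of the auditory source location).
    """
    vv, va, vp = sig_v ** 2, sig_a ** 2, sig_p ** 2
    # Likelihood of both signals given ONE common source
    # (the source location is integrated out analytically).
    denom = vv * va + vv * vp + va * vp
    like_c1 = (math.exp(-0.5 * ((x_v - x_a) ** 2 * vp
                                + x_v ** 2 * va + x_a ** 2 * vv) / denom)
               / (2 * math.pi * math.sqrt(denom)))
    # Likelihood given TWO independent sources: product of the marginals.
    like_c2 = gauss(x_v, 0.0, vv + vp) * gauss(x_a, 0.0, va + vp)
    # Posterior probability that the signals share a common cause.
    post_c1 = like_c1 * p_common / (like_c1 * p_common
                                    + like_c2 * (1.0 - p_common))
    # Reliability-weighted fusion of the two signals (common cause) ...
    s_fused = (x_v / vv + x_a / va) / (1 / vv + 1 / va + 1 / vp)
    # ... versus the segregated auditory estimate (independent causes).
    s_seg_a = (x_a / va) / (1 / va + 1 / vp)
    # Model averaging: weight both estimates by the causal posterior.
    return post_c1, post_c1 * s_fused + (1.0 - post_c1) * s_seg_a
```

Nearby signals yield a high common-cause posterior, so the perceived sound location is pulled towards the more reliable visual signal (the ventriloquist effect); widely separated signals are instead segregated.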

Potential projects will address one of the following research questions:

  • Is information integration automatic, or does it depend on attention, awareness and vigilance (e.g. sleep)? Can visual signals (e.g. a flash) that we are not aware of alter how and where we perceive sounds? Can the brain integrate sensory signals, or learn new sensory associations, even while asleep?
  • What are the computational operations and neural mechanisms by which sensory signals are combined at multiple levels of the cortical hierarchy? Are information integration processes governed by distinct computational principles across the cortical hierarchy?
  • How does the brain arbitrate between information integration and segregation? How does it implement Bayesian Causal Inference at the neural systems level? How does Bayesian Causal Inference evolve dynamically within a trial at millisecond resolution?
  • How does multisensory integration dynamically adapt to the statistical structure of our dynamic environment at multiple timescales ranging from seconds (e.g. rapid contextual changes) to years (e.g. neurodevelopment, ageing)?
  • How does ageing affect information integration and segregation? Can multisensory integration and learning help to compensate for age-related changes?

To address these questions, we will employ a challenging multidisciplinary approach combining the complementary strengths of:

  • Psychophysics and signal detection models
  • Functional magnetic resonance imaging at 3T and 7T to characterize brain activations at the neural systems level and, in particular areas, at sub-millimetre spatial resolution to define layer-dependent activation profiles
  • Magneto- or electroencephalography to define the neural dynamics at millisecond resolution. We will characterize the data in terms of evoked responses and time-frequency analyses.
  • Functional perturbation approaches such as concurrent TMS-fMRI (transcranial magnetic stimulation) and tACS (transcranial alternating current stimulation) to causally manipulate the neural dynamics selectively at a particular time and/or frequency. Can we selectively tune the brain to integrate or segregate sensory signals?
  • Further, we will use advanced machine learning and signal processing approaches to characterize the response properties and the information content (e.g. advanced multivariate pattern analysis, representational similarity analyses) of individual regions and establish the functional and effective connectivity between regions (e.g. phase locking indices, Dynamic Causal Modelling).
  • To gain a more informed perspective on the underlying computational and neural mechanisms, we will combine functional imaging with statistical models of Bayesian inference and learning.
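The psychophysics and signal detection component above typically reduces behaviour to sensitivity (d') and response criterion (c) computed from hit and false-alarm counts. A minimal sketch, assuming the standard equal-variance Gaussian model and a conventional log-linear correction (neither is prescribed by the project description):

```python
from statistics import NormalDist

def sdt_indices(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and criterion (c) from trial counts.

    Uses a log-linear correction (add 0.5 to each count) so that
    perfect hit or zero false-alarm rates do not give infinite z-scores.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf          # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion
```

For example, 40 hits out of 50 signal trials against 10 false alarms out of 50 noise trials gives a d' of about 1.6 with an unbiased criterion near zero, whereas chance performance gives d' of zero.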

References:

  1. Rohe T, Noppeney U (2015) Cortical hierarchies perform Bayesian Causal Inference for multisensory perception. PLOS Biology. 13(2):e1002073.
  2. Deroy O, Spence C, Noppeney U (in press) Metacognition in multisensory perception. Trends in Cognitive Sciences.
  3. Rohe T, Noppeney U (2016) Distinct computational principles govern information integration in primary sensory and association cortices. Current Biology. 26(4):509-14. doi: 10.1016/j.cub.2015.12.056.
  4. Leitão J, Thielscher A, Tünnerhoff J, Noppeney U (2015) Concurrent TMS-fMRI reveals interactions between dorsal and ventral attentional systems. Journal of Neuroscience. 35(32):11445-57.
  5. Lee HL, Noppeney U (2014) Temporal prediction errors in auditory and visual cortices. Current Biology. 24(8):R309-10.
  6. Shen Y, Mayhew SD, Kourtzi Z, Tiňo P (2014) Spatial-temporal modelling of fMRI data through spatially regularized mixture of hidden process models. NeuroImage. 84:657-71.

BBSRC Strategic Research Priority: Molecules, cells and systems

Techniques that will be undertaken during the project:

The student will be trained in at least three of the following techniques:

  1. Functional magnetic resonance imaging (fMRI at 3T and 7T)
  2. Electro- or magnetoencephalography (EEG, MEG)
  3. Neural perturbation (i.e. stimulation) methods (TMS, tACS)
  4. Advanced statistical analyses (e.g. large-scale multivariate pattern analyses such as support vector machines)
  5. Computational Modelling: Bayesian Inference and Learning

Contact:  Professor Uta Noppeney, University of Birmingham