See what you hear - Constructing a representation of the world across the senses
If you have a question about this talk, please contact Louise White.

To form a coherent percept of the environment, the brain needs to integrate sensory signals that come from a common source and segregate those that come from different sources. Human observers have been shown to integrate sensory signals in line with Bayesian Causal Inference, taking into account the uncertainty about the world's causal structure. Over the past decade, evidence has accumulated that multisensory integration is not deferred to later processing in association cortices but begins already in primary, putatively unisensory, areas. Given this multitude of multisensory integration sites, characterizing their functional similarities and differences is of critical importance. Our research demonstrates that multisensory integration emerges in a functional hierarchy, with temporal coincidence detection in primary sensory areas, informational integration in association areas, and decisional interactions in prefrontal areas. Combining Bayesian modelling, multivariate decoding and EEG/fMRI, we show that the brain integrates sensory signals in line with Bayesian Causal Inference by simultaneously encoding multiple perceptual estimates along the cortical hierarchy. Only at the top of the hierarchy, in the anterior intraparietal sulcus, at about 300-400 ms, is the uncertainty about the world's causal structure taken into account: sensory signals are combined weighted by their sensory reliabilities and task relevance, as predicted by Bayesian Causal Inference. The intraparietal sulcus thus arbitrates between signal integration and segregation to guide behavioural choices and motor responses.

Uta Noppeney is Professor of Computational Neuroscience and Director of the Computational Neuroscience and Cognitive Robotics Centre at the University of Birmingham, UK. She received a degree in medicine (1997, Freiburg University, Germany), a doctorate in medicine (1998, Freiburg University) and a PhD in neuroscience (2004, University College London, UK). After training in neurology at the university hospitals in Aachen and Magdeburg, she conducted neuroscience research at the Wellcome Trust Centre for Neuroimaging, University College London. In 2005, she became a research group leader at the Max Planck Institute for Biological Cybernetics in Tübingen, Germany. Her group's research employs psychophysics, functional imaging (fMRI, M/EEG, TMS) and models of Bayesian inference and learning to better understand the computational operations and neural mechanisms of multisensory perception and learning.

This talk is part of the Zangwill Club series.
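For readers unfamiliar with the model named in the abstract, the sketch below illustrates the two ingredients it describes: a reliability-weighted (forced-fusion) estimate when two signals share a source, and a posterior over causal structures that arbitrates between integration and segregation. It follows the standard Bayesian Causal Inference formulation for audiovisual localisation; all parameter values and function names are illustrative and are not taken from the talk.

```python
# Minimal sketch of Bayesian Causal Inference for audiovisual localisation.
# Parameter values and names are illustrative, not drawn from the talk.
import math

def bci_estimate(x_a, x_v, sigma_a=2.0, sigma_v=1.0, sigma_p=10.0, mu_p=0.0, p_common=0.5):
    """Return the model-averaged auditory location estimate and p(common cause | x_a, x_v)."""
    var_a, var_v, var_p = sigma_a**2, sigma_v**2, sigma_p**2

    # Likelihood of the two signals under a single common source (C = 1).
    var1 = var_a * var_v + var_a * var_p + var_v * var_p
    like_c1 = math.exp(-0.5 * ((x_a - x_v)**2 * var_p
                               + (x_a - mu_p)**2 * var_v
                               + (x_v - mu_p)**2 * var_a) / var1) / (2 * math.pi * math.sqrt(var1))

    # Likelihood under two independent sources (C = 2).
    var2_a, var2_v = var_a + var_p, var_v + var_p
    like_c2 = math.exp(-0.5 * ((x_a - mu_p)**2 / var2_a
                               + (x_v - mu_p)**2 / var2_v)) / (2 * math.pi * math.sqrt(var2_a * var2_v))

    # Posterior probability of a common cause: the "uncertainty about the causal structure".
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Reliability-weighted (forced-fusion) estimate if the signals share a source.
    s_fused = (x_a / var_a + x_v / var_v + mu_p / var_p) / (1 / var_a + 1 / var_v + 1 / var_p)

    # Unisensory (segregated) auditory estimate if they do not.
    s_audio = (x_a / var_a + mu_p / var_p) / (1 / var_a + 1 / var_p)

    # Model averaging: weight each estimate by the posterior over causal structures.
    return post_c1 * s_fused + (1 - post_c1) * s_audio, post_c1


if __name__ == "__main__":
    estimate, p_c1 = bci_estimate(x_a=5.0, x_v=3.0)
    print(f"auditory estimate: {estimate:.2f} deg, p(common cause): {p_c1:.2f}")
```

Because the visual signal is more reliable here (smaller sigma_v), the fused estimate is pulled towards the visual location; as the audiovisual discrepancy grows, the posterior probability of a common cause falls and the final estimate reverts towards the unisensory auditory estimate.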