Optimal integration of top-down and bottom-up uncertainty in humans, monkeys, and neural networks
If you have a question about this talk, please contact Zoubin Ghahramani.
Hearing the sound of rustling leaves in the forest might indicate the presence of a predator or just be the effect of the wind. There are two difficulties in determining the source of the sound. First, both sources can cause similar sounds, and second, the sound is corrupted by sensory noise. In this common decision paradigm, uncertainty associated with previous knowledge (top-down uncertainty) is combined with uncertainty associated with sensory information from the environment (bottom-up uncertainty). The brain's capability to make decisions under both types of uncertainty is key for survival. We studied this kind of decision-making in both humans and macaque monkeys using an orientation classification task. Stimuli were oriented patterns whose orientations were drawn from either a narrow or a wide normal distribution with the same mean. Each distribution defined a class. Top-down uncertainty stemmed from the overlap of the two distributions. Bottom-up uncertainty was manipulated through the contrast of the stimulus. The Bayes-optimal observer in this task chooses, on each trial, the class with the highest posterior probability. Computing the posterior is nontrivial because marginalization over stimulus orientation is required. The optimal strategy amounts to using a decision criterion that varies according to the trial-by-trial bottom-up uncertainty. Bayesian model comparison revealed that this model describes human and monkey data better than models with non-optimal criteria. We proceeded to construct a neural network that behaves like the optimal observer under biologically plausible forms of neural variability. Within the framework of Poisson-like population codes, we trained neural networks with different types of operations to approximate the optimal posterior over class. A network with divisive normalization operations was sufficient to perform well in this non-trivial task.
Our results demonstrate that humans, monkeys, and neural networks can optimally integrate top-down and bottom-up uncertainty.
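The optimal strategy described in the abstract can be sketched numerically. Assuming Gaussian measurement noise, marginalizing over orientation gives a class likelihood that is itself Gaussian with the orientation variance and noise variance added, and the resulting decision criterion widens as sensory noise grows. The specific standard deviations below (narrow = 3°, wide = 15°, noise levels) are illustrative assumptions, not the values used in the study:

```python
import numpy as np

def log_posterior_ratio(x, mu=0.0, sig_narrow=3.0, sig_wide=15.0, sig_noise=5.0):
    """Log posterior ratio (narrow vs. wide class) for a noisy measurement x.

    Marginalizing over orientation s ~ N(mu, sig_C^2) with measurement
    noise x | s ~ N(s, sig_noise^2) gives p(x | C) = N(x; mu, sig_C^2 + sig_noise^2).
    Equal class priors are assumed.
    """
    var_n = sig_narrow**2 + sig_noise**2
    var_w = sig_wide**2 + sig_noise**2
    ll_n = -0.5 * np.log(2 * np.pi * var_n) - (x - mu)**2 / (2 * var_n)
    ll_w = -0.5 * np.log(2 * np.pi * var_w) - (x - mu)**2 / (2 * var_w)
    return ll_n - ll_w  # > 0: report "narrow" class

def criterion(sig_noise, mu=0.0, sig_narrow=3.0, sig_wide=15.0):
    """Distance |x - mu| at which the log posterior ratio crosses zero.

    Solving ll_n = ll_w for (x - mu)^2 yields
    d^2 = log(var_w / var_n) / (1/var_n - 1/var_w).
    """
    var_n = sig_narrow**2 + sig_noise**2
    var_w = sig_wide**2 + sig_noise**2
    d2 = np.log(var_w / var_n) / (1.0 / var_n - 1.0 / var_w)
    return np.sqrt(d2)
```

Note that `criterion` grows with `sig_noise`: under high bottom-up uncertainty the two class likelihoods flatten and the optimal observer tolerates larger deviations from the mean before reporting the wide class, which is exactly the trial-by-trial criterion shift the abstract refers to.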
This talk is part of the Computational and Biological Learning Seminar Series.