Neural circuit mechanisms of learning and attentional task-switching during visually-guided behaviour in mice

If you have a question about this talk, please contact Rodrigo Echeveste.

Abstract: We found that neural responses in the mouse primary visual cortex (V1) become increasingly selective for relevant visual input, by repeatedly imaging cells using 2-photon calcium imaging while mice learned a visual discrimination task (Poort et al., 2015, Neuron). However, it is unclear how learning reorganises the activity of different cell types, including excitatory pyramidal neurons and different classes of GABAergic interneurons. Although pyramidal cells provide the output from the local circuit to other cortical areas, different interneuron classes inhibit pyramidal cells as well as each other, and exert a powerful influence on circuit activity. We therefore simultaneously measured the responses of pyramidal cells and different interneuron types in V1. We find that learning leads to changes in the selectivity and co-activation patterns across multiple cell classes, and that increased stimulus-specific inhibition, especially in parvalbumin-expressing cells, can contribute to the selective processing of relevant objects (Khan et al., 2018, Nature Neuroscience).

To determine whether these changes were specific to learning, we trained the same mice to switch between a visual and an olfactory discrimination task, allowing us to compare neural responses while animals were attending to or ignoring the same visual stimuli. We found that the effects of learning and task-switching on the response selectivity of the same cells were largely uncorrelated. Learning and task-switching also differentially affected the interactions between different cell classes. These results suggest that distinct mechanisms underlie the increased discriminability of relevant sensory stimuli across longer and shorter time scales.

In recent work, we have started to extend our experiments to freely moving mice in more complex and natural environments. One challenge for visual neuroscience is that it has so far not been possible to track detailed aspects of eye and head movements in freely moving animals. We recently published a new open-source method for head-mounted video tracking in mice, which we combined with motion sensors to measure head movement and with multielectrode electrophysiological recordings in visual cortex (Meyer et al., 2018, Neuron). I will present preliminary results indicating how this method may help us understand visually-guided behaviour in freely behaving mice.

This talk is part of the Computational Neuroscience series.

© 2006-2019 Talks.cam, University of Cambridge.