Local and global synaptic credit assignment
If you have a question about this talk, please contact Rodrigo Echeveste.
For animals to learn to interact with their environments, millions of synapses across several brain areas need to be modified appropriately. Understanding how the brain assigns credit to, and modifies, a particular synapse is a long-standing problem in neuroscience.
In this talk I will start by describing our work on a theoretical framework for how synapses may (locally) assign credit to their response parameters in order to achieve a desired target. Such target responses may, in turn, be computed by gradient-descent algorithms that compare the network output with the desired outcome. In machine learning, the backpropagation algorithm is the prime example of such a solution to the global synaptic credit assignment problem. I will finish by discussing recent work proposing that cortical microcircuits with dendritic properties may indeed approximate backpropagation. Overall, our work provides a fresh perspective on how the brain may solve the synaptic credit assignment problem.
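For readers unfamiliar with the machine-learning side of this problem, the following is a minimal illustrative sketch (not taken from the talk) of how backpropagation assigns global credit in a toy two-layer network: the output error is propagated backwards so that every synapse receives an update proportional to its contribution to that error. All names and values are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: map 2-D inputs to a scalar desired outcome.
    X = rng.normal(size=(100, 2))
    y = 0.7 * X[:, :1] - 0.3 * X[:, 1:]

    # Random initial synaptic weights.
    W1 = rng.normal(scale=0.5, size=(2, 8))
    W2 = rng.normal(scale=0.5, size=(8, 1))
    lr = 0.1

    for step in range(200):
        # Forward pass: compute the network output.
        h = np.tanh(X @ W1)                    # hidden responses
        out = h @ W2                           # network output

        # Global error: compare the output with the desired outcome.
        err = out - y

        # Backward pass: propagate the error to assign credit to each synapse.
        dW2 = h.T @ err / len(X)               # credit for output synapses
        dh = (err @ W2.T) * (1 - h ** 2)       # error signal sent to the hidden layer
        dW1 = X.T @ dh / len(X)                # credit for input synapses

        # Gradient-descent update of every synapse.
        W1 -= lr * dW1
        W2 -= lr * dW2

    print("final mean squared error:",
          float(np.mean((np.tanh(X @ W1) @ W2 - y) ** 2)))

The biological question addressed in the talk is how circuits without access to such an explicit backward pass, for example cortical microcircuits with dendritic properties, might nevertheless approximate this kind of credit assignment.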
This talk is part of the Computational Neuroscience series.