Talk Cancelled: Less is more
If you have a question about this talk, please contact Jack Atkinson.

Please note that the speaker is no longer available on this date, so the talk has been cancelled. We will look at arranging an alternative.

Occam's razor states that "Plurality is never to be posited without necessity." We begin by examining how small a neural network can be and still distinguish the digits in the MNIST data set. For continuous problems, the Universal Approximation Theorems tell us that, for any function and any criterion of closeness, there exists a neural network with enough neurons to approximate the function to within that criterion. However, is this desirable? Simpler systems facilitate human insight. We look at several challenges in data approximation, known in machine learning as inference, and introduce a system that builds up complexity only when it is necessary.
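As a rough illustration of the opening question, the sketch below (an assumption on our part, not code from the talk) trains a deliberately tiny fully connected network on MNIST using PyTorch and torchvision; the hidden width of 16 is an arbitrary choice made for illustration.

```python
import torch
import torch.nn as nn
from torchvision import datasets, transforms

# Deliberately small MLP: 784 -> 16 -> 10, about 12,730 parameters.
# The width 16 is an illustrative assumption, not a figure from the talk.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 16),
    nn.ReLU(),
    nn.Linear(16, 10),
)

train_data = datasets.MNIST("data", train=True, download=True,
                            transform=transforms.ToTensor())
train_loader = torch.utils.data.DataLoader(train_data, batch_size=128,
                                           shuffle=True)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# A few epochs of standard supervised training.
for epoch in range(3):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

# Measure test accuracy to see whether this little capacity suffices.
test_data = datasets.MNIST("data", train=False, download=True,
                           transform=transforms.ToTensor())
test_loader = torch.utils.data.DataLoader(test_data, batch_size=1000)
correct = 0
with torch.no_grad():
    for images, labels in test_loader:
        correct += (model(images).argmax(dim=1) == labels).sum().item()
print(f"test accuracy: {correct / len(test_data):.3f}")
```

Shrinking or widening the hidden layer turns the abstract's question into a direct experiment: at what size does accuracy begin to collapse?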
The Zoom link is https://cam-ac-uk.zoom.us/j/81161988457?pwd=TB5DgLyL0RLQROGBA4LC9jLnlKAh5p.1 (password: 355996).

This talk is part of the RSE Seminars series.