
Talk Cancelled: Less is more


If you have a question about this talk, please contact Jack Atkinson.

Please note that the speaker is no longer available on this date, so the talk has been cancelled. We will look at arranging an alternative.

Occam’s razor states, “Plurality is never to be posited without necessity.” We begin by examining how small a neural network can be and still distinguish the digits in the MNIST data set. For continuous problems there are Universal Approximation Theorems: for any continuous function and any criterion of closeness, there exists a neural network with enough neurons to approximate the function that closely. However, is this desirable? Simpler systems facilitate human insight. We look at the following challenges in data approximation, also known as inference in machine learning:

  • Ill-conditioned matrices.
  • Modeling data, keeping the model simple while explaining the data adequately.
  • Choice of model space.
  • New data arriving.
  • Model updates.
  • Confidence in model predictions.
  • Informed data collection.
  • Limited computing power and up- and down-link capacity.

We then introduce a system that builds up complexity when it is necessary.
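
As a small illustration of the opening question (how few hidden units can still separate handwritten digits), here is a minimal sketch that is not part of the talk: it trains a one-hidden-layer perceptron with progressively narrower layers and reports test accuracy. It uses scikit-learn’s 8x8 digits dataset as a lightweight stand-in for MNIST, and the layer widths are illustrative assumptions only.

    # Minimal sketch (not from the talk): how few hidden units suffice
    # to separate handwritten digits? Uses scikit-learn's 8x8 digits
    # dataset as a lightweight stand-in for MNIST.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)          # 1797 samples, 64 features
    X_train, X_test, y_train, y_test = train_test_split(
        X / 16.0, y, test_size=0.25, random_state=0)

    # Shrink the single hidden layer and watch how accuracy degrades.
    for width in (64, 16, 8, 4, 2):
        clf = MLPClassifier(hidden_layer_sizes=(width,),
                            max_iter=2000, random_state=0)
        clf.fit(X_train, y_train)
        print(f"hidden units: {width:3d}  "
              f"test accuracy: {clf.score(X_test, y_test):.3f}")

The point of the exercise is the trade-off highlighted above: a wide network can approximate the labelling function well, but a much smaller one may explain the data adequately while remaining easier to interpret.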

The zoom link is https://cam-ac-uk.zoom.us/j/81161988457?pwd=TB5DgLyL0RLQROGBA4LC9jLnlKAh5p.1 (Password 355996)

This talk is part of the RSE Seminars series.
