Modern Neural Networks: the Hinton Camp

If you have a question about this talk, please contact Konstantina Palla.

Historically, neural networks with multiple hidden layers were avoided because they are difficult to train: back-propagation on deep networks often yielded disappointing results, frequently worse than those obtained with shallower models. In this RCC we will present an alternative approach to training neural networks – a form of deep learning developed by Geoffrey Hinton. The main idea is to train a deep generative model layer by layer in a greedy, unsupervised fashion, exploiting a theoretical connection with Restricted Boltzmann Machines (RBMs). The pretrained network can then be augmented with an additional layer to perform classification, giving greatly improved performance over older training methods. We will focus on showing the need for deep models, describing practical algorithms, and unpacking some of the theoretical analogies used to justify the design choices.
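To make the greedy layer-wise idea concrete, here is a minimal NumPy sketch (not the talk's own code): each RBM is trained with one step of contrastive divergence (CD-1), and the hidden activations of one layer become the training data for the next. All function names, hyperparameters, and the toy layer sizes are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, epochs=50, lr=0.1, rng=None):
    """Train one RBM with CD-1 on binary data; returns (W, b_vis, b_hid)."""
    rng = rng or np.random.default_rng(0)
    n_vis = data.shape[1]
    W = 0.01 * rng.standard_normal((n_vis, n_hidden))
    b_vis = np.zeros(n_vis)
    b_hid = np.zeros(n_hidden)
    for _ in range(epochs):
        v0 = data
        # Positive phase: hidden probabilities and a sample given the data.
        h0_prob = sigmoid(v0 @ W + b_hid)
        h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)
        # Negative phase: one step of Gibbs sampling (the "1" in CD-1).
        v1_prob = sigmoid(h0 @ W.T + b_vis)
        h1_prob = sigmoid(v1_prob @ W + b_hid)
        # CD-1 parameter updates: data statistics minus reconstruction statistics.
        W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / len(data)
        b_vis += lr * (v0 - v1_prob).mean(axis=0)
        b_hid += lr * (h0_prob - h1_prob).mean(axis=0)
    return W, b_vis, b_hid

def greedy_pretrain(data, layer_sizes):
    """Stack RBMs: each layer is trained on the hidden activations of the one below."""
    params, x = [], data
    for n_hidden in layer_sizes:
        W, b_vis, b_hid = train_rbm(x, n_hidden)
        params.append((W, b_vis, b_hid))
        x = sigmoid(x @ W + b_hid)  # propagate up to feed the next RBM
    return params
```

After pretraining, the stacked weights would initialize a feed-forward network whose final classification layer is trained with ordinary supervised learning, as the abstract describes.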

This talk is part of the Machine Learning Reading Group @ CUED series.

