
Continual Learning: Definitions, Benchmarks, and Approaches


If you have a question about this talk, please contact jg801.

Continual learning is a field that brings together many areas of research, including online learning, transfer learning, multi-task learning, meta-learning and few-shot learning. In broad terms, in continual learning we see data sequentially (we are not allowed to revisit old data), and we want good performance across all tasks observed so far. This general learning setting is one step towards bridging the gap between machine and human learning. One particular challenge for modern continual learning is the tendency of neural networks to catastrophically forget previously learned tasks when trained on new data. Despite much recent interest in continual learning, a precise definition of the field remains elusive; this is reflected in how recent papers tend to introduce their own desiderata and benchmarks to show how well their own approach performs.
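The sequential protocol described above can be sketched with a deliberately simple toy: a one-parameter linear model fitted to tasks one at a time, with no access to past data, and evaluated on every task seen so far. All names here are illustrative, not from any of the cited papers; refitting fully to each new task plays the role of "catastrophic forgetting" in miniature.

```python
# Toy sketch of the continual-learning protocol (illustrative only):
# train on tasks sequentially, never revisiting old data, and evaluate
# on all tasks observed so far.

import numpy as np

def make_task(slope, n=50, seed=0):
    """A noiseless linear regression task: y = slope * x."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, n)
    return x, slope * x

def fit_slope(x, y):
    """Closed-form least squares for the one-parameter model y = w * x."""
    return float(x @ y / (x @ x))

def mse(w, x, y):
    return float(np.mean((w * x - y) ** 2))

tasks = [make_task(slope=1.0, seed=0), make_task(slope=-1.0, seed=1)]

w = 0.0
for t, (x, y) in enumerate(tasks):
    w = fit_slope(x, y)        # train only on the current task's data
    for s in range(t + 1):     # evaluate on every task seen so far
        xs, ys = tasks[s]
        print(f"after task {t}: MSE on task {s} = {mse(w, xs, ys):.3f}")
```

After the second task the model fits that task perfectly but its error on the first task is large: the naive learner has forgotten it, which is precisely the failure mode the approaches below try to mitigate.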

In this talk, we will focus on bringing together ideas from many papers in order to motivate and assemble a list of desiderata that characterises continual learning. We will then critically examine the benchmarks currently used in continual learning: many of them hide potential flaws by testing only a few of continual learning’s desiderata. Finally, we will lay out the space of current continual learning approaches, and look at a few of the state-of-the-art methods [1, 2, 3, 4].
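As a concrete flavour of one family of approaches, regularisation-based methods such as elastic weight consolidation [2] penalise movement away from parameters that were important for earlier tasks, weighting each parameter by a Fisher-information estimate. The sketch below shows only the quadratic penalty term itself, with made-up numbers; it is not an implementation of the full method.

```python
# Sketch of the EWC-style quadratic penalty from [2] (illustrative values):
# penalty = (lam / 2) * sum_i F_i * (theta_i - theta_star_i)^2,
# where theta_star are the parameters learned on the old task and F_i is
# an estimate of how important parameter i was for that task.

import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    return 0.5 * lam * float(np.sum(fisher * (theta - theta_star) ** 2))

theta_star = np.array([1.0, -2.0])   # parameters after the old task
fisher = np.array([10.0, 0.1])       # first parameter mattered much more
theta = np.array([1.5, -1.0])        # current parameters on the new task

# Moving the important parameter is penalised far more heavily than
# moving the unimportant one, so optimisation on the new task is steered
# away from overwriting what the old task needed.
print(ewc_penalty(theta, theta_star, fisher))
```

Here shifting the high-Fisher parameter by 0.5 contributes far more to the penalty than shifting the low-Fisher one by 1.0, which is the mechanism by which the method trades off new-task loss against forgetting.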

[1] C. V. Nguyen, Y. Li, T. D. Bui, and R. E. Turner. “Variational continual learning”, ICLR 2018.

[2] J. Kirkpatrick et al. “Overcoming catastrophic forgetting in neural networks”, Proceedings of the National Academy of Sciences, 114(13):3521–3526, 2017.

[3] A. A. Rusu et al. “Progressive neural networks”, arXiv preprint arXiv:1606.04671, 2016.

[4] J. Schwarz et al. “Progress & compress: A scalable framework for continual learning”, ICML 2018.

This talk is part of the Machine Learning Reading Group @ CUED series.
