
Natural gradient descent and variational inference


If you have a question about this talk, please contact Yingzhen Li.

We will cover the following topics, time allowing.

1) Knuth's online covariance update.
2) A short introduction to Riemannian geometry.
3) The elegant properties of the Fisher-Rao metric and the connection to the VB update.
4) The paper "Approximate Riemannian Conjugate Gradient Learning for Fixed-Form Variational Bayes" (JMLR 2010), Honkela, Raiko, Kuusela, Tornio, Karhunen.
5) The paper "Stochastic Variational Inference" (JMLR 2013), Hoffman, Blei, Wang, Paisley.
6) Time allowing: the paper "Gaussian Processes for Big Data", Hensman, Fusi, Lawrence.
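Topic 1) presumably refers to the online recurrence for the mean and covariance given by Knuth (often attributed to Welford). A minimal sketch, assuming a stream of d-dimensional samples and at least two of them:

```python
import numpy as np

def online_cov(samples):
    # Single pass over the data: each sample updates the running mean
    # and the matrix M2 of summed outer-product deviations in O(d^2),
    # without storing the data. Requires at least two samples.
    n = 0
    mean = None
    M2 = None
    for x in samples:
        x = np.asarray(x, dtype=float)
        if mean is None:
            mean = np.zeros_like(x)
            M2 = np.zeros((x.size, x.size))
        n += 1
        delta = x - mean          # deviation from the *old* mean
        mean += delta / n
        M2 += np.outer(delta, x - mean)  # old and *new* deviations
    return mean, M2 / (n - 1)     # unbiased sample covariance
```

The old/new deviation pairing is what makes the update numerically stable compared with accumulating raw sums of squares.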

There is no need to do background reading, but those wanting to get a flavour of the talk should look at the natural gradient parts of 4) and 5).
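For a flavour of the natural gradient idea itself (not material from the talk): preconditioning the ordinary gradient by the inverse Fisher information makes the update invariant to how the distribution is parameterised. A minimal sketch, fitting a Gaussian by maximum likelihood in the (mu, log sigma) parameterisation, where the Fisher matrix is available in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=1000)

def grad_avg(mu, log_sigma):
    # Average per-datapoint gradient of the Gaussian log-likelihood
    # with respect to (mu, log sigma).
    s2 = np.exp(2 * log_sigma)
    g_mu = np.mean(x - mu) / s2
    g_ls = np.mean((x - mu) ** 2 / s2 - 1.0)
    return np.array([g_mu, g_ls])

def fisher(log_sigma):
    # Closed-form Fisher information of N(mu, sigma^2) in the
    # (mu, log sigma) parameterisation: diag(1/sigma^2, 2).
    return np.diag([np.exp(-2 * log_sigma), 2.0])

mu, log_sigma = 0.0, 0.0
for _ in range(100):
    g = grad_avg(mu, log_sigma)
    nat = np.linalg.solve(fisher(log_sigma), g)  # F^{-1} grad
    mu, log_sigma = np.array([mu, log_sigma]) + 0.5 * nat

# mu and exp(log_sigma) converge to the maximum-likelihood estimates.
```

Note that the natural-gradient step for mu reduces to mu += lr * mean(x - mu), independent of sigma: the Fisher preconditioning cancels the 1/sigma^2 scaling that would make a plain gradient step badly conditioned.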

This talk is part of the Machine Learning Reading Group @ CUED series.
