Bayesian Inference using Generative Models

If you have a question about this talk, please contact Qingyuan Zhao.

Room changed

Variational inference (e.g. variational Bayes) can use a variety of approximating densities. Some recent work has explored classes of generative neural networks whose Jacobians are either volume-preserving or fast to compute. In this work we explore two points: first, using more general neural networks while exploiting the conditional density structure that arises naturally in a hierarchical Bayesian model; and second, a general inference framework, in the spirit of David Spiegelhalter's WinBUGS software, in which a wide range of models can be specified and the software 'automatically' generates an approximation to the posterior density.
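The key idea the abstract alludes to can be illustrated in a few lines. Below is a minimal, hypothetical sketch (not the speaker's code): a diagonal affine map z = mu + exp(s) * eps plays the role of a generative network whose log-determinant Jacobian is trivially cheap, namely sum(s), so the variational density q(z) and a Monte-Carlo ELBO estimate are both tractable. All names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical affine "flow": z = mu + exp(s) * eps. Its Jacobian is
# diagonal, so log|det J| = sum(s) -- the cheap-Jacobian case the
# abstract mentions.
def flow(eps, mu, s):
    return mu + np.exp(s) * eps

def log_q(z, mu, s):
    # Change of variables: log q(z) = log N(eps; 0, I) - log|det J|
    eps = (z - mu) / np.exp(s)
    log_base = -0.5 * np.sum(eps**2 + np.log(2 * np.pi), axis=-1)
    return log_base - np.sum(s)

# Toy target posterior: a standard normal, so we can check the ELBO.
def log_p(z):
    return -0.5 * np.sum(z**2 + np.log(2 * np.pi), axis=-1)

mu, s = np.zeros(2), np.zeros(2)        # identity flow: q equals p
eps = rng.standard_normal((5000, 2))
z = flow(eps, mu, s)
elbo = np.mean(log_p(z) - log_q(z, mu, s))
print(round(elbo, 3))  # 0.0 -- q matches p exactly, so the gap vanishes
```

In practice mu and s would be outputs of a neural network (possibly conditioned on parent variables in the hierarchy, which is where the conditional density structure of a hierarchical Bayesian model enters) and the ELBO would be maximised by gradient ascent rather than evaluated once.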

This talk is part of the Statistics series.


© 2006-2024 Talks.cam, University of Cambridge.