Bayesian Inference using Generative Models
If you have a question about this talk, please contact Qingyuan Zhao.

Room changed.

Variational Inference (e.g. Variational Bayes) can use a variety of approximating densities. Some recent work has explored using classes of generative neural networks whose Jacobians are either volume preserving or fast to calculate. In this work we explore two points: using more general neural networks while taking advantage of the conditional density structure that arises naturally in a hierarchical Bayesian model, and a general inference framework, in the spirit of David Spiegelhalter's WinBUGS software, in which a wide range of models can be specified and the software 'automatically' generates an approximation to the posterior density.

This talk is part of the Statistics series.
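To make the idea in the abstract concrete, below is a minimal sketch (not the speaker's implementation) of variational inference with a generative approximating density: standard normal noise is pushed through an invertible affine map, and the change-of-variables Jacobian term enters the ELBO explicitly. The toy model and all names (y, log_joint, elbo, mu, log_s) are illustrative assumptions, not taken from the talk.

# Minimal sketch: variational inference with a generative approximating density.
# Assumed toy model: y_i ~ Normal(theta, 1), theta ~ Normal(0, 5^2).
import numpy as np

rng = np.random.default_rng(0)
y = np.array([1.2, 0.7, 1.9, 1.1])  # illustrative data

def log_joint(theta):
    # log p(y, theta) for the toy model, up to additive constants
    log_prior = -0.5 * theta**2 / 25.0
    log_lik = -0.5 * np.sum((y[None, :] - theta[:, None])**2, axis=1)
    return log_prior + log_lik

def elbo(mu, log_s, n_samples=1000):
    # q(theta) is defined generatively: theta = mu + exp(log_s) * eps, eps ~ N(0, 1)
    eps = rng.standard_normal(n_samples)
    theta = mu + np.exp(log_s) * eps
    # log q(theta) by change of variables: base log-density minus log|d theta / d eps|
    log_q = -0.5 * eps**2 - 0.5 * np.log(2 * np.pi) - log_s
    return np.mean(log_joint(theta) - log_q)

print(elbo(mu=1.0, log_s=-0.5))

In practice the parameters of the generative map (here just mu and log_s, in the talk's setting the weights of a richer invertible network) would be optimised by stochastic gradient ascent on this Monte Carlo ELBO estimate; the sketch only evaluates it for fixed values.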