Hyper and structural Markov laws for graphical models
If you have a question about this talk, please contact Konstantina Palla.
My talk will be based on the concept of hyper Markov laws, introduced
by Dawid and Lauritzen (1993):
http://projecteuclid.org/euclid.aos/1176349260
The general idea is to use distributions on the parameters (termed "laws" in the paper) whose conditional independence properties are analogous to those of the Markov distributions themselves. Such laws commonly arise in two circumstances: as the sampling distributions of estimators (e.g. maximum likelihood estimators), and as priors and posteriors for Bayesian inference. As with message passing algorithms for marginalisation, we can exploit these conditional independence properties to perform calculations locally, on the cliques and separators of the graph.
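To make the flavour of this concrete, here is a minimal sketch of the weak hyper Markov property from the Dawid and Lauritzen paper, in illustrative notation of my own choosing rather than necessarily theirs: for a Markov distribution theta over a decomposable graph G, a law for theta is (weak) hyper Markov if

\[
  \theta_A \;\perp\!\!\!\perp\; \theta_B \;\mid\; \theta_{A \cap B}
  \qquad \text{for every decomposition } (A, B) \text{ of } G,
\]

where theta_A denotes the induced marginal distribution over the variables in A. This mirrors the ordinary Markov property x_A independent of x_B given x_{A \cap B} at the level of the parameter, which is what allows the same local, clique-by-clique computations.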
In the second part of the talk, I’ll introduce my own work extending these ideas to the case where the structure of the graph itself is unknown (commonly called structural learning), by defining what I term structural Markov properties. These properties characterise an exponential family over the set of graphs, which forms a conjugate prior and so allows convenient prior-to-posterior updating.
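As a rough sketch of the kind of family involved (the exact statement is in the talk; the notation below is illustrative and the factorised form is my own summary, not a quotation of the result): restricting to decomposable graphs, such a graph law can be written as a ratio of clique and separator terms, and combining it with a marginal likelihood that factorises the same way keeps the posterior in the same family:

\[
  \pi(G) \;\propto\; \frac{\prod_{C \in \mathcal{C}(G)} t_C}{\prod_{S \in \mathcal{S}(G)} t_S},
  \qquad
  \pi(G \mid x) \;\propto\; \frac{\prod_{C \in \mathcal{C}(G)} t_C\, p(x_C)}{\prod_{S \in \mathcal{S}(G)} t_S\, p(x_S)},
\]

where \mathcal{C}(G) and \mathcal{S}(G) are the cliques and separators of G, the t's play the role of natural parameters, and p(x_C) is the marginal likelihood of the data on clique C under a compatible hyper Markov parameter law. The posterior has the same form with t_C replaced by t_C p(x_C), which is the convenient prior-to-posterior updating referred to above.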
This talk is part of the Machine Learning Reading Group @ CUED series.