Scalable Gaussian Processes
If you have a question about this talk, please contact Robert Pinsler.
Inference and learning in models with Gaussian process (GP) priors are well known to suffer from high computational costs, both in time and memory: exact inference scales as O(n^3) in time and O(n^2) in memory for n training points. Various methods have been proposed to allow GP models to scale to big data. In this reading group, we discuss two popular techniques for scaling up GP models: variational inference and conjugate gradient methods.
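As a minimal sketch of the conjugate-gradient idea (not taken from the talk itself): the GP posterior mean only requires the solve (K + sigma^2 I)^-1 y, and conjugate gradients performs that solve using matrix-vector products alone, so the n x n kernel matrix never has to be Cholesky-factorised or even stored in full. All data, kernel choices, and parameter values below are illustrative assumptions.

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, cg

    rng = np.random.default_rng(0)
    n = 2000
    X = rng.uniform(0.0, 10.0, size=(n, 1))            # hypothetical 1-D training inputs
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)  # noisy observations
    sigma2 = 0.01                                        # assumed observation-noise variance

    def matvec(v):
        # Compute (K + sigma2 * I) v for an RBF kernel in row blocks,
        # so the full n x n kernel matrix is never stored at once.
        out = sigma2 * v
        for start in range(0, n, 500):
            sqdist = (X[start:start + 500] - X.T) ** 2   # (block, n) squared distances
            out[start:start + 500] += np.exp(-0.5 * sqdist) @ v
        return out

    # Solve (K + sigma2 * I) alpha = y iteratively; each CG step costs one
    # matrix-vector product rather than an O(n^3) factorisation. In practice a
    # preconditioner speeds up convergence.
    A = LinearOperator((n, n), matvec=matvec)
    alpha, info = cg(A, y, maxiter=500)

    # Approximate posterior mean at a few test inputs: mu_* = K_* alpha.
    X_test = np.linspace(0.0, 10.0, 5)[:, None]
    K_star = np.exp(-0.5 * (X_test - X.T) ** 2)          # (5, n) cross-covariance
    print(K_star @ alpha)

Computing the kernel matrix-vector product block by block keeps memory at O(bn) for block size b instead of O(n^2), which is the main point of the iterative approach; the variational route discussed in the session instead approximates the posterior through a small set of inducing points.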
This talk is part of the Machine Learning Reading Group @ CUED series.