Scalable Gaussian Processes for Scientific Discovery

If you have a question about this talk, please contact dsu21.

Large datasets provide unprecedented opportunities to automatically discover rich statistical structure, from which we can derive new scientific insights. Gaussian processes are flexible distributions over functions, which can learn interpretable structure through covariance kernels. In this talk, I introduce an O(N) Gaussian process framework which is capable of learning expressive kernel functions on large datasets. This framework generalizes and provides alternative derivations for classical inducing point methods, and allows one to exploit kernel structure for significant further gains in scalability and accuracy, without requiring severe assumptions. I evaluate this approach on kernel matrix reconstruction, kernel learning, time series modelling, image inpainting, and long-range forecasting in spatiotemporal statistics.
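As a rough illustration of the inducing-point interpolation idea the abstract alludes to (a minimal sketch, not code from the talk; it assumes one-dimensional inputs, an evenly spaced grid of inducing points, and local linear interpolation weights, with all variable names illustrative):

import numpy as np

def rbf_kernel(a, b, lengthscale=0.2):
    # Squared-exponential covariance between 1-D input sets a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

# Training inputs x and a small regular grid u of m inducing points (m << n).
n, m = 2000, 50
x = np.sort(np.random.rand(n))
u = np.linspace(0.0, 1.0, m)

# Sparse interpolation matrix W (n x m): each row interpolates x_i linearly
# from its two neighbouring grid points, so W has only two nonzeros per row.
W = np.zeros((n, m))
idx = np.clip(np.searchsorted(u, x) - 1, 0, m - 2)
frac = (x - u[idx]) / (u[1] - u[0])
W[np.arange(n), idx] = 1.0 - frac
W[np.arange(n), idx + 1] = frac

# Structured kernel interpolation: approximate the full covariance K_XX by
# W K_UU W^T, so expensive kernel evaluations reduce to the small grid kernel.
K_uu = rbf_kernel(u, u)
K_approx = W @ K_uu @ W.T
K_exact = rbf_kernel(x, x)
print("max abs approximation error:", np.abs(K_approx - K_exact).max())

Because W is sparse and K_UU lives on a regular grid, where algebraic structure in the covariance can be exploited, matrix-vector products with the approximate covariance scale roughly linearly in the number of training points, which is the kind of gain in scalability the abstract describes.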

References:

http://www.cs.cmu.edu/~andrewgw/pattern

http://jmlr.org/proceedings/papers/v37/wilson15.pdf

http://jmlr.org/proceedings/papers/v37/wilson15-supp.pdf

This talk is part of the Machine Learning @ CUED series.
