Information-Greedy Global Optimisation
If you have a question about this talk, please contact Microsoft Research Cambridge Talks Admins.
This event may be recorded and made available internally or externally via http://research.microsoft.com. Microsoft will own the copyright of any recordings made. If you do not wish to have your image/voice recorded, please consider this before attending.
Optimisation is about inferring the location of the optimum of a function. An information-optimal optimiser should therefore aim to collapse its belief about the location of the optimum towards a point distribution as fast as possible. But the state of the art rarely addresses this inference problem directly. Instead, it usually relies on a heuristic that predicts function optima, then evaluates at the maximum of that heuristic. The reason there are no truly probabilistic optimisers yet is that they are intractable in several ways. In this talk, I will present tractable approximations for each of these obstacles, arriving at a flexible global optimiser for functions under Gaussian process priors which empirically outperforms the state of the art.
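To make the idea of "collapsing belief about the location of the optimum" concrete, the sketch below is an assumed illustration, not the speaker's actual method: a single information-greedy step on a discretised 1-D domain under a Gaussian-process prior. The belief p_min over the minimiser's location is estimated by Monte Carlo sampling of posterior functions, and the next evaluation is chosen to maximise the expected reduction in the entropy of p_min. The toy objective, kernel, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 60)[:, None]          # discretised search domain
f = lambda x: np.sin(6 * x) + 0.5 * x           # hidden toy objective (assumed)

def kernel(A, B, ls=0.15, var=1.0):
    """Squared-exponential covariance between two sets of inputs."""
    d = A - B.T
    return var * np.exp(-0.5 * (d / ls) ** 2)

def posterior(Xobs, yobs, noise=1e-4):
    """GP posterior mean and covariance on the grid X given observations."""
    K = kernel(Xobs, Xobs) + noise * np.eye(len(Xobs))
    Ks = kernel(X, Xobs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, yobs))
    v = np.linalg.solve(L, Ks.T)
    return Ks @ alpha, kernel(X, X) - v.T @ v

def pmin_entropy(mu, cov, n_samples=300):
    """Entropy of the Monte Carlo belief over the location of the minimum."""
    S = rng.multivariate_normal(mu, cov + 1e-6 * np.eye(len(mu)), n_samples)
    p = np.bincount(S.argmin(axis=1), minlength=len(mu)) / n_samples
    p = p[p > 0]
    return -(p * np.log(p)).sum()

# A few initial observations of the objective.
Xobs = np.array([[0.1], [0.9]])
yobs = f(Xobs).ravel()

mu, cov = posterior(Xobs, yobs)
H0 = pmin_entropy(mu, cov)

# Information-greedy step: pick the candidate whose (fantasised) evaluation
# is expected to shrink the entropy of p_min the most.
best_gain, best_x = -np.inf, None
for i in range(len(X)):
    gains = []
    for _ in range(5):                           # fantasise a few outcomes
        y_f = rng.normal(mu[i], np.sqrt(max(cov[i, i], 1e-12)))
        mu_f, cov_f = posterior(np.vstack([Xobs, X[i:i + 1]]),
                                np.append(yobs, y_f))
        gains.append(H0 - pmin_entropy(mu_f, cov_f))
    if np.mean(gains) > best_gain:
        best_gain, best_x = np.mean(gains), X[i, 0]

print(f"next evaluation at x = {best_x:.3f}, "
      f"expected entropy reduction {best_gain:.3f}")
```

This brute-force loop is exactly the kind of computation the abstract calls intractable in general; the talk presumably concerns approximations that avoid the exhaustive fantasising and re-fitting done here.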
This talk is part of the Microsoft Research Machine Learning and Perception Seminars series.