
GIBBON: General-purpose Information-Based Bayesian OptimisatioN


  • Speaker: Henry Moss, Senior ML Researcher at Secondmind
  • Time: Tuesday 08 February 2022, 15:00-16:00
  • Venue: SS03, Computer Lab.

If you have a question about this talk, please contact Andrei Paleyes.

Bayesian optimisation (BO) is a popular routine for optimising functions that are plagued by high evaluation costs. In BO, an acquisition function is used to sequentially focus evaluations on promising areas of the search space, allowing the identification of competitively good solutions even within heavily restricted evaluation budgets. Recently, a particularly intuitive and empirically effective class of acquisition functions based on information theory has arisen. During this talk, I will motivate these powerful entropy-based methods, before describing a general-purpose extension built around a simple approximation of information gain, an information-theoretic quantity central to solving a range of BO problems, including noisy, multi-fidelity and batch optimisation. Previously, these tasks have been tackled separately within information-theoretic BO, each requiring a different sophisticated approximation scheme, except for batch BO, for which no computationally lightweight information-theoretic approach had previously been proposed. GIBBON (General-purpose Information-Based Bayesian OptimisatioN) provides a single principled framework suitable for all of the above, outperforming existing approaches whilst incurring substantially lower computational overheads.
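To illustrate the sequential BO loop described in the abstract, here is a minimal sketch using a Gaussian-process surrogate and a simple lower-confidence-bound acquisition. This is not GIBBON's information-theoretic acquisition (which the talk covers); the objective function, kernel lengthscale and acquisition trade-off parameter are all hypothetical choices for demonstration.

```python
import numpy as np

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Exact GP posterior mean and standard deviation at candidates Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v ** 2, axis=0)   # rbf(x, x) == 1 on the diagonal
    return mu, np.sqrt(np.maximum(var, 1e-12))

def objective(x):
    # Hypothetical "expensive" black-box to minimise.
    return np.sin(3 * x) + x ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, 3)            # small initial design
y = objective(X)
grid = np.linspace(-1, 1, 200)       # candidate evaluation points

for _ in range(10):
    mu, sd = gp_posterior(X, y, grid)
    acq = mu - 2.0 * sd              # lower confidence bound (minimisation)
    x_next = grid[np.argmin(acq)]    # focus evaluation on the most promising point
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

best = X[np.argmin(y)]               # incumbent after 13 total evaluations
```

The acquisition function is the only part that changes between BO variants: GIBBON replaces the confidence-bound heuristic above with a lightweight approximation of information gain that also extends to noisy, multi-fidelity and batch settings.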

This talk is part of the ML@CL Seminar Series.


