Isaac Newton Institute Seminar Series

TBA


  • Speaker: Mengdi Wang (Princeton University)
  • Time: Thursday 18 July 2024, 16:00-17:00
  • Venue: External


DMLW01 - International workshop on diffusions in machine learning: foundations, generative models, and optimisation

Mirror descent is a primal-dual convex optimization method that can be tailored to the geometry of the optimization problem at hand through the choice of a strongly convex potential function. It arises as a basic primitive in a variety of applications, including large-scale optimization, machine learning, and control. We propose a variational formulation of mirror descent and of its most straightforward stochastic analogue, mirror Langevin dynamics. The main idea leverages variational principles for gradient flows to show that (1) mirror descent emerges as a closed-loop solution for a certain optimal control problem; and (2) the Bellman value function is given by the Bregman divergence between the initial condition and the global minimizer of the objective function.
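To make the abstract's opening concrete, here is a minimal sketch of mirror descent on the probability simplex, assuming the standard negative-entropy potential (an illustrative choice, not taken from the talk): with this potential the update reduces to the exponentiated-gradient rule, and the Bregman divergence of the potential is the KL divergence.

```python
import numpy as np

def mirror_descent(grad, x0, step=0.2, iters=2000):
    """Entropic mirror descent on the simplex.

    With potential phi(x) = sum_i x_i log x_i, the mirror step
    x_{t+1} ∝ x_t * exp(-step * grad(x_t)) followed by renormalization
    is the Bregman projection back onto the simplex.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x * np.exp(-step * grad(x))
        x /= x.sum()
    return x

# Toy objective f(x) = 0.5 * ||x - c||^2 with c inside the simplex,
# so the constrained minimizer is c itself (hypothetical example data).
c = np.array([0.5, 0.3, 0.2])
grad = lambda x: x - c

x0 = np.ones(3) / 3
x_star = mirror_descent(grad, x0)

# Under the entropy potential, the Bregman divergence between the
# minimizer and the initial condition is the KL divergence -- the
# quantity the abstract identifies with the Bellman value function.
kl = np.sum(c * np.log(c / x0))
```

Running this drives `x_star` to `c`; swapping in a different strongly convex potential changes the geometry of the update without changing the template.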

This talk is part of the Isaac Newton Institute Seminar Series.

