TBA
- Speaker: Mengdi Wang (Princeton University)
- Date & Time: Thursday 18 July 2024, 16:00 - 17:00
- Venue: External
Abstract
Mirror descent is a primal-dual convex optimization method that can be tailored to the geometry of the optimization problem at hand through the choice of a strongly convex potential function. It arises as a basic primitive in a variety of applications, including large-scale optimization, machine learning, and control. We propose a variational formulation of mirror descent and of its most straightforward stochastic analogue, mirror Langevin dynamics. The main idea leverages variational principles for gradient flows to show that (1) mirror descent emerges as a closed-loop solution for a certain optimal control problem; and (2) the Bellman value function is given by the Bregman divergence between the initial condition and the global minimizer of the objective function.
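As a concrete illustration of how the choice of potential tailors the method to the problem geometry, here is a minimal sketch of mirror descent with the negative-entropy potential on the probability simplex (the classic exponentiated-gradient update). The function names and the linear test objective are illustrative, not from the talk:

```python
import numpy as np

def mirror_descent_simplex(grad, x0, steps=500, eta=0.1):
    """Mirror descent on the probability simplex.

    Uses the negative-entropy potential, whose mirror map is x -> log x,
    so each iteration is a multiplicative (exponentiated-gradient) step
    followed by renormalisation back onto the simplex.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))  # dual-space gradient step through the mirror map
        x /= x.sum()                    # Bregman projection onto the simplex
    return x

# Minimise the linear objective f(x) = <c, x> over the simplex;
# the minimiser concentrates on the cheapest coordinate.
c = np.array([3.0, 1.0, 2.0])
x_star = mirror_descent_simplex(lambda x: c, np.ones(3) / 3)
# x_star concentrates near [0, 1, 0]
```

Adding Gaussian noise to the dual-space step would give the simplest discretisation of the mirror Langevin dynamics mentioned above.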
Series
This talk is part of the Isaac Newton Institute Seminar Series.
Included in Lists
- All CMS events
- bld31
- dh539
- External
- Featured lists
- INI info aggregator
- Isaac Newton Institute Seminar Series
- School of Physical Sciences