BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Stochastic Gradient Piecewise Deterministic Monte Carlo Samplers -
  Paul Fearnhead (Lancaster University)
DTSTART:20241128T100500Z
DTEND:20241128T105500Z
UID:TALK221545@talks.cam.ac.uk
DESCRIPTION:Recently it has been shown that piecewise deterministic Markov
  processes (PDMPs) can be used as an alternative to MCMC. The idea is to s
 imulate a PDMP that has been designed to have the posterior distribution a
 s its stationary distribution\, with there being simple rules for specifyi
 ng the dynamics of the PDMP to enable this. Furthermore\, the PDMP sampler
 s are non-reversible\, and thus can mix better than reversible MCMC in high
  dimensions\, and they can be implemented with subsampling ideas to reduce
  the per-iteration cost. Unfortunately\, whilst the dynamics of these PD
 MPs are easy to define\, simulating a continuous-time realisation is chall
 enging in general. To overcome this\, we will show we can approximately si
 mulate the dynamics of a PDMP with subsampling. The resulting algorithm is
  easy to implement\, and is computationally efficient as it involves just
  accessing one or two data points per iteration. The resulting algori
 thm can be viewed as an alternative to the popular stochastic-gradient Lan
 gevin dynamics (SGLD) algorithm\, but with a PDMP replacing the Langevin d
 ynamics. The resulting stochastic gradient PDMP algorithm has a number of 
 advantages over SGLD: the underlying dynamics are non-reversible\; the alg
 orithm is more stable and can have a higher order of accuracy\; and we can
  leverage its continuous trajectories to more naturally incorporate model 
 selection. \nThis is joint work with Sebastiano Grazzi\, Chris Nemeth\, Es
 tevao Prado and Gareth Roberts\n
LOCATION:Seminar Room 1\, Newton Institute
END:VEVENT
END:VCALENDAR
