BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Optimal score estimation via empirical Bayes smoothing - Andre Wib
 isono (Yale University)
DTSTART:20240715T150000Z
DTEND:20240715T160000Z
UID:TALK219025@talks.cam.ac.uk
DESCRIPTION:We study the problem of estimating the score function of an un
 known probability distribution $\\rho^*$ from $n$ independent and identica
 lly distributed observations in $d$ dimensions. Assuming that $\\rho^*$ is
  subgaussian and has a Lipschitz-continuous score function $s^*$\, we esta
 blish the optimal rate of $\\tilde \\Theta(n^{-\\frac{2}{d+4}})$ for this 
 estimation problem under the loss function $\\|\\hat s - s^*\\|^2_{L^2(\\r
 ho^*)}$ that is commonly used in the score matching literature\, highlight
 ing the curse of dimensionality where sample complexity for accurate score
  estimation grows exponentially with the dimension $d$. Leveraging key ins
 ights in empirical Bayes theory as well as a new convergence rate of smoot
 hed empirical distribution in Hellinger distance\, we show that a regulari
 zed score estimator based on a Gaussian kernel attains this rate\, shown o
 ptimal by a matching minimax lower bound. We also discuss the implication 
 of our theory on the sample complexity of score-based generative models. J
 oint work with Yihong Wu and Kaylee Yang.
LOCATION:External
END:VEVENT
END:VCALENDAR
