Particle filters and curse of dimensionality
If you have a question about this talk, please contact Zoubin Ghahramani.
Note: this talk will be given via Skype.
A problem that arises in many applications is to compute the conditional distributions of stochastic models given observed data. While exact computations are rarely possible, particle filtering algorithms have proved very useful for approximating such conditional distributions. Unfortunately, the approximation error of particle filters grows exponentially with dimension, a phenomenon known as the curse of dimensionality. This fact has rendered particle filters of limited use in complex data assimilation problems that arise, for example, in weather forecasting or multi-target tracking. In this talk I will show that it is possible to develop "local" particle filtering algorithms whose approximation error is dimension-free. By exploiting conditional decay of correlations properties of high-dimensional models, we prove, for the simplest possible algorithm of this type, an error bound that is uniform both in time and in the model dimension. (Joint work with R. van Handel)
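To make the setting concrete, the following is a minimal sketch of a standard bootstrap particle filter on a toy one-dimensional linear-Gaussian state-space model. The model, its parameters, and the particle count are illustrative assumptions, not details from the talk; the "local" algorithms discussed in the talk modify the resampling step, which is not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D linear-Gaussian state-space model (assumed, not from the talk):
#   x_t = a * x_{t-1} + process noise,   y_t = x_t + observation noise
a, sigma_x, sigma_y = 0.9, 1.0, 0.5
T, N = 50, 1000  # number of time steps, number of particles

# Simulate a latent trajectory and noisy observations of it
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + sigma_x * rng.standard_normal()
y = x + sigma_y * rng.standard_normal(T)

# Bootstrap particle filter: propagate, weight, resample
particles = rng.standard_normal(N)
means = np.zeros(T)  # filtered estimates E[x_t | y_1..y_t]
for t in range(T):
    # Propagate each particle through the dynamics
    particles = a * particles + sigma_x * rng.standard_normal(N)
    # Weight particles by the observation likelihood p(y_t | x_t)
    w = np.exp(-0.5 * ((y[t] - particles) / sigma_y) ** 2)
    w /= w.sum()
    means[t] = np.dot(w, particles)
    # Multinomial resampling to combat weight degeneracy
    particles = particles[rng.choice(N, size=N, p=w)]
```

In higher dimensions the weights `w` concentrate on a few particles, which is one way the curse of dimensionality manifests; the talk's local filters address this by exploiting spatial decay of correlations.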
This talk is part of the Machine Learning @ CUED series.
