Partial least squares for dependent data
If you have a question about this talk, please contact INI IT.

STS - Statistical scalability

We consider the linear and kernel partial least squares algorithms for dependent data and study the consequences of ignoring the dependence, both theoretically and numerically. For the linear partial least squares estimator we derive convergence rates and show that ignoring non-stationary dependence structures can lead to inconsistent estimation. For the kernel partial least squares estimator we establish convergence rates under a source condition and an effective dimensionality condition. We show, both theoretically and in simulations, that long-range dependence results in slower convergence rates. A protein dynamics example illustrates our results and demonstrates the high predictive power of partial least squares.

This talk is part of the Isaac Newton Institute Seminar Series.
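The talk studies convergence of these estimators under dependence; as background, the sketch below shows a standard linear PLS1 (NIPALS-style) regression for a single response, which is the kind of estimator the linear results concern. The function name pls1 and its interface are illustrative assumptions, not material from the talk, and the sketch assumes i.i.d.-style centring without any correction for dependence.

```python
import numpy as np

def pls1(X, y, n_components):
    """Minimal NIPALS-style PLS1 sketch: extract latent components that
    maximise covariance between X scores and y, then regress y on them."""
    X = X - X.mean(axis=0)            # centre predictors
    y = y - y.mean()                  # centre response
    n, p = X.shape
    W = np.zeros((p, n_components))   # weight vectors
    T = np.zeros((n, n_components))   # score vectors
    P = np.zeros((p, n_components))   # X loadings
    q = np.zeros(n_components)        # y loadings
    Xk, yk = X.copy(), y.copy()
    for k in range(n_components):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)        # weight = normalised covariance direction
        t = Xk @ w                    # component scores
        tt = t @ t
        p_k = Xk.T @ t / tt           # X loading
        q_k = yk @ t / tt             # y loading
        Xk = Xk - np.outer(t, p_k)    # deflate X
        yk = yk - q_k * t             # deflate y
        W[:, k], T[:, k], P[:, k], q[k] = w, t, p_k, q_k
    # coefficients in the original (centred) predictor space
    beta = W @ np.linalg.solve(P.T @ W, q)
    return beta

# usage (hypothetical data):
# beta = pls1(X, y, n_components=3)
# y_hat = (X - X.mean(axis=0)) @ beta + y.mean()
```

The talk's point is that when the rows of X and y are dependent (e.g. non-stationary or long-range dependent time series), plugging such data into this unmodified procedure can slow convergence or, in the non-stationary case, make the estimator inconsistent.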