

## Assessing the finite-dimensionality of functional data

- Celine Vial (Rennes)
- Friday 30 November 2007, 14:00-15:00
- MR12, CMS, Wilberforce Road, Cambridge, CB3 0WB.
If you have a question about this talk, please contact S.M.Pitts.

If a problem in functional data analysis is low-dimensional, then the methodology for its solution can often be reduced to relatively conventional techniques in multivariate analysis. Hence there is intrinsic interest in assessing the finite-dimensionality of functional data. We show that this problem has several unique features. From some viewpoints the problem is trivial, in the sense that continuously distributed functional data which are exactly finite-dimensional are immediately recognisable as such, provided the sample size is sufficiently large. In practice, however, functional data are almost always observed with noise, for example resulting from rounding or experimental error, and then the problem is almost insolubly difficult. In such cases, part of the average noise variance is confounded with the true signal and is not identifiable. It is possible, however, to define the unconfounded part of the noise variance. This quantity is the best possible lower bound for all potential values of the average noise variance, and it is estimable in low-noise settings. Moreover, bootstrap methods can be used to describe the reliability of estimates of the unconfounded noise variance, under the assumption that the signal is finite-dimensional.

Motivated by these ideas, we suggest techniques for assessing the finiteness of dimensionality. In particular, we show how to construct a critical point $\hat{v}_q$ such that, if the distribution of our functional data has fewer than q – 1 degrees of freedom, then we should be prepared to assume that the average variance of the added noise is at least $\hat{v}_q$. If this level seems too high, then we must conclude that the dimension is at least q – 1. We show that simpler, more conventional techniques based on hypothesis testing are generally not effective.

This talk is part of the Statistics series.
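A minimal numerical sketch of the setting the abstract describes (this is an illustration, not the speaker's method; the basis, score variances, and noise level below are all invented for the example): functional data that are exactly q-dimensional, observed with additive noise, yield an empirical covariance whose first q eigenvalues carry the signal while the remainder sit near the noise variance.

```python
import numpy as np

# Hypothetical illustration (not the talk's method): simulate functional
# data that are exactly q-dimensional, observed on a grid with additive
# noise, and inspect the eigenvalues of the empirical covariance.
rng = np.random.default_rng(0)
n, p, q = 200, 50, 3            # sample size, grid points, true dimension
sigma2 = 0.01                   # variance of the added noise at each grid point

t = np.linspace(0.0, 1.0, p)
# First q functions of a Fourier-type basis, approximately orthonormal on [0, 1]
basis = np.stack([np.sqrt(2.0) * np.sin((j + 1) * np.pi * t) for j in range(q)])
scores = rng.normal(scale=[3.0, 2.0, 1.0], size=(n, q))   # random coefficients
X = scores @ basis + rng.normal(scale=np.sqrt(sigma2), size=(n, p))

# Eigenvalues of the empirical covariance operator (Riemann-sum approximation
# of the L2 inner product, hence the division by the number of grid points p)
eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1] / p

print(np.round(eigvals[:6], 4))   # sharp drop after the q-th eigenvalue
```

With noise present, the eigenvalues beyond the q-th do not vanish but level off near the noise variance, which is why, as the abstract notes, part of the noise variance is confounded with the signal and exact finite-dimensionality cannot be read off directly.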