The predictive power of contractive neural-network spaces
If you have a question about this talk, please contact Phil Cowans.

Predictive power comes from being able to identify similar situations through time and to associate with those points generalised notions of what happens next, according to observation. Positive state separability, with respect to the transient dynamics of a time series, is what amplifies the differences between the temporal artifacts that are precursors to, and contributors to, the subsequently important events one would like to predict.

The idea of inducing linear separability of a classification space through forced dimensionality explosion is not new. In the field of recurrent neural networks, realisations of a temporal variant of this paradigm have been appearing with increasing frequency, with proponents citing their superior classification power and engineering simplicity. Examples include the echo state network (ESN) of Herbert Jaeger, the liquid state machine (LSM) of Wolfgang Maass, and the backpropagation-decorrelation (BPDC) algorithm and associated dynamic network of Jochen Steil. While these models differ in their details, they are abstractly very similar. The idea is simple: project a temporal stream into a high-dimensional space to increase its state separability, then exploit the enhanced separability with a linear or nearly linear readout mechanism to perform prediction and/or classification. Although the idea is simple, the models tend toward an undesirable degree of "kernel magic" to achieve their satisfying performance; there has yet to be a precise and sufficient characterisation of the mechanisms involved.

This talk will expand on the above notions and discuss the one-semester mini-project experiments that were carried out on the hunch that the power of these networks derives principally from the fractal contractivity of the recurrent network weight spaces.

This talk is part of the Inference Group series.
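The project-and-read-out paradigm described above can be sketched in a few lines. The following is a minimal, illustrative echo state network in NumPy, not code from the talk: a fixed random reservoir is rescaled so its spectral radius is below one (making the state update contractive, the "echo state" property), a temporal stream is projected into the high-dimensional reservoir state, and a linear ridge-regression readout is trained for one-step-ahead prediction. The toy sine-wave task, reservoir size, and regularisation constant are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy task: one-step-ahead prediction of a sine wave.
T = 500
u = np.sin(np.linspace(0, 20 * np.pi, T + 1))

N = 100  # reservoir size: the "dimensionality explosion"

# Fixed random reservoir weights, rescaled so the spectral radius is 0.9 < 1,
# which makes the recurrent map a contraction (the echo state property).
W = rng.standard_normal((N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, N)  # random input projection

# Drive the reservoir and collect states: x_{t+1} = tanh(W x_t + W_in u_t).
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Linear readout via ridge regression, after discarding an initial washout
# period so the state has "forgotten" the arbitrary initial condition.
washout = 100
X, y = states[washout:], u[washout + 1 : T + 1]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)

pred = X @ W_out
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

Only the readout `W_out` is trained; the recurrent weights stay fixed at their random, contractively rescaled values, which is exactly the engineering simplicity the talk attributes to these models.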