
The predictive power of contractive neural-network spaces


If you have a question about this talk, please contact Phil Cowans.

Predictive power comes from being able to identify similar situations through time and to associate with those points generalised notions, drawn from observation, of what happens next. Positive state separability, with respect to the transient dynamics of a time series, is what amplifies the differences between the temporal features that are precursors to, and contributors to, the significant later events one would like to predict.

The idea of inducing linear separability of a classification space through a forced dimensionality explosion is not new. In the field of recurrent neural networks, realisations of a temporal variant of this paradigm have been cropping up with increasing frequency, with proponents touting their superior classification power and engineering simplicity.

Some examples are the echo state network (ESN) from Herbert Jaeger, the liquid state machine (LSM) from Wolfgang Maass, and the backpropagation-decorrelation (BPDC) algorithm and its associated dynamic network from Jochen Steil.

Whilst these models vary in their details, abstractly they are very similar. The idea is simple: project a temporal stream into a high-dimensional space to increase its state separability, then exploit that enhanced separability through a linear or nearly linear readout mechanism to perform prediction and/or classification.
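The abstract recipe can be sketched concretely in the echo-state-network style. This is a minimal illustration, not the speaker's experimental setup: the reservoir size, input signal, spectral-radius scaling of 0.9, and ridge-regression readout are all assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, T = 1, 200, 1000  # 1-D input stream, 200-unit reservoir

# Random input and recurrent weights; rescaling the recurrent matrix
# to spectral radius < 1 keeps the reservoir dynamics contractive.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Drive the reservoir with a toy signal; the task is one-step prediction.
u = np.sin(np.arange(T + 1) * 0.1)[:, None]
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])  # high-dimensional projection
    states[t] = x

# Linear readout fitted by ridge regression on the expanded states,
# after discarding an initial washout transient.
washout = 100
X, y = states[washout:], u[washout + 1:T + 1, 0]
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)

pred = X @ w_out
mse = float(np.mean((pred - y) ** 2))
print(mse)  # one-step prediction error on the driving signal
```

Only the readout weights `w_out` are trained; the recurrent weights stay fixed at their random values, which is precisely the engineering simplicity these models advertise.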

Although the idea is simple, the models tend to rely on an undesirable degree of “kernel magic” to achieve their satisfying performance; there has yet to be a precise and sufficient characterisation of the mechanisms involved.

This talk will expand on the above notions and discuss the one-semester mini-project experiments carried out on the hunch that the power of these networks derives principally from the fractal contractivity of the recurrent network weight spaces.
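One observable consequence of contractive recurrent weights is that the reservoir forgets its initial condition: two copies of the network started from different states and driven by the same input converge to the same trajectory. A small sketch, assuming (as above) a tanh reservoir with recurrent weights rescaled to spectral radius 0.9:

```python
import numpy as np

rng = np.random.default_rng(1)
n, steps = 100, 500

# Random recurrent weights rescaled to spectral radius 0.9 (< 1).
W = rng.normal(0.0, 1.0, (n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (n, 1))

# Drive two *different* initial states with the *same* input stream.
u = rng.normal(0.0, 1.0, (steps, 1))
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.normal(0.0, 1.0, n)
for t in range(steps):
    drive = W_in @ u[t]
    x1 = np.tanh(W @ x1 + drive)
    x2 = np.tanh(W @ x2 + drive)

# The contraction washes out the initial-condition difference,
# so the final state depends only on the (recent) input history.
gap = float(np.linalg.norm(x1 - x2))
print(gap)  # essentially zero after 500 driven steps
```

This washout is the echo state property in Jaeger's terminology: the state becomes a function of the input history alone, which is what lets a fixed linear readout generalise across runs.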

This talk is part of the Inference Group series.


© 2006-2020 Talks.cam, University of Cambridge.