A Maximum Entropy Perspective on Spectral Dimensionality Reduction
If you have a question about this talk, please contact Zoubin Ghahramani.
Spectral approaches to dimensionality reduction typically reduce the dimensionality of a data set by taking the eigenvectors of a Laplacian or a similarity matrix. Classical multidimensional scaling also makes use of the eigenvectors of a similarity matrix. In this talk we introduce a maximum entropy approach to designing this similarity matrix. The approach is closely related to maximum variance unfolding, and other spectral approaches, such as locally linear embedding, also turn out to be closely related. These methods can be seen as a sparse Gaussian graphical model in which correlations between data points (rather than across data features) are specified in the graph. The hope is that this unifying perspective will allow the relationships between these methods to be better understood and will also provide the groundwork for further research.
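As background for the eigendecomposition step the abstract refers to, the following is a minimal NumPy sketch of two standard constructions: classical multidimensional scaling on a similarity (Gram) matrix and a Laplacian-based spectral embedding built from a k-nearest-neighbour graph. This is an illustrative sketch under those assumptions, not the maximum entropy method described in the talk.

```python
import numpy as np

def classical_mds(X, n_components=2):
    """Embed by eigendecomposing the double-centred squared-distance matrix."""
    n = X.shape[0]
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    J = np.eye(n) - np.ones((n, n)) / n            # centring matrix
    B = -0.5 * J @ sq_dists @ J                    # similarity (inner-product) matrix
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:n_components] # keep the largest eigenvalues
    return eigvecs[:, idx] * np.sqrt(np.maximum(eigvals[idx], 0.0))

def laplacian_embedding(X, n_components=2, k=10):
    """Embed with the smallest non-trivial eigenvectors of a graph Laplacian."""
    n = X.shape[0]
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.zeros((n, n))
    for i in range(n):
        neighbours = np.argsort(sq_dists[i])[1:k + 1]      # skip the point itself
        W[i, neighbours] = np.exp(-sq_dists[i, neighbours])
    W = np.maximum(W, W.T)                                 # symmetrise the graph
    L = np.diag(W.sum(axis=1)) - W                         # unnormalised Laplacian
    eigvals, eigvecs = np.linalg.eigh(L)
    return eigvecs[:, 1:n_components + 1]                  # drop the constant eigenvector

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))                         # synthetic data for illustration
    print(classical_mds(X).shape, laplacian_embedding(X).shape)
```

In both cases the embedding comes from the spectrum of a matrix defined over data points rather than data features, which is the object the maximum entropy formulation in the talk sets out to design.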
This talk is part of the Machine Learning @ CUED series.