Matrix Means, Distances, Kernels, and Geometric Optimization

If you have a question about this talk, please contact Zoubin Ghahramani.

I will talk about a new distance function on Hermitian positive definite (HPD) matrices that draws inspiration from both differential geometry and convex optimization. This distance function originally arose in a computer-vision application, but it has since taken on a life of its own that is interesting enough to talk about.
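The abstract does not specify the distance itself, so as a hedged illustration here is the classical affine-invariant Riemannian distance on HPD matrices, a standard baseline in this area: d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F, computable via the generalized eigenvalues of (B, A). The function name and test matrices below are my own choices for the sketch.

```python
import numpy as np
from scipy.linalg import eigh

def riemannian_distance(A, B):
    """Affine-invariant Riemannian distance on HPD matrices.

    d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F; the eigenvalues of
    A^{-1/2} B A^{-1/2} equal the generalized eigenvalues of B v = lam A v,
    so we avoid explicit matrix square roots and logarithms.
    """
    lam = eigh(B, A, eigvals_only=True)  # positive since A, B are HPD
    return float(np.sqrt(np.sum(np.log(lam) ** 2)))

# Two random HPD matrices, built as X X^T + shift * I so they are
# guaranteed positive definite.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3))
A = X @ X.T + 3 * np.eye(3)
Y = rng.standard_normal((3, 3))
B = Y @ Y.T + 3 * np.eye(3)

d = riemannian_distance(A, B)
```

The distance is symmetric, vanishes iff A = B, and is invariant under congruence A → M A M*, B → M B M* for invertible M, which is what ties it to the geometry of the symmetric space of HPD matrices.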

In my talk, I will briefly describe the original application, followed by key results and wider mathematical connections of this distance function, e.g., to the theory of kernel functions on symmetric spaces, matrix theory, nonlinear matrix equations, symmetric polynomials in noncommutative variables, quantum information theory, and the expanding area of geometric optimization with HPD matrices. I will also mention a few challenging open problems.
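On the kernel-function connection: a known subtlety is that a Gaussian kernel built on a matrix distance is not automatically positive definite. As a hedged sketch (not the talk's construction), the log-Euclidean Gaussian kernel k(A, B) = exp(−γ ||log A − log B||_F²) is one choice that is positive definite on HPD matrices; the helper names below are my own.

```python
import numpy as np

def matrix_log(A):
    """Principal matrix logarithm of an HPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)          # real, positive eigenvalues for HPD A
    return (V * np.log(w)) @ V.T

def log_euclidean_kernel(A, B, gamma=0.5):
    """Gaussian kernel on the log-Euclidean distance between HPD matrices."""
    d = np.linalg.norm(matrix_log(A) - matrix_log(B), 'fro')
    return float(np.exp(-gamma * d ** 2))

rng = np.random.default_rng(1)
X = rng.standard_normal((4, 4))
A = X @ X.T + 4 * np.eye(4)
Y = rng.standard_normal((4, 4))
B = Y @ Y.T + 4 * np.eye(4)

k = log_euclidean_kernel(A, B)
```

Since the kernel is a Gaussian of a Euclidean distance in "log coordinates", k(A, A) = 1, 0 < k ≤ 1, and the Gram matrix over any finite set of HPD matrices is positive semidefinite.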

This talk is part of the Machine Learning @ CUED series.

