Information Geometry: From Divergence Functions to Geometric Structures

Information Geometry is the differential-geometric study of the manifold of probability density functions. Divergence functions (such as the KL divergence), as measures of proximity on this manifold, play an important role in machine learning, statistical inference, optimization, etc. This talk will review the various geometric structures induced by a divergence function. Most importantly, any divergence function induces a Riemannian metric (the Fisher information metric) together with a family of torsion-free affine connections (the alpha-connections); this is the so-called “statistical structure” of Information Geometry. Divergence functions also induce other important structures and quantities, such as bi-orthogonal coordinates (namely the expectation and natural parameters), a parallel volume form (used in modeling Bayesian priors), and a symplectic structure (for Hamiltonian systems).
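A minimal numerical sketch of the central fact above: the Hessian of a divergence function at coincident arguments recovers the Riemannian (Fisher information) metric. The example assumes a one-parameter Gaussian family N(mu, sigma^2) with sigma fixed, where the closed-form KL divergence and the textbook Fisher information I(mu) = 1/sigma^2 are both known; the function names are illustrative, not from the talk.

```python
# Sketch: Fisher information as the second derivative of the KL divergence
# D(p_mu || p_{mu+t}) at t = 0, for the Gaussian family N(mu, sigma^2)
# with sigma held fixed.

def kl_gauss_mean(mu0, mu1, sigma):
    """Closed-form KL divergence between N(mu0, sigma^2) and N(mu1, sigma^2)."""
    return (mu1 - mu0) ** 2 / (2.0 * sigma ** 2)

def fisher_from_kl(mu, sigma, h=1e-4):
    """Central second difference of t |-> D(mu || mu + t) at t = 0."""
    d = kl_gauss_mean
    return (d(mu, mu + h, sigma) - 2.0 * d(mu, mu, sigma)
            + d(mu, mu - h, sigma)) / h ** 2

sigma = 2.0
approx = fisher_from_kl(0.5, sigma)  # metric induced by the divergence
exact = 1.0 / sigma ** 2             # textbook Fisher information for the mean
print(approx, exact)                 # the two values agree closely
```

Because this KL divergence is quadratic in the displacement, the finite-difference Hessian matches 1/sigma^2 essentially exactly; for a general divergence the agreement holds to the order of the discretization error.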

This talk is part of the Statistics series.


© 2006-2024, University of Cambridge.