Information Geometry: From Divergence Functions to Geometric Structures
- 👤 Speaker: Jun Zhang, University of Michigan-Ann Arbor
- 📅 Date & Time: Friday 09 May 2014, 16:00 - 17:00
- 📍 Venue: MR12, Centre for Mathematical Sciences, Wilberforce Road, Cambridge
Abstract
Information Geometry is the differential-geometric study of the manifold of probability density functions. Divergence functions (such as the KL divergence), as measures of proximity on this manifold, play an important role in machine learning, statistical inference, optimization, etc. This talk will review the various geometric structures induced by any divergence function. Most importantly, a Riemannian metric (Fisher information) together with a family of torsion-free affine connections (alpha-connections) can be induced on the manifold; this is the so-called "statistical structure" of Information Geometry. Divergence functions also induce other important structures and quantities, such as bi-orthogonal coordinates (namely, the expectation and natural parameters), a parallel volume form (for modeling Bayesian priors), and a symplectic structure (for Hamiltonian systems).
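As a minimal sketch of the construction the abstract alludes to (the standard Eguchi recipe, not material taken from the talk itself), a divergence function D induces the metric and a pair of dual connections from its derivatives on the diagonal; the notation below is illustrative:

```latex
% Sketch: a divergence D(\theta, \theta') on a parametric family
% induces a Riemannian metric from its second mixed derivative
% and a pair of dual affine connections from its third derivatives,
% all evaluated on the diagonal \theta' = \theta.
\[
  g_{ij}(\theta)
    = -\left.\frac{\partial^{2}}{\partial\theta^{i}\,\partial\theta'^{j}}
      D(\theta,\theta')\right|_{\theta'=\theta},
\]
\[
  \Gamma_{ij,k}(\theta)
    = -\left.\frac{\partial^{3}}{\partial\theta^{i}\,\partial\theta^{j}\,\partial\theta'^{k}}
      D(\theta,\theta')\right|_{\theta'=\theta},
  \qquad
  \Gamma^{*}_{ij,k}(\theta)
    = -\left.\frac{\partial^{3}}{\partial\theta'^{i}\,\partial\theta'^{j}\,\partial\theta^{k}}
      D(\theta,\theta')\right|_{\theta'=\theta}.
\]
```

Taking D to be the KL divergence recovers the Fisher information metric, with the two connections being the exponential and mixture connections; the alpha-connections interpolate between this dual pair.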
Series
This talk is part of the Statistics series.
Included in Lists
- All CMS events
- All Talks (aka the CURE list)
- bld31
- Cambridge Forum of Science and Humanities
- Cambridge Language Sciences
- Cambridge talks
- Chris Davis' list
- CMS Events
- custom
- DPMMS info aggregator
- DPMMS lists
- DPMMS Lists
- Guy Emerson's list
- Hanchen DaDaDash
- Interested Talks
- Machine Learning
- MR12, Centre for Mathematical Sciences, Wilberforce Road, Cambridge
- rp587
- School of Physical Sciences
- Statistical Laboratory info aggregator
- Statistics
- Statistics Group