
Maximum entropy, uniform measure


If you have a question about this talk, please contact Francisco Vargas.


The talk is based on this paper.


We define a one-parameter family of entropies, each assigning a real number to any probability measure on a compact metric space (or, more generally, a compact Hausdorff space with a notion of similarity between points). These entropies generalise the Shannon and Rényi entropies of information theory. We prove that on any space X, there is a single probability measure maximising all these entropies simultaneously. Moreover, all the entropies have the same maximum value: the maximum entropy of X. As X is scaled up, the maximum entropy grows; its asymptotics determine geometric information about X, including the volume and dimension. We also study the large-scale limit of the maximising measure itself, arguing that it should be regarded as the canonical or uniform measure on X.
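The abstract's claims can be illustrated numerically on a finite metric space. Below is a minimal sketch (not from the talk itself) that assumes the similarity kernel e^{-d(x_i, x_j)} and the similarity-sensitive diversities D_q, the exponentials of the entropies described above; the function names and the order-q formula used here are illustrative assumptions. For three points that are pairwise very far apart, the points are almost totally dissimilar, so the uniform measure should achieve diversity close to 3 at every order q, matching the claim that one measure maximises all the entropies simultaneously with a common maximum value.

```python
import numpy as np

def similarity_matrix(dist):
    # Similarity kernel Z_ij = exp(-d(x_i, x_j)) on a finite metric space.
    return np.exp(-np.asarray(dist, dtype=float))

def diversity(p, Z, q):
    # Similarity-sensitive diversity of order q (the exponential of the
    # Renyi-type entropy): D_q(p) = ( sum_i p_i * (Zp)_i^(q-1) )^(1/(1-q)),
    # with the q = 1 limit D_1(p) = exp( -sum_i p_i * log (Zp)_i ).
    p = np.asarray(p, dtype=float)
    Zp = Z @ p
    s = p > 0  # restrict sums to the support of p
    if q == 1:
        return float(np.exp(-np.sum(p[s] * np.log(Zp[s]))))
    return float(np.sum(p[s] * Zp[s] ** (q - 1)) ** (1.0 / (1.0 - q)))

# Three points at mutual distance 50: Z is nearly the identity, so the
# space behaves like three fully distinct points.
d = 50.0
dist = [[0, d, d], [d, 0, d], [d, d, 0]]
Z = similarity_matrix(dist)
uniform = np.ones(3) / 3

# The uniform measure gives diversity ~= 3 at every order q.
vals = [diversity(uniform, Z, q) for q in (0, 1, 2)]
```

Scaling all distances by t and letting t grow reproduces the large-scale regime mentioned in the abstract: the maximum diversity of tX tends to the number of points, 3, here for every q.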

Keywords: maximum entropy, enriched categories, size and magnitude, metric spaces.

About the Speaker:

Emily Roff is a PhD student at the University of Edinburgh, where she is a member of the Geometry and Topology group in the Hodge Institute, working with Tom Leinster. Emily's research concerns numerical and homological invariants of metric spaces that derive from an interpretation of a metric space as a type of enriched category. More generally, she is interested in enriched category theory and its applications within and beyond pure mathematics. Prior to Edinburgh, Emily completed Part III of the Mathematical Tripos at Cambridge.


Part of the ML@CL Seminar Series, which focuses on early-career researchers working on topics relevant to machine learning and statistics.




