
Forward Pass as Heat Flow


If you have a question about this talk, please contact Pietro Lio.

Strong machine learning models have demonstrated a remarkable ability to leverage the underlying geometric and topological structure of datasets. This has been observed not only in explicitly geometric domains (such as graph- or mesh-based data), but also when the structure is implicit (e.g., when the data satisfies the manifold hypothesis). In this talk, we shall explore the unifying perspective that both regimes may be understood as performing heat diffusion, intrinsic to the underlying geometry, in the model's forward pass. As examples of this philosophy, we will discuss a far-reaching generalization of the convergence results of Belkin-Niyogi that unites several geometric deep learning architectures, as well as a manifold-theoretic framework underlying the 'emergent' ability of in-context learning in large models.
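To make the diffusion perspective concrete, here is a minimal toy sketch (not the speaker's construction; the graph, features, and step size are illustrative assumptions): a single "forward pass" step realised as an Euler step of the heat equation on a graph, using the symmetric normalized Laplacian. Smoothing features along edges in this way is the basic mechanism that message-passing layers in geometric deep learning approximate.

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}
    of an adjacency matrix A (assumed symmetric, nonnegative)."""
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def heat_step(X, L, tau=0.1):
    """One explicit Euler step of the heat equation dX/dt = -L X.
    Each step diffuses node features toward their neighbours' values."""
    return X - tau * (L @ X)

# Toy 4-node path graph 0-1-2-3 with a unit of "heat" on node 0.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = normalized_laplacian(A)
X = np.array([[1.0], [0.0], [0.0], [0.0]])

for _ in range(10):  # ten diffusion steps ~ ten "layers"
    X = heat_step(X, L)
```

After a few steps the initial spike has spread along the path: the feature at node 0 has decayed below 1 and node 3 has received a small positive value, mirroring how depth in a diffusion-style network enlarges the receptive field.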

This talk is part of the Foundation AI series.


