From learning differential operators to learning algorithms

RCL - Representing, calibrating & leveraging prediction uncertainty from statistics to machine learning

Most scientific and engineering challenges can be organized along a complexity ladder. Right above interpolation lies the learning of differential operators and their solution operators, an area where Gaussian process/kernel methods come with rigorous guarantees and achieve state-of-the-art data efficiency and robustness. The talk then ascends to the ladder's current frontier: algorithm synthesis. Here we introduce a computational-language-processing framework that tokenizes low-level computational actions and uses an ensemble-based Monte Carlo Tree Search combined with reinforcement learning to assemble algorithms tailored to individual problem instances. We conclude by discussing where this ladder is taking us. The first part of this talk is based on joint work with Yasamin Jalalian, Juan Felipe Osorio Ramirez, Alexander Hsu, and Bamdad Hosseini. The second part is joint work with Theo Bourdais, Abeynaya Gnanasekaran and Tuhin Sahai.
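To give a rough sense of the first part (learning solution operators with Gaussian process/kernel methods), the sketch below fits a kernel-ridge / GP posterior-mean map from discretized source terms f to discretized solutions u. It is a minimal illustration, not the speakers' implementation: the helper names (rbf_kernel, fit_solution_operator), the squared-exponential kernel, and the median-distance lengthscale heuristic are all assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale):
    """Squared-exponential kernel between rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def fit_solution_operator(F_train, U_train, reg=1e-6):
    """Kernel ridge / GP-mean map from discretized inputs to solutions.

    F_train: (n, m) discretized source terms f.
    U_train: (n, m) corresponding discretized solutions u.
    Returns a callable mapping new inputs to predicted solutions.
    """
    # Median pairwise distance as an illustrative lengthscale choice.
    dists = np.sqrt(((F_train[:, None] - F_train[None, :]) ** 2).sum(-1))
    ls = np.median(dists[dists > 0])
    K = rbf_kernel(F_train, F_train, ls)
    alpha = np.linalg.solve(K + reg * np.eye(len(F_train)), U_train)
    return lambda F_new: rbf_kernel(F_new, F_train, ls) @ alpha

# Toy usage: learn the solution operator of -u'' = f on (0, 1), u(0) = u(1) = 0.
m, n = 64, 300
h = 1.0 / (m + 1)
x = np.linspace(h, 1 - h, m)
L = (2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / h**2  # finite differences
rng = np.random.default_rng(0)
F = rng.normal(size=(n, m)) @ rbf_kernel(x[:, None], x[:, None], 0.1)  # smooth f's
U = np.linalg.solve(L, F.T).T
predict = fit_solution_operator(F[:250], U[:250])
print("max abs test error:", np.abs(predict(F[250:]) - U[250:]).max())
```

The second part of the talk (tokenizing computational actions and assembling algorithms with ensemble-based Monte Carlo Tree Search and reinforcement learning) is not illustrated here, as the abstract does not specify the action vocabulary or search details.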

This talk is part of the Isaac Newton Institute Seminar Series.
