Constrained Neural Flows

If you have a question about this talk, please contact Pietro Lio.

https://meet.google.com/cwu-sumr-qzq

Modelling dynamical systems with explicit constraints has become a key focus within the machine learning community, particularly in time series analysis. Neural Ordinary Differential Equations (Neural ODEs) have emerged as a popular approach for modelling continuous-time data. However, the standard Neural ODE framework struggles to enforce the explicit constraints that are critical to many real-world applications. To address this, Stabilized Neural Differential Equations (SNDEs) were developed to incorporate these constraints by modifying the original dynamics. Despite these advancements, SNDEs inherit several limitations from Neural ODEs, including challenges with stiff systems, dependence on hyperparameter-sensitive numerical solvers, and inefficiencies in solving the adjoint system for gradient computation. Recently, Neural Flows have been introduced as an alternative, bypassing the need for solvers by directly learning the flow of the underlying system, which simplifies training and inference. In this work, we extend this concept by learning the flow of a constrained dynamical system. Specifically, we split the constrained ODEs, as formulated in SNDEs, and employ techniques such as Lie-Trotter splitting to combine the flows of the individual ODEs effectively. This approach maintains the benefits of constraint-aware learning while mitigating the solver-related challenges faced by traditional Neural ODEs and SNDEs.
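
As a rough illustration of the splitting idea (a sketch in LaTeX notation; the symbols $f_\theta$, $s$, and $\Phi_h$ are assumed here, not taken from the talk): write the stabilized dynamics as the sum of a learned vector field and a constraint-stabilization term, then approximate the flow of the sum by composing the flows of the parts, as Lie-Trotter splitting does:

$\dot{x} = f_\theta(x) + s(x), \qquad \Phi_h^{f_\theta + s} \;\approx\; \Phi_h^{s} \circ \Phi_h^{f_\theta} + \mathcal{O}(h^2),$

where $\Phi_h^{v}$ denotes the time-$h$ flow of the vector field $v$. If each component flow is learned directly, as in Neural Flows, one step of the constrained system reduces to composing two network evaluations with no numerical solver in the loop; the $\mathcal{O}(h^2)$ local error is the standard first-order accuracy of Lie-Trotter splitting.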

This talk is part of the Foundation AI series.
