
Neural Ordinary Differential Equations


If you have a question about this talk, please contact jg801.

Residual connections have enabled state-of-the-art performance on ImageNet and the training of extremely deep neural networks (1000+ layers). They can be thought of as modeling discrete-time changes in the hidden units, i.e. h_{l+1} - h_l = F(h_l, W). Neural ordinary differential equations (Neural ODEs) consider a continuous-time counterpart of ResNets: a neural network parametrizes the instantaneous change dh/dt = F(h(t), t), where time t now plays a role akin to network depth. How is such a network useful? How would model fitting and optimization be done? In this meeting of the reading group, we will explain the Neural ODE model and training algorithm proposed by Chen et al. (NeurIPS 2018, Oral). Additionally, we will give overviews of their applications to time series and generative modeling.
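The ResNet/ODE correspondence can be made concrete with a fixed-step Euler integrator: each Euler step h <- h + dt * F(h, t) has exactly the form of a residual update h_{l+1} = h_l + F(h_l, W), with the step size absorbed into F. Below is a minimal sketch of this idea; the dynamics function `f` (a single tanh layer with a hypothetical weight matrix `W`) and the step counts are illustrative choices, not the architecture from Chen et al.

```python
import numpy as np

def f(h, t, W):
    # Hypothetical dynamics F(h, t): a single tanh layer for illustration.
    # (Here it happens not to depend on t; a real model often concatenates t.)
    return np.tanh(W @ h)

def euler_odeint(h0, W, t0=0.0, t1=1.0, n_steps=10):
    """Integrate dh/dt = f(h, t) from t0 to t1 with fixed-step Euler.

    Each step h <- h + dt * f(h, t) is a residual update, so a ResNet with
    n_steps blocks can be read as an Euler discretization of this ODE.
    """
    h, t = h0, t0
    dt = (t1 - t0) / n_steps
    for _ in range(n_steps):
        h = h + dt * f(h, t, W)
        t += dt
    return h

rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((4, 4))   # small weights for mild dynamics
h0 = rng.standard_normal(4)             # initial hidden state ("input layer")
h1 = euler_odeint(h0, W)                # final hidden state ("output layer")
```

Taking n_steps larger corresponds to a deeper (but weight-tied) ResNet; in the limit one recovers the continuous flow, which is why Neural ODEs swap the fixed stack of layers for a black-box ODE solver with adaptive step size.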

This talk is part of the Machine Learning Reading Group @ CUED series.



© 2006-2019 Talks.cam, University of Cambridge.