Recurrent Neural Networks inspired by ODEs
Siddhartha Mishra (ETH Zurich), T. Konstantin Rusch
Recurrent neural networks (RNNs) are the neural network architecture of choice for learning tasks involving time series inputs and outputs, such as natural language processing, speech recognition and time series analysis. It is very challenging to design RNNs that can learn tasks with long-term dependencies while still being expressive, i.e., able to process complex inputs and outputs. We present a suite of RNNs that are inspired by dynamical systems, namely CoRNN (based on nonlinear coupled oscillators), UniCORNN (based on a Hamiltonian system) and LEM (based on multi-scale ODEs). These architectures provably mitigate the exploding and vanishing gradient problem, enabling them to deal with tasks with very long-term dependencies, and they provide state-of-the-art performance on a wide variety of learning tasks.
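To make the oscillator-based idea concrete, the following is a minimal sketch of a CoRNN-style recurrent update: a second-order ODE describing damped, driven, nonlinearly coupled oscillators, discretized with a simple time-stepping scheme. The weight names (`W`, `Wz`, `V`, `b`) and the hyperparameter values are illustrative assumptions, not the authors' exact parameterization.

```python
import numpy as np

def cornn_step(y, z, u, W, Wz, V, b, dt=0.01, gamma=1.0, eps=1.0):
    """One step of a CoRNN-style update (sketch).
    y: hidden state (oscillator positions), z: its time derivative
    (velocities), u: input at the current time step.
    gamma controls the restoring force, eps the damping."""
    # velocity update: nonlinear drive minus restoring and damping terms
    z_new = z + dt * (np.tanh(W @ y + Wz @ z + V @ u + b)
                      - gamma * y - eps * z)
    # position update uses the freshly computed velocity
    y_new = y + dt * z_new
    return y_new, z_new

# toy run over a short input sequence
rng = np.random.default_rng(0)
d, m, T = 4, 2, 50          # hidden size, input size, sequence length
W = rng.normal(size=(d, d)) / d
Wz = rng.normal(size=(d, d)) / d
V = rng.normal(size=(d, m))
b = np.zeros(d)
y, z = np.zeros(d), np.zeros(d)
for _ in range(T):
    y, z = cornn_step(y, z, rng.normal(size=m), W, Wz, V, b)
print(y.shape)
```

Because the nonlinearity is bounded and the update includes damping, the hidden states stay controlled over long sequences; this is the mechanism behind the gradient bounds mentioned above.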