no code implementations • 31 Oct 2019 • Yiwei Fu, Samer Saab Jr, Asok Ray, Michael Hauser
This work proposes a novel neural network architecture, called the Dynamically Controlled Recurrent Neural Network (DCRNN), specifically designed to model dynamical systems governed by ordinary differential equations (ODEs).
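The abstract does not spell out the DCRNN's internals, so the following is only a minimal sketch of the general idea of an ODE-driven recurrent cell: the hidden state is advanced by an explicit Euler step of a learned vector field. All names here (EulerRNNCell, hidden_dim, the step size dt) are hypothetical illustrations, not the published architecture.

import numpy as np

class EulerRNNCell:
    # Hypothetical sketch of a recurrent cell whose hidden state follows an
    # explicit-Euler step of a learned ODE, dh/dt = f(h, x). This is NOT the
    # published DCRNN; it only illustrates building a recurrent update
    # around an ODE integrator.
    def __init__(self, input_dim, hidden_dim, dt=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
        self.W_x = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
        self.b = np.zeros(hidden_dim)
        self.dt = dt  # integration step size

    def f(self, h, x):
        # Learned vector field approximating the system dynamics.
        return np.tanh(self.W_h @ h + self.W_x @ x + self.b)

    def step(self, h, x):
        # One Euler step: h_{t+1} = h_t + dt * f(h_t, x_t).
        return h + self.dt * self.f(h, x)

Unrolling step over an input sequence numerically integrates the learned vector field, which is what lets a cell of this kind track an ODE-governed system between observations.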
no code implementations • 26 Jul 2019 • Michael Hauser
This study develops an unsupervised learning algorithm for products of expert capsules with dynamic routing.
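The abstract does not state the objective, but the product-of-experts form itself is standard (Hinton, 2002); assuming the paper follows it, the model density is a normalized product of per-expert densities, with each expert here realized as a capsule:

p(\mathbf{x}) \;=\; \frac{1}{Z}\prod_{j} p_j(\mathbf{x}),
\qquad
Z \;=\; \int \prod_{j} p_j(\mathbf{x}')\,\mathrm{d}\mathbf{x}'.

Dynamic routing then determines how strongly each expert capsule participates in the product for a given input.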
no code implementations • 26 Jul 2019 • Michael Hauser
Capsules are the multidimensional analogue of scalar neurons in neural networks; because each capsule carries a vector rather than a single scalar, far richer routing schemes can be used to pass information forward through the network than are possible in traditional neural networks.
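As a concrete reference point, the sketch below implements routing-by-agreement in the style of Sabour et al. (2017), the best-known dynamic routing scheme; the routing used in this paper may differ, and the array names and iteration count are illustrative only.

import numpy as np

def squash(s, axis=-1, eps=1e-9):
    # Capsule nonlinearity: preserves direction, maps the norm into [0, 1).
    norm2 = np.sum(s * s, axis=axis, keepdims=True)
    return (norm2 / (1.0 + norm2)) * s / np.sqrt(norm2 + eps)

def dynamic_routing(u_hat, num_iters=3):
    # u_hat: predictions from lower capsules, shape (n_lower, n_upper, dim).
    n_lower, n_upper, _ = u_hat.shape
    b = np.zeros((n_lower, n_upper))                # routing logits
    for _ in range(num_iters):
        e = np.exp(b - b.max(axis=1, keepdims=True))
        c = e / e.sum(axis=1, keepdims=True)        # coupling coefficients
        s = (c[..., None] * u_hat).sum(axis=0)      # weighted vote sum
        v = squash(s)                               # upper capsule outputs
        b += (u_hat * v[None, :, :]).sum(axis=-1)   # reward agreement
    return v

Because votes and outputs are vectors, the agreement term (the dot product of u_hat with v) carries orientation information that a scalar neuron cannot express, which is the richer routing the abstract refers to.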
no code implementations • WS 2019 • Michael Hauser, Evangelos Sariyanidi, Birkan Tunc, Casey Zampella, Edward Brodkin, Robert Schultz, Julia Parish-Morris
Spoken language ability is highly heterogeneous in Autism Spectrum Disorder (ASD), which complicates efforts to identify linguistic markers for use in diagnostic classification, clinical characterization, and the measurement of research and clinical outcomes.
no code implementations • 11 Feb 2019 • Michael Hauser
It is found that, in this setting, the average scaled perturbation magnitude is roughly inversely proportional to the number of residual blocks; it follows that sufficiently deep residual networks learn a perturbation from the identity.
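To make the quantity concrete: the scaled perturbation magnitude of a residual block x <- x + f(x) is ||f(x)|| / ||x||. The toy sketch below builds the 1/L weight scaling in by hand to show what the reported behaviour looks like; it illustrates the measurement itself, not the paper's trained-network experiments.

import numpy as np

def residual_forward(x, weights):
    # Push x through residual blocks x <- x + f(x), recording
    # the scaled perturbation magnitude ||f(x)|| / ||x|| at each block.
    ratios = []
    for W in weights:
        f_x = np.tanh(W @ x)          # one hypothetical residual branch
        ratios.append(np.linalg.norm(f_x) / np.linalg.norm(x))
        x = x + f_x
    return x, ratios

rng = np.random.default_rng(0)
d = 64
for L in (4, 16, 64):
    # Branch weights scaled like 1/L, mimicking what trained deep residual
    # networks are reported to learn.
    weights = [rng.normal(scale=1.0 / L, size=(d, d)) for _ in range(L)]
    _, ratios = residual_forward(rng.normal(size=d), weights)
    print(L, np.mean(ratios))        # mean ratio shrinks roughly like 1/L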
no code implementations • 11 Jun 2018 • Michael Hauser, Sean Gunn, Samer Saab Jr, Asok Ray
This paper deals with neural networks as dynamical systems governed by differential or difference equations.
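The standard correspondence behind this view (a common reading, not necessarily the paper's exact formulation) is that a residual block is a forward-Euler discretization of an ordinary differential equation: with step size h,

x_{n+1} \;=\; x_n + h\, f_n(x_n)
\quad\xrightarrow[h \to 0]{}\quad
\frac{\mathrm{d}x(t)}{\mathrm{d}t} \;=\; f\big(x(t), t\big),

so deeper stacks of blocks correspond to finer discretizations of the same continuous-time system.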
no code implementations • NeurIPS 2017 • Michael Hauser, Asok Ray
This implies that the network is learning systems of differential equations governing the coordinate transformations that represent the data.
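Under the same reading, composing the layerwise maps gives the coordinate transformation the abstract refers to; in the continuum limit the learned representation is the flow of a system of ODEs (a hedged restatement of the claim, with h a hypothetical step size):

\Phi \;=\; \phi_{L-1} \circ \cdots \circ \phi_0,
\qquad \phi_n(x) \;=\; x + h\, f_n(x),
\qquad
x(T) \;=\; x(0) + \int_0^{T} f\big(x(t), t\big)\,\mathrm{d}t.

Learning the layer weights is then equivalent to learning the right-hand sides f_n, i.e., the differential equations that govern how the data's coordinates are transformed.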