Search Results for author: Michael Hauser

Found 7 papers, 0 papers with code

A Dynamically Controlled Recurrent Neural Network for Modeling Dynamical Systems

no code implementations • 31 Oct 2019 • Yiwei Fu, Samer Saab Jr, Asok Ray, Michael Hauser

This work proposes a novel neural network architecture, called the Dynamically Controlled Recurrent Neural Network (DCRNN), specifically designed to model dynamical systems that are governed by ordinary differential equations (ODEs).
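The paper's DCRNN architecture is not detailed in this snippet, but the general idea of a recurrent cell that advances an ODE state can be sketched as follows (a minimal illustration with a hypothetical toy vector field, not the paper's model):

```python
import numpy as np

# Sketch only: a recurrent cell whose step x_{t+1} = x_t + dt * f(x_t, u_t)
# advances the state of an ODE dx/dt = f(x, u) by one forward-Euler step
# per input. The vector field f is a small untrained MLP with hypothetical
# weights W1, W2 (assumptions, not from the paper).

rng = np.random.default_rng(0)
dim, hidden = 2, 16
W1 = rng.normal(scale=0.1, size=(dim + 1, hidden))  # state + scalar control
W2 = rng.normal(scale=0.1, size=(hidden, dim))

def f(x, u):
    """Learned vector field f(x, u); here a toy MLP."""
    z = np.concatenate([x, [u]])
    return np.tanh(z @ W1) @ W2

def rnn_step(x, u, dt=0.1):
    """One recurrent step = one Euler step of the modeled ODE."""
    return x + dt * f(x, u)

# Roll the cell over a control sequence, as an RNN rolls over a time series.
x = np.zeros(dim)
for u in [0.5, -0.2, 0.1]:
    x = rnn_step(x, u)
```

In this view the hidden state of the RNN plays the role of the system state, and the step size `dt` controls the temporal resolution of the modeled dynamics.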

Training products of expert capsules with mixing by dynamic routing

no code implementations • 26 Jul 2019 • Michael Hauser

This study develops an unsupervised learning algorithm for products of expert capsules with dynamic routing.

Training capsules as a routing-weighted product of expert neurons

no code implementations • 26 Jul 2019 • Michael Hauser

Capsules are the multidimensional analogue of scalar neurons in neural networks, and because they are multidimensional, far richer routing schemes can be used to pass information forward through the network than in traditional neural networks.
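The paper's routing-weighted product-of-experts formulation is not given in this snippet; as background, the standard routing-by-agreement scheme for capsules (Sabour et al., 2017) can be sketched as:

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    # Standard capsule nonlinearity: shrinks short vectors toward zero,
    # keeps long vectors just under unit length.
    n2 = np.sum(v * v, axis=axis, keepdims=True)
    return (n2 / (1.0 + n2)) * v / np.sqrt(n2 + eps)

def dynamic_routing(u_hat, iters=3):
    """Routing-by-agreement sketch (not the paper's variant).

    u_hat: (n_in, n_out, d) prediction vectors from lower capsules.
    Returns (n_out, d) output capsule vectors."""
    n_in, n_out, _ = u_hat.shape
    b = np.zeros((n_in, n_out))                               # routing logits
    for _ in range(iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # softmax over outputs
        s = np.einsum('io,iod->od', c, u_hat)                 # routed weighted sum
        v = squash(s)                                         # output capsules
        b = b + np.einsum('iod,od->io', u_hat, v)             # agreement update
    return v

rng = np.random.default_rng(0)
v = dynamic_routing(rng.normal(size=(8, 3, 4)))
```

The routing coefficients are updated by the agreement (dot product) between each prediction and the resulting output capsule, which is the "more complex routing scheme" that multidimensional capsules make possible.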

Using natural conversations to classify autism with limited data: Age matters

no code implementations • WS 2019 • Michael Hauser, Evangelos Sariyanidi, Birkan Tunc, Casey Zampella, Edward Brodkin, Robert Schultz, Julia Parish-Morris

Spoken language ability is highly heterogeneous in Autism Spectrum Disorder (ASD), which complicates efforts to identify linguistic markers for use in diagnostic classification, clinical characterization, and research and clinical outcome measurement.

Task: BIG-bench Machine Learning

On Residual Networks Learning a Perturbation from Identity

no code implementations • 11 Feb 2019 • Michael Hauser

It is found that, in this setting, the average scaled perturbation magnitude is roughly inversely proportional to the number of residual blocks, from which it follows that sufficiently deep residual networks are learning a perturbation from the identity.
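The statistic in question can be computed for any residual network. A minimal sketch with random (untrained) toy blocks is below; note that the inverse-depth trend the paper reports is a property of trained networks and will not appear with random weights:

```python
import numpy as np

# Sketch: for each residual block x -> x + F(x), record the scaled
# perturbation magnitude ||F(x)|| / ||x||, then average over blocks.
# Blocks here are hypothetical single-layer tanh branches, an assumption
# made only to keep the example self-contained.

rng = np.random.default_rng(0)

def residual_net(x, n_blocks, width):
    mags = []
    for _ in range(n_blocks):
        W = rng.normal(scale=1.0 / np.sqrt(width), size=(width, width))
        Fx = np.tanh(x @ W)                                   # residual branch F(x)
        mags.append(np.linalg.norm(Fx) / np.linalg.norm(x))   # scaled perturbation
        x = x + Fx
    return x, float(np.mean(mags))

x0 = rng.normal(size=64)
_, avg_mag = residual_net(x0.copy(), n_blocks=10, width=64)
```

A small average scaled perturbation magnitude is what justifies reading each block as a near-identity map.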

State Space Representations of Deep Neural Networks

no code implementations • 11 Jun 2018 • Michael Hauser, Sean Gunn, Samer Saab Jr, Asok Ray

This paper deals with neural networks as dynamical systems governed by differential or difference equations.
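The dynamical-systems correspondence can be illustrated with a toy example (a sketch, not the paper's construction): a residual block x_{t+1} = x_t + h·f(x_t) is exactly a forward-Euler step of the ODE dx/dt = f(x), so with a known field f(x) = -x the network's forward pass should track the exact solution x(t) = x_0·exp(-t):

```python
import numpy as np

def f(x):
    # Known vector field for illustration: dx/dt = -x
    return -x

def forward_pass(x0, n_layers, h):
    x = x0
    for _ in range(n_layers):   # each "layer" is one Euler step of the ODE
        x = x + h * f(x)
    return x

x0 = 1.0
x_net = forward_pass(x0, n_layers=100, h=0.01)  # integrates to t = 1
x_exact = x0 * np.exp(-1.0)
```

Under this reading, depth plays the role of integration time and the skip connection supplies the identity term of the difference equation.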

Principles of Riemannian Geometry in Neural Networks

no code implementations • NeurIPS 2017 • Michael Hauser, Asok Ray

This implies that the network is learning systems of differential equations governing the coordinate transformations that represent the data.
