no code implementations • 30 May 2023 • Devdhar Patel, Terrence Sejnowski, Hava Siegelmann
We present a temporally layered architecture (TLA) for temporally adaptive control with minimal energy expenditure.
no code implementations • 1 Feb 2023 • Roy Siegelmann, Hava Siegelmann
Trisomy, a form of aneuploidy wherein the cell possesses an additional copy of a specific chromosome, exhibits a high correlation with cancer.
no code implementations • 17 Jan 2023 • Zhongyang Zhang, Kaidong Chai, Haowen Yu, Ramzi Majaj, Francesca Walsh, Edward Wang, Upal Mahbub, Hava Siegelmann, Donghyun Kim, Tauhidur Rahman
Dancing, a sport beloved worldwide, is increasingly being integrated into both traditional and virtual-reality gaming platforms.
no code implementations • 25 Dec 2022 • Devdhar Patel, Hava Siegelmann
However, early exits increase the training time of the neural networks.
no code implementations • 25 Dec 2022 • Devdhar Patel, Joshua Russell, Francesca Walsh, Tauhidur Rahman, Terrence Sejnowski, Hava Siegelmann
Our design is biologically inspired and draws on the architecture of the human brain, which executes actions at different timescales depending on the environment's demands.
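One way to picture acting at multiple timescales is a policy that outputs both an action and how many environment steps to hold it, so undemanding stretches cost fewer decisions. This is only a hypothetical sketch of that idea; the paper's actual mechanism may differ, and the `MultiTimescaleAgent` class and toy environment here are invented for illustration.

```python
import numpy as np

# Hypothetical sketch: the policy returns (action, hold_steps), and the agent
# repeats the chosen action for hold_steps environment steps before deciding again.
class MultiTimescaleAgent:
    def __init__(self, policy):
        self.policy = policy  # maps observation -> (action, hold_steps)

    def run(self, env_step, obs, horizon):
        decisions = 0
        t = 0
        while t < horizon:
            action, hold = self.policy(obs)
            decisions += 1
            # Hold the action for several steps (clipped at the horizon).
            for _ in range(min(hold, horizon - t)):
                obs = env_step(action)
                t += 1
        return decisions

# Toy run: a policy that always holds each action for 4 steps needs only
# 5 decisions over a 20-step horizon instead of 20.
agent = MultiTimescaleAgent(lambda obs: (0, 4))
n = agent.run(lambda a: None, obs=0, horizon=20)
print(n)  # 5
```

Fewer decision points is one route to the "minimal energy expenditure" goal mentioned in the TLA entry above, since the expensive policy computation runs less often.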
no code implementations • 13 Dec 2022 • Adam Kohan, Ed Rietman, Hava Siegelmann
In artificial neural networks, weights are a static representation of synapses.
no code implementations • 15 Feb 2022 • Hananel Hazan, Simon Caby, Christopher Earl, Hava Siegelmann, Michael Levin
A common view in the neuroscience community is that memory is encoded in the connection strength between neurons.
no code implementations • NeurIPS 2021 • Stephen Chung, Hava Siegelmann
Previous works have proved that recurrent neural networks (RNNs) are Turing-complete.
no code implementations • The IEEE Winter Conference on Applications of Computer Vision (WACV), 2020 • Prakhar Kaushik, Alex Gain, Hava Siegelmann
We propose Adaptive Neural Connections (ANC), a method for explicitly parameterizing fine-grained neuron-to-neuron connections via adjacency matrices at each layer that are learned through backpropagation.
Ranked #1 on Sparse Learning on ImageNet32
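The ANC description above (fine-grained neuron-to-neuron connections parameterized by per-layer adjacency matrices learned through backpropagation) can be sketched minimally as a layer whose effective weights are the elementwise product of ordinary weights and a learnable adjacency matrix. This is an illustrative sketch, not the paper's implementation; the `ANCLayer` class and initialization are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class ANCLayer:
    """Dense layer with a learnable adjacency matrix gating each connection."""

    def __init__(self, n_in, n_out):
        self.W = rng.standard_normal((n_in, n_out)) * 0.1  # ordinary weights
        self.A = np.ones((n_in, n_out))  # adjacency, initialized fully connected

    def forward(self, x):
        # Effective weight is W * A elementwise, so gradients with respect to A
        # can strengthen or prune individual neuron-to-neuron connections.
        return x @ (self.W * self.A)

layer = ANCLayer(4, 3)
y = layer.forward(np.ones((2, 4)))
print(y.shape)  # (2, 3)
```

In a full training loop both `W` and `A` would receive gradients; sparsity (as in the ImageNet32 sparse-learning result) would come from driving entries of `A` toward zero.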
no code implementations • 4 Jun 2019 • Hananel Hazan, Daniel J. Saunders, Darpan T. Sanghavi, Hava Siegelmann, Robert Kozma
Spiking neural networks (SNNs) with a lattice architecture are introduced in this work, combining several desirable properties of SNNs and self-organized maps (SOMs).
no code implementations • ICML 2020 • Alex Gain, Hava Siegelmann
A longstanding problem for Deep Neural Networks (DNNs) is understanding their puzzling ability to generalize well.
3 code implementations • 26 Mar 2019 • Devdhar Patel, Hananel Hazan, Daniel J. Saunders, Hava Siegelmann, Robert Kozma
Previous studies in the image classification domain demonstrated that standard NNs (with ReLU nonlinearity) trained using supervised learning can be converted to SNNs with negligible deterioration in performance.
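The conversion idea can be illustrated with rate coding: replace each ReLU unit with an integrate-and-fire neuron and simulate for many timesteps, so that the empirical firing rate approximates the analog activation. This is a generic sketch of rate-based conversion under assumed parameters (threshold, timestep count), not the specific procedure used in the paper.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def if_layer_rate(x, W, T=1000, threshold=1.0):
    """Simulate an integrate-and-fire layer; return empirical firing rates."""
    v = np.zeros(W.shape[1])      # membrane potentials
    spikes = np.zeros(W.shape[1])
    for _ in range(T):
        v += x @ W                # constant input current each timestep
        fired = v >= threshold
        spikes += fired
        v[fired] -= threshold     # reset-by-subtraction preserves rate coding
    return spikes / T * threshold

rng = np.random.default_rng(0)
W = rng.uniform(0, 0.01, size=(5, 3))
x = rng.uniform(0, 1, size=5)
print(relu(x @ W))         # analog ReLU activations
print(if_layer_rate(x, W)) # spiking rates approximate them
```

The approximation error shrinks as the simulation length `T` grows, which is the basic reason conversion can preserve accuracy with enough timesteps.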
no code implementations • 21 Oct 2017 • Mark Shifrin, Hava Siegelmann
We model an individual T2DM patient's blood glucose level (BGL) by a stochastic process with a discrete number of states, governed mainly but not solely by the medication regimen (e.g., insulin injections).
Model-based Reinforcement Learning • Reinforcement Learning (RL)
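A discrete-state stochastic process whose dynamics depend on medication can be sketched as a Markov chain with two transition matrices, one for steps with insulin and one without. The states and probabilities below are entirely made up for illustration; they are not from the paper.

```python
import numpy as np

# Hypothetical three-state BGL chain; transition matrix depends on insulin.
STATES = ["low", "normal", "high"]

P_no_insulin = np.array([[0.7, 0.3, 0.0],
                         [0.1, 0.6, 0.3],
                         [0.0, 0.2, 0.8]])
P_insulin    = np.array([[0.9, 0.1, 0.0],
                         [0.4, 0.5, 0.1],
                         [0.1, 0.5, 0.4]])

def simulate(steps, insulin_policy, seed=0):
    rng = np.random.default_rng(seed)
    s = 1  # start in "normal"
    trajectory = [STATES[s]]
    for t in range(steps):
        P = P_insulin if insulin_policy(s, t) else P_no_insulin
        s = rng.choice(3, p=P[s])
        trajectory.append(STATES[s])
    return trajectory

# Simple policy: inject insulin whenever BGL is high.
traj = simulate(10, lambda s, t: s == 2)
print(traj)
```

In an RL framing (matching the task tags above), the insulin decision becomes the action and the transition matrices the learned or given model.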
1 code implementation • NeurIPS 2000 • Asa Ben-Hur, David Horn, Hava Siegelmann, Vladimir Vapnik
Data points enclosed by each contour are defined as a cluster.
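The support vector clustering idea (map data to a kernel feature space, enclose it in a minimal sphere, and read clusters off the resulting contours in data space) can be approximated with scikit-learn's `OneClassSVM` as the contour estimator: two points belong to the same cluster if the segment between them never leaves a contour. The hyperparameters and the segment-sampling check below are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (30, 2)),   # cluster A near the origin
               rng.normal(3, 0.1, (30, 2))])  # cluster B near (3, 3)

# RBF one-class SVM: decision_function >= 0 roughly marks the inside of a contour.
ocs = OneClassSVM(kernel="rbf", gamma=2.0, nu=0.05).fit(X)

def same_cluster(a, b, n=20):
    """Two points share a cluster if the segment between them stays inside a contour."""
    pts = np.linspace(a, b, n)
    return bool(np.all(ocs.decision_function(pts) >= 0))

mean_a, mean_b = X[:30].mean(axis=0), X[30:].mean(axis=0)
print(same_cluster(mean_a, mean_a + 0.05))  # within cluster A
print(same_cluster(mean_a, mean_b))         # segment crosses the empty gap
```

Sampling along segments is the standard trick for turning enclosing contours into cluster labels: the contour itself is implicit, so membership is tested pointwise.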