1 code implementation • NeurIPS 2019 • Georgios Detorakis, Sourav Dutta, Abhishek Khanna, Matthew Jerry, Suman Datta, Emre Neftci
Multiplicative stochasticity such as Dropout improves the robustness and generalizability of deep neural networks.
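The multiplicative stochasticity mentioned here can be illustrated with standard inverted dropout, which multiplies activations by a rescaled Bernoulli mask. This is a generic sketch of the well-known technique, not the paper's specific method:

```python
import numpy as np

def multiplicative_dropout(x, p=0.5, rng=None):
    """Inverted dropout: multiply activations by a Bernoulli mask,
    scaled by 1/(1-p) so the expected activation is unchanged."""
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= p          # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

x = np.ones(10000)
y = multiplicative_dropout(x, p=0.5, rng=np.random.default_rng(0))
# roughly half the entries are zeroed; the mean stays near 1.0
```

Because the mask is multiplicative rather than additive, units that survive are rescaled, which keeps the layer's expected output constant between training and inference.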
no code implementations • 12 Aug 2019 • Insik Yoon, Matthew Jerry, Suman Datta, Arijit Raychowdhury
In this letter, we quantify the impact of device limitations on the classification accuracy of an artificial neural network, where the synaptic weights are implemented in a Ferroelectric FET (FeFET) based in-memory processing architecture.
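One common device limitation in such in-memory architectures is a finite number of programmable conductance levels. A minimal sketch of how that effect can be quantified is to quantize trained weights onto a small set of levels (with optional device variation) and re-evaluate accuracy; the model below is hypothetical and not the letter's specific FeFET model:

```python
import numpy as np

def quantize_weights(w, n_levels=32, sigma=0.0, rng=None):
    """Map weights onto n_levels evenly spaced conductance states.
    sigma adds Gaussian device-to-device variation, expressed as a
    fraction of the level spacing (hypothetical noise model)."""
    rng = np.random.default_rng() if rng is None else rng
    lo, hi = w.min(), w.max()
    step = (hi - lo) / (n_levels - 1)
    q = np.round((w - lo) / step) * step + lo
    if sigma > 0:
        q = q + rng.normal(0.0, sigma * step, size=w.shape)
    return q

w = np.random.default_rng(1).normal(size=1000)
wq = quantize_weights(w, n_levels=8)
# wq takes at most 8 distinct values; max error is half a level spacing
```

Sweeping `n_levels` and `sigma` while re-running inference gives an accuracy-vs-precision curve, which is the kind of trade-off such a study characterizes.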
no code implementations • 16 Aug 2017 • Abhinav Parihar, Matthew Jerry, Suman Datta, Arijit Raychowdhury
This motivates the present work, in which we demonstrate a stochastic neuron using an insulator-metal-transition (IMT) device, based on an electrically induced phase transition, in series with a tunable resistance.
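A toy behavioral model of such a stochastic neuron treats the IMT phase transition as a Bernoulli spike whose probability follows a sigmoid of the input, with the series resistance tuning the slope of the firing curve. The parameterization below is an illustrative assumption, not the device physics from the article:

```python
import numpy as np

def stochastic_neuron(u, r_s=1.0, beta=4.0, rng=None):
    """Toy stochastic neuron: spike with probability sigmoid(beta*u/r_s).
    r_s plays the role of the tunable series resistance, flattening or
    sharpening the firing curve (hypothetical behavioral model)."""
    rng = np.random.default_rng() if rng is None else rng
    p = 1.0 / (1.0 + np.exp(-beta * np.asarray(u, dtype=float) / r_s))
    spikes = (rng.random(np.shape(u)) < p).astype(int)
    return spikes, p

u = np.zeros(10000)
spikes, p = stochastic_neuron(u, rng=np.random.default_rng(0))
# at u = 0 the firing probability is exactly 0.5, so about half the trials spike
```

Increasing `r_s` makes the sigmoid shallower, so the same input produces a noisier, more graded firing probability, which is the sense in which the series resistance "tunes" the stochasticity.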