Search Results for author: Shalini Shrivastava

Found 2 papers, 0 papers with code

Non-Ideal Program-Time Conservation in Charge Trap Flash for Deep Learning

no code implementations • 12 Jul 2023 • Shalini Shrivastava, Vivek Saraswat, Gayatri Dash, Samyak Chakrabarty, Udayan Ganguly

Training deep neural networks (DNNs) is computationally intensive, but arrays of non-volatile memories like Charge Trap Flash (CTF) can accelerate DNN operations using in-memory computing.

Blocking

Software-Level Accuracy Using Stochastic Computing With Charge-Trap-Flash Based Weight Matrix

no code implementations • 9 Mar 2020 • Varun Bhatt, Shalini Shrivastava, Tanmay Chavan, Udayan Ganguly

The in-memory computing paradigm with emerging memory devices has recently been shown to be a promising way to accelerate deep learning.

Q-Learning
