no code implementations • 24 Apr 2024 • Osama Yousuf, Brian Hoskins, Karthick Ramu, Mitchell Fream, William A. Borders, Advait Madhavan, Matthew W. Daniels, Andrew Dienstfrey, Jabez J. McClelland, Martin Lueker-Boden, Gina C. Adam
Results demonstrate that, by trading additional devices for redundant layer mappings, layer ensemble averaging can reliably restore the performance of defective memristive networks to the software baseline.
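The device-count trade-off above can be illustrated with a toy simulation: map the same layer weights onto several independently defective device arrays and average their outputs. The stuck-at-zero defect model, weight values, and defect rate below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def map_with_defects(W, p_stuck=0.1, rng=None):
    """Hypothetical defect model: each device is stuck at zero
    conductance (stuck-off) with probability p_stuck."""
    rng = rng or np.random.default_rng()
    mask = rng.random(W.shape) >= p_stuck
    return W * mask

def ensemble_layer_output(W, x, n_copies=5, p_stuck=0.1, seed=0):
    """Layer ensemble averaging sketch: average the outputs of n_copies
    independent defective mappings of the same layer weights."""
    rng = np.random.default_rng(seed)
    outs = [map_with_defects(W, p_stuck, rng) @ x for _ in range(n_copies)]
    return np.mean(outs, axis=0)
```

Averaging leaves the defect-induced bias unchanged but shrinks the output variance by roughly the ensemble size, which is why extra device copies buy back accuracy.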
no code implementations • 5 Mar 2023 • Adam N. McCaughan, Bakhrom G. Oripov, Natesh Ganesh, Sae Woo Nam, Andrew Dienstfrey, Sonia M. Buckley
We present multiplexed gradient descent (MGD), a gradient descent framework designed to easily train analog or digital neural networks in hardware.
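MGD belongs to the family of perturbative training methods: all parameters are perturbed at once and the resulting cost change is correlated with each perturbation to estimate the gradient, with no backpropagation through the hardware. A minimal SPSA-style sketch of that idea follows; the function name, hyperparameters, and quadratic test cost are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def perturbative_gd_step(cost_fn, theta, lr=0.05, eps=1e-3, n_avg=8, rng=None):
    """One simultaneous-perturbation gradient step (SPSA-style sketch).

    Every parameter is perturbed at once by a random +/-eps sign; the
    measured cost difference, correlated with each sign, yields a
    stochastic gradient estimate without backpropagation.
    """
    rng = rng or np.random.default_rng()
    g = np.zeros_like(theta)
    for _ in range(n_avg):
        s = rng.choice([-1.0, 1.0], size=theta.shape)
        dc = cost_fn(theta + eps * s) - cost_fn(theta - eps * s)
        g += dc / (2.0 * eps) * s  # unbiased estimate of the gradient
    return theta - lr * g / n_avg
```

Because only cost evaluations are needed, the same loop can drive analog hardware where explicit gradients are unavailable.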
no code implementations • 29 Nov 2022 • Osama Yousuf, Imtiaz Hossen, Matthew W. Daniels, Martin Lueker-Boden, Andrew Dienstfrey, Gina C. Adam
Data-driven approaches such as jump tables are promising techniques for modeling populations of resistive random-access memory (ReRAM) and other emerging memory devices in hardware neural network simulations.
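A jump table can be sketched as a lookup from the current conductance state to an empirical distribution of measured conductance changes, from which stochastic updates are sampled during simulation. The class below is a toy illustration with hypothetical binning and synthetic measurements, not the paper's model.

```python
import numpy as np

class JumpTable:
    """Toy jump-table device model: store measured conductance changes
    (delta-G) per conductance bin, then sample updates from them."""

    def __init__(self, g_bins, rng=None):
        self.g_bins = np.asarray(g_bins)
        self.table = {i: [] for i in range(len(g_bins) - 1)}
        self.rng = rng or np.random.default_rng(0)

    def _bin(self, g):
        i = np.searchsorted(self.g_bins, g) - 1
        return int(np.clip(i, 0, len(self.g_bins) - 2))

    def record(self, g, dg):
        """Add one measured (state, change) pair to the table."""
        self.table[self._bin(g)].append(dg)

    def step(self, g):
        """Apply one stochastic update drawn from the device's bin."""
        samples = self.table[self._bin(g)]
        if not samples:  # no data for this state: leave device unchanged
            return g
        return g + self.rng.choice(samples)
```

State-dependent sampling is what lets a single table reproduce the nonlinearity and cycle-to-cycle variability of a measured device population.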
no code implementations • 23 Aug 2021 • Zachary Grey, Susanna Mosleh, Jacob Rezac, Yao Ma, Jason Coder, Andrew Dienstfrey
We perform an exploratory analysis of coexistence behavior by approximating active subspaces to identify low-dimensional structure in the optimization criteria, i.e., a few linear combinations of parameters that simultaneously maximize LAA and Wi-Fi throughputs.
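The standard active-subspace construction eigendecomposes the covariance of sampled gradients, C = E[∇f ∇fᵀ], and keeps the leading eigenvectors as the dominant parameter combinations. A minimal sketch of that computation follows; the ridge-function test objective is an illustrative assumption, not the paper's throughput model.

```python
import numpy as np

def active_subspace(grad_fn, samples, k=1):
    """Estimate a k-dimensional active subspace of f from sampled
    gradients by eigendecomposing C = E[grad f grad f^T].

    Returns eigenvalues (descending) and the k leading eigenvectors.
    """
    G = np.array([grad_fn(x) for x in samples])  # one gradient per row
    C = G.T @ G / len(samples)
    w, V = np.linalg.eigh(C)                     # ascending order
    order = np.argsort(w)[::-1]
    return w[order], V[:, order[:k]]
```

A sharp drop after the first few eigenvalues signals that the objective varies mainly along a few linear combinations of parameters, which is exactly the structure exploited in the coexistence analysis.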