no code implementations • 24 Apr 2024 • Osama Yousuf, Brian Hoskins, Karthick Ramu, Mitchell Fream, William A. Borders, Advait Madhavan, Matthew W. Daniels, Andrew Dienstfrey, Jabez J. McClelland, Martin Lueker-Boden, Gina C. Adam
Results demonstrate that, at the cost of additional devices for redundant layer mapping, layer ensemble averaging can reliably restore the performance of defective memristive networks to the software baseline.
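To make the idea concrete, here is a minimal sketch of layer ensemble averaging, assuming a toy stuck-at-zero defect model: the same software-trained layer is programmed onto several redundant (possibly defective) device arrays and their analog outputs are averaged. The function names and defect statistics are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def program_devices(weights, stuck_fraction=0.1, rng=None):
    """Simulate mapping a weight matrix onto a defective memristive array:
    a random subset of devices is stuck at zero conductance
    (hypothetical defect model, for illustration only)."""
    rng = rng or np.random.default_rng()
    mask = rng.random(weights.shape) >= stuck_fraction  # True = healthy device
    return weights * mask

def ensemble_layer_output(x, weights, n_copies=4, rng=None):
    """Layer ensemble averaging: program the same layer onto n_copies
    redundant arrays and average their outputs, trading device count
    for fault tolerance."""
    rng = rng or np.random.default_rng(0)
    outputs = [x @ program_devices(weights, rng=rng) for _ in range(n_copies)]
    return np.mean(outputs, axis=0)

x = np.random.randn(8, 64)           # a batch of layer inputs
w = np.random.randn(64, 32) * 0.1    # software-trained weights
y = ensemble_layer_output(x, w)      # averaged output over redundant mappings
```

Because the defect patterns of the copies are uncorrelated, averaging suppresses per-array errors, which is why spending extra devices can recover software-level accuracy.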
no code implementations • 24 Apr 2024 • Joseph A. Kilgore, Jeffrey D. Kopsick, Giorgio A. Ascoli, Gina C. Adam
Spiking neural networks, which draw inspiration from the biological constraints of the brain, promise an energy-efficient paradigm for artificial intelligence.
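For background, the canonical building block of a spiking neural network is the leaky integrate-and-fire (LIF) neuron. The sketch below shows its dynamics with illustrative parameter values; it is generic background, not a model taken from the paper.

```python
import numpy as np

def lif_simulate(input_current, v_thresh=1.0, v_reset=0.0, tau=20.0, dt=1.0):
    """Minimal leaky integrate-and-fire neuron: the membrane potential
    leaks toward rest, integrates the input, and emits a spike (then
    resets) when it crosses threshold. Parameters are illustrative."""
    v = v_reset
    spikes = np.zeros_like(input_current)
    for t, i_in in enumerate(input_current):
        v += dt / tau * (-v + i_in)   # leaky integration (Euler step)
        if v >= v_thresh:             # threshold crossing -> spike
            spikes[t] = 1.0
            v = v_reset               # reset after spiking
    return spikes

current = np.full(100, 1.5)           # constant drive for 100 time steps
print(int(lif_simulate(current).sum()), "spikes")
```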
no code implementations • 29 Nov 2022 • Osama Yousuf, Imtiaz Hossen, Matthew W. Daniels, Martin Lueker-Boden, Andrew Dienstfrey, Gina C. Adam
Data-driven approaches such as jump tables are promising techniques for modeling populations of resistive random-access memory (ReRAM) and other emerging memory devices in hardware neural network simulations.
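A minimal sketch of the jump-table idea, assuming a synthetic set of measured (state, next-state) conductance pairs: each conductance bin stores the empirical distribution of post-pulse states, and a programming pulse is simulated by sampling from the bin matching the device's current state. The update statistics and names here are illustrative assumptions.

```python
import numpy as np

def build_jump_table(g_before, g_after, n_bins=32, g_max=1.0):
    """Build a jump table from measured (state, next-state) pairs:
    for each conductance bin, store the empirical distribution of
    post-pulse conductances observed in that bin."""
    bins = np.linspace(0.0, g_max, n_bins + 1)
    idx = np.clip(np.digitize(g_before, bins) - 1, 0, n_bins - 1)
    return bins, [g_after[idx == b] for b in range(n_bins)]

def apply_pulse(g, bins, table, rng):
    """Simulate one programming pulse by sampling the next conductance
    from the jump-table entry for the device's current state."""
    b = np.clip(np.digitize(g, bins) - 1, 0, len(table) - 1)
    samples = table[b]
    return rng.choice(samples) if len(samples) else g  # no data: stay put

rng = np.random.default_rng(0)
g0 = rng.random(10000)                      # synthetic "measured" states
g1 = np.clip(g0 + 0.05 + 0.02 * rng.standard_normal(10000), 0.0, 1.0)
bins, table = build_jump_table(g0, g1)
g = 0.2
for _ in range(5):
    g = apply_pulse(g, bins, table, rng)    # stochastic potentiation
print(f"conductance after 5 pulses: {g:.3f}")
```

Sampling from measured distributions, rather than fitting a closed-form device equation, lets the simulation reproduce device-to-device and cycle-to-cycle variability directly from data.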
no code implementations • 25 Apr 2020 • Siyuan Huang, Brian D. Hoskins, Matthew W. Daniels, Mark D. Stiles, Gina C. Adam
The movement of large quantities of data during the training of a deep neural network presents immense challenges for machine learning workloads.
no code implementations • 5 Mar 2019 • Brian D. Hoskins, Matthew W. Daniels, Siyuan Huang, Advait Madhavan, Gina C. Adam, Nikolai Zhitenev, Jabez J. McClelland, Mark D. Stiles
Neuromorphic networks based on nanodevices, such as metal oxide memristors, phase change memories, and flash memory cells, have generated considerable interest for their increased energy efficiency and density in comparison to graphics processing units (GPUs) and central processing units (CPUs).
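The density and efficiency advantage of such devices rests on performing matrix-vector multiplication directly in the analog domain: row voltages generate per-device currents by Ohm's law, and column wires sum them by Kirchhoff's current law. Below is a minimal sketch under the common differential-pair weight encoding; it is an idealized illustration, not any specific device model from this work.

```python
import numpy as np

def crossbar_mvm(voltages, g_pos, g_neg):
    """Analog matrix-vector multiply in a memristive crossbar:
    input voltages drive the rows, each device contributes I = G * V
    (Ohm's law), and column currents sum by Kirchhoff's current law.
    Signed weights are encoded as a differential pair of conductances."""
    return voltages @ (g_pos - g_neg)        # column currents ~ W^T v

w = np.random.randn(64, 32) * 0.1            # target weights
g_pos = np.clip(w, 0, None)                  # positive device of each pair
g_neg = np.clip(-w, 0, None)                 # negative device of each pair
v = np.random.randn(64) * 0.2                # input voltage vector
print(crossbar_mvm(v, g_pos, g_neg).shape)   # one current per output column
```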