Search Results for author: Brian D. Hoskins

Found 4 papers, 0 papers with code

Experimental demonstration of a robust training method for strongly defective neuromorphic hardware

no code implementations • 11 Dec 2023 • William A. Borders, Advait Madhavan, Matthew W. Daniels, Vasileia Georgiou, Martin Lueker-Boden, Tiffany S. Santos, Patrick M. Braganca, Mark D. Stiles, Jabez J. McClelland, Brian D. Hoskins

Methods such as hardware-aware training, where substrate non-idealities are incorporated during network training, are one way to recover performance at the cost of solution generality.
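The idea behind hardware-aware training can be sketched in a few lines: inject a model of the substrate's non-idealities (here, assumed multiplicative weight noise; the paper's actual defect model may differ) into the forward pass during training, so the optimizer finds weights that tolerate them.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(W, x, noise_std=0.0):
    """Linear layer with simulated device variation in the weights."""
    if noise_std > 0:
        # Multiplicative noise stands in for per-device conductance variation.
        W = W * (1.0 + rng.normal(0.0, noise_std, W.shape))
    return W @ x

# Toy regression target: recover W_true despite the noisy substrate.
X = rng.normal(size=(4, 256))
W_true = rng.normal(size=(2, 4))
Y = W_true @ X

W = np.zeros((2, 4))
lr = 0.01
for _ in range(2000):
    Y_hat = forward(W, X, noise_std=0.1)     # noisy forward, like defective hardware
    grad = (Y_hat - Y) @ X.T / X.shape[1]    # least-squares gradient
    W -= lr * grad
```

Because the multiplicative noise is unbiased, training still converges near `W_true`, but the resulting solution is tuned to one noise model, which is the "cost of solution generality" the abstract mentions.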

Implementation of a Binary Neural Network on a Passive Array of Magnetic Tunnel Junctions

no code implementations • 16 Dec 2021 • Jonathan M. Goodwill, Nitin Prasad, Brian D. Hoskins, Matthew W. Daniels, Advait Madhavan, Lei Wan, Tiffany S. Santos, Michael Tran, Jordan A. Katine, Patrick M. Braganca, Mark D. Stiles, Jabez J. McClelland

The increasing scale of neural networks and their growing application space have produced demand for more energy- and memory-efficient artificial-intelligence-specific hardware.

Memory-efficient training with streaming dimensionality reduction

no code implementations • 25 Apr 2020 • Siyuan Huang, Brian D. Hoskins, Matthew W. Daniels, Mark D. Stiles, Gina C. Adam

The movement of large quantities of data during the training of a Deep Neural Network presents immense challenges for machine learning workloads.

Tasks: BIG-bench Machine Learning • Dimensionality Reduction

Streaming Batch Eigenupdates for Hardware Neuromorphic Networks

no code implementations • 5 Mar 2019 • Brian D. Hoskins, Matthew W. Daniels, Siyuan Huang, Advait Madhavan, Gina C. Adam, Nikolai Zhitenev, Jabez J. McClelland, Mark D. Stiles

Neuromorphic networks based on nanodevices, such as metal oxide memristors, phase change memories, and flash memory cells, have generated considerable interest for their increased energy efficiency and density in comparison to graphics processing units (GPUs) and central processing units (CPUs).
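The streaming eigenupdate idea named in the title can be illustrated with Oja's rule, a classic rank-1 streaming eigenupdate (this is a generic sketch of the technique, not the paper's specific algorithm): each incoming sample nudges an estimate toward the dominant eigenvector of the stream's covariance without ever storing the data.

```python
import numpy as np

rng = np.random.default_rng(1)

def oja_update(v, x, lr):
    """One streaming eigenupdate: move v toward the top eigenvector
    of the covariance of the x stream, then renormalize."""
    y = v @ x
    v = v + lr * y * (x - y * v)   # Hebbian term minus self-normalization
    return v / np.linalg.norm(v)

# Synthetic stream whose covariance is dominated by direction d.
d = np.array([3.0, 4.0]) / 5.0
v = rng.normal(size=2)
v /= np.linalg.norm(v)
for _ in range(5000):
    x = d * rng.normal(scale=2.0) + rng.normal(scale=0.1, size=2)
    v = oja_update(v, x, lr=0.01)
```

After the stream, `v` aligns with `d` up to sign; the same pattern, applied to batches of weight updates, is what lets low-rank structure be extracted on the fly instead of buffering full gradients.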
