no code implementations • 11 Dec 2023 • William A. Borders, Advait Madhavan, Matthew W. Daniels, Vasileia Georgiou, Martin Lueker-Boden, Tiffany S. Santos, Patrick M. Braganca, Mark D. Stiles, Jabez J. McClelland, Brian D. Hoskins
Methods such as hardware-aware training, where substrate non-idealities are incorporated during network training, are one way to recover performance at the cost of solution generality.
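The idea of hardware-aware training can be sketched in a few lines: sample a fresh device non-ideality (here, multiplicative weight noise) on every forward pass so that gradient descent finds weights that remain accurate under that noise. This is an illustrative toy, not the paper's method; the noise model, layer, and hyperparameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_forward(w, x, sigma=0.05):
    """Forward pass with a simulated substrate non-ideality:
    multiplicative noise applied to each weight on every read
    (a stand-in for device programming/read variability)."""
    w_eff = w * (1.0 + sigma * rng.standard_normal(w.shape))
    return w_eff @ x

# Hypothetical toy task: fit a single linear layer to a known target map,
# sampling new weight noise at each step so the solution tolerates it.
w = np.zeros((1, 2))
target_w = np.array([[0.5, -0.3]])
lr = 0.1
for _ in range(500):
    x = rng.standard_normal((2, 1))
    y_true = target_w @ x              # noiseless teacher output
    y = noisy_forward(w, x)            # student sees noisy weights
    grad = (y - y_true) @ x.T          # gradient of squared error w.r.t. w
    w -= lr * grad
```

Because the noise is resampled inside the training loop, the learned `w` is implicitly regularized toward solutions robust to that specific noise statistic, which is also why the result is tied to one substrate model (the generality cost mentioned above).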
no code implementations • 4 Apr 2022 • Axel Hoffmann, Shriram Ramanathan, Julie Grollier, Andrew D. Kent, Marcelo Rozenberg, Ivan K. Schuller, Oleg Shpyrko, Robert Dynes, Yeshaiahu Fainman, Alex Frano, Eric E. Fullerton, Giulia Galli, Vitaliy Lomakin, Shyue Ping Ong, Amanda K. Petford-Long, Jonathan A. Schuller, Mark D. Stiles, Yayoi Takamura, Yimei Zhu
Neuromorphic computing approaches become increasingly important as we address future needs for efficiently processing massive amounts of data.
no code implementations • 16 Dec 2021 • Jonathan M. Goodwill, Nitin Prasad, Brian D. Hoskins, Matthew W. Daniels, Advait Madhavan, Lei Wan, Tiffany S. Santos, Michael Tran, Jordan A. Katine, Patrick M. Braganca, Mark D. Stiles, Jabez J. McClelland
The increasing scale of neural networks and their growing application space have produced demand for more energy- and memory-efficient artificial-intelligence-specific hardware.
no code implementations • 6 Dec 2021 • Nitin Prasad, Prashansa Mukim, Advait Madhavan, Mark D. Stiles
Simulations of complex-valued Hopfield networks based on spin-torque oscillators can recover phase-encoded images.
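A minimal picture of such a network: each oscillator holds a unit-magnitude complex state (a phase), patterns are stored with a complex Hebbian rule, and the update renormalizes each state to the unit circle, mimicking fixed-amplitude oscillator dynamics. The pattern size, corruption level, and update rule below are illustrative assumptions, not the simulated device model from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical phase-encoded pattern: each "pixel" is a unit phasor e^{i*phi}.
n = 16
pattern = np.exp(1j * rng.uniform(0, 2 * np.pi, n))

# Complex Hebbian storage: conjugate-symmetric outer product, zero diagonal.
W = np.outer(pattern, pattern.conj())
np.fill_diagonal(W, 0)

# Start from a phase-corrupted copy and iterate s <- W s / |W s|,
# which keeps every element on the unit circle (fixed amplitude,
# as for a spin-torque oscillator) while aligning the phases.
state = pattern * np.exp(1j * 0.6 * rng.standard_normal(n))
for _ in range(20):
    h = W @ state
    state = h / np.abs(h)

# Recall quality, invariant to a global phase: 1.0 means perfect recovery.
overlap = np.abs(np.vdot(pattern, state)) / n
```

Note that recovery is only defined up to a global phase rotation, so the overlap is measured with a magnitude, mirroring how phase-encoded images are compared.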
no code implementations • 25 Apr 2020 • Siyuan Huang, Brian D. Hoskins, Matthew W. Daniels, Mark D. Stiles, Gina C. Adam
The movement of large quantities of data during the training of a deep neural network presents immense challenges for machine learning workloads.
no code implementations • 5 Mar 2019 • Brian D. Hoskins, Matthew W. Daniels, Siyuan Huang, Advait Madhavan, Gina C. Adam, Nikolai Zhitenev, Jabez J. McClelland, Mark D. Stiles
Neuromorphic networks based on nanodevices, such as metal oxide memristors, phase change memories, and flash memory cells, have generated considerable interest for their increased energy efficiency and density in comparison to graphics processing units (GPUs) and central processing units (CPUs).