no code implementations • 17 Jan 2025 • Bradley H. Theilman, James B. Aimone
We demonstrate that scalable neuromorphic hardware can implement the finite element method, which is a critical numerical method for engineering and scientific discovery.
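To make the numerical setting concrete, here is a minimal sketch of the classical finite element method for a 1D Poisson problem with linear elements. It is plain NumPy and purely illustrative; the paper's contribution is mapping such computations onto spiking neuromorphic hardware, which is not shown here, and the problem size is an arbitrary assumption.

```python
# Minimal 1D finite element sketch: solve -u'' = f on [0, 1], u(0) = u(1) = 0,
# with piecewise-linear elements on a uniform mesh. Illustrative only.
import numpy as np

def fem_1d_poisson(f, n_elements=32):
    """Assemble and solve the linear-element system for -u'' = f."""
    h = 1.0 / n_elements
    n_interior = n_elements - 1
    x = np.linspace(h, 1.0 - h, n_interior)          # interior nodes

    # Tridiagonal stiffness matrix for hat basis functions.
    K = (np.diag(2.0 * np.ones(n_interior))
         - np.diag(np.ones(n_interior - 1), k=1)
         - np.diag(np.ones(n_interior - 1), k=-1)) / h

    # Midpoint-rule load vector.
    b = h * f(x)

    return x, np.linalg.solve(K, b)

# Example: f(x) = 1 has exact solution u(x) = x(1 - x)/2, and linear elements
# are nodally exact for constant f, so the error is at machine precision.
x, u = fem_1d_poisson(lambda x: np.ones_like(x))
print(np.max(np.abs(u - x * (1 - x) / 2)))
```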
no code implementations • 1 Nov 2024 • Karan P. Patel, Andrew Maicke, Jared Arzate, Jaesuk Kwon, J. Darby Smith, James B. Aimone, Jean Anne C. Incorvia, Suma G. Cardwell, Catherine D. Schuman
Novel devices and novel computing paradigms are key to energy-efficient, performant future computing systems.
no code implementations • 11 Dec 2023 • Craig M. Vineyard, William M. Severa, James B. Aimone
In particular, we consider the interplay between algorithm and architecture advances in the field of neuromorphic computing.
no code implementations • 21 Nov 2023 • James B. Aimone, William Severa, J. Darby Smith
Probabilistic artificial neural networks offer intriguing prospects for making the uncertainty of artificial intelligence methods an explicit part of their function; however, the development of techniques that quantify uncertainty with well-understood methods such as Monte Carlo sampling has been limited by the high cost of stochastic sampling on deterministic computing hardware.
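As a rough illustration of the Monte Carlo approach referenced here, the sketch below estimates predictive uncertainty by repeatedly sampling the weights of a tiny one-layer model. The model, noise level, and sizes are assumptions for illustration, not the networks studied in the paper.

```python
# Monte Carlo uncertainty estimation for a tiny stochastic model:
# sample the weights many times and summarize the spread of the outputs.
import numpy as np

rng = np.random.default_rng(0)

W_mean = rng.normal(size=(4, 1))      # nominal weights of a one-layer model
W_std = 0.1                           # assumed weight uncertainty

def stochastic_forward(x):
    """One forward pass with weights sampled around their nominal values."""
    W = W_mean + W_std * rng.normal(size=W_mean.shape)
    return np.tanh(x @ W)

x = rng.normal(size=(1, 4))           # a single input
samples = np.array([stochastic_forward(x) for _ in range(1000)])

print("predictive mean:", samples.mean())
print("predictive std :", samples.std())   # Monte Carlo uncertainty estimate
```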
no code implementations • 5 Oct 2023 • Dhireesha Kudithipudi, Anurag Daram, Abdullah M. Zyarah, Fatima Tuz Zohora, James B. Aimone, Angel Yanguas-Gil, Nicholas Soures, Emre Neftci, Matthew Mattina, Vincenzo Lomonaco, Clare D. Thiem, Benjamin Epstein
Lifelong learning - an agent's ability to learn throughout its lifetime - is a hallmark of biological learning systems and a central challenge for artificial intelligence (AI).
no code implementations • 29 Jun 2023 • Bradley H. Theilman, Felix Wang, Fred Rothganger, James B. Aimone
A satisfactory understanding of information processing in spiking neural networks requires appropriate computational abstractions of neural activity.
no code implementations • 5 Oct 2022 • Bradley H. Theilman, Yipu Wang, Ojas D. Parekh, William Severa, J. Darby Smith, James B. Aimone
By designing circuits and algorithms that use randomness as natural brains do, we hypothesize that the intrinsic randomness of microelectronic devices could be turned into a valuable component of a neuromorphic architecture, enabling more efficient computation.
no code implementations • 23 Mar 2022 • James B. Aimone, Aaron J. Hill, William M. Severa, Craig M. Vineyard
Boolean functions and binary arithmetic operations are central to standard computing paradigms.
no code implementations • 27 Jul 2021 • J. Darby Smith, Aaron J. Hill, Leah E. Reeder, Brian C. Franke, Richard B. Lehoucq, Ojas Parekh, William Severa, James B. Aimone
Computing stands to be radically improved by neuromorphic computing (NMC) approaches inspired by the brain's incredible efficiency and capabilities.
no code implementations • 25 Jun 2020 • Ojas Parekh, Cynthia A. Phillips, Conrad D. James, James B. Aimone
Boolean circuits of McCulloch-Pitts threshold gates are a classic model of neural computation studied heavily in the late 20th century as a model of general computation.
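For readers unfamiliar with the model, a McCulloch-Pitts threshold gate fires when a weighted sum of binary inputs meets a threshold. The sketch below uses standard textbook weights and thresholds (not values from the paper) to build AND, OR, and a two-layer XOR.

```python
# McCulloch-Pitts threshold gates computing Boolean functions.
import numpy as np

def threshold_gate(inputs, weights, threshold):
    """Fire (output 1) iff the weighted sum of binary inputs meets the threshold."""
    return int(np.dot(inputs, weights) >= threshold)

def AND(x, y):
    return threshold_gate([x, y], [1, 1], 2)

def OR(x, y):
    return threshold_gate([x, y], [1, 1], 1)

def XOR(x, y):
    # XOR is not linearly separable, so it needs a two-layer threshold circuit.
    return threshold_gate([OR(x, y), AND(x, y)], [1, -1], 1)

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "->", AND(x, y), OR(x, y), XOR(x, y))
```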
no code implementations • 21 May 2020 • J. Darby Smith, William Severa, Aaron J. Hill, Leah Reeder, Brian Franke, Richard B. Lehoucq, Ojas D. Parekh, James B. Aimone
The massively parallel, spiking neural networks of neuromorphic processors can enable computationally powerful formulations.
no code implementations • 28 May 2019 • James B. Aimone, William Severa, Craig M. Vineyard
Rather than requiring a developer to attain intricate knowledge of how to program and exploit spiking neural dynamics in order to realize the potential benefits of neuromorphic computing, Fugu is designed to provide a higher-level abstraction: a hardware-independent mechanism for linking scalable spiking neural algorithms from a variety of sources.
no code implementations • 26 Oct 2018 • William Severa, Craig M. Vineyard, Ryan Dellana, Stephen J. Verzi, James B. Aimone
We present a method for training deep spiking neural networks using an iterative modification of the backpropagation optimization algorithm.
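One common ingredient in approaches of this kind is an activation function that is gradually sharpened toward a binary threshold during training, so that standard backpropagation still applies while the network converges to spike-compatible activations. The sketch below shows only that sharpening idea, with illustrative parameters; it is not a reproduction of the paper's training procedure.

```python
# A smooth activation gradually sharpened toward a binary (spiking) threshold.
import numpy as np

def sharpened_activation(x, sharpness):
    """Sigmoid that approaches a 0/1 step function as sharpness grows."""
    return 1.0 / (1.0 + np.exp(-sharpness * x))

x = np.linspace(-1, 1, 5)
for sharpness in (1, 10, 100):        # increased gradually during training
    print(sharpness, np.round(sharpened_activation(x, sharpness), 3))

# At high sharpness the activation is effectively binary, so the trained
# weights can be deployed with threshold (spiking) neurons at inference time.
```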
no code implementations • 25 Sep 2018 • Aleksandra Faust, James B. Aimone, Conrad D. James, Lydia Tapia
Robots and autonomous agents often complete goal-based tasks with limited resources, relying on imperfect models and sensor measurements.
no code implementations • 1 May 2018 • William Severa, Rich Lehoucq, Ojas Parekh, James B. Aimone
The random walk is a fundamental stochastic process that underlies many numerical tasks in scientific computing applications.
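As a concrete example of a random walk underlying a numerical task, the sketch below solves the discrete Laplace problem u'' = 0 on a 1D lattice with boundary values u(0) = 0 and u(N) = 1 (whose solution is u(k) = k/N) by averaging where unbiased walks first exit the domain. The lattice size and walk counts are illustrative assumptions, and this is plain Python rather than the neuromorphic implementation discussed in the paper.

```python
# Estimate a Laplace boundary-value problem by simulating unbiased random walks.
import numpy as np

rng = np.random.default_rng(1)

def hit_right_probability(k, n_sites=20, n_walks=10000):
    """Fraction of walks from site k that reach site n_sites before site 0.

    This fraction converges to k / n_sites, the discrete solution of u'' = 0
    with u(0) = 0 and u(n_sites) = 1.
    """
    hits = 0
    for _ in range(n_walks):
        pos = k
        while 0 < pos < n_sites:
            pos += 1 if rng.random() < 0.5 else -1    # unbiased step
        hits += (pos == n_sites)
    return hits / n_walks

print(hit_right_probability(6))    # ~0.30 (exact: 6/20)
print(hit_right_probability(14))   # ~0.70 (exact: 14/20)
```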
no code implementations • 27 Nov 2017 • James B. Aimone, William M. Severa
Complex architectures of biological neural circuits, such as parallel processing pathways, have been behaviorally implicated in many cognitive studies.
no code implementations • 10 Nov 2017 • Michael R. Smith, Joe B. Ingram, Christopher C. Lamb, Timothy J. Draelos, Justin E. Doak, James B. Aimone, Conrad D. James
It is needed to ensure the integrity of systems that process sensitive information and control many aspects of everyday life.
no code implementations • ICLR 2018 • William M. Severa, Jerilyn A. Timlin, Suraj Kholwadwala, Conrad D. James, James B. Aimone
The high dimensionality of hyperspectral imaging poses unique challenges in scope, size, and processing requirements.
no code implementations • 4 May 2017 • James B. Aimone
Although the brain has long been considered a potential inspiration for future computing, Moore's Law - the scaling property that has seen revolutions in technologies ranging from supercomputers to smartphones - has largely been driven by advances in materials science.
no code implementations • 21 Mar 2017 • Michael R. Smith, Aaron J. Hill, Kristofor D. Carlson, Craig M. Vineyard, Jonathon Donaldson, David R. Follett, Pamela L. Follett, John H. Naegle, Conrad D. James, James B. Aimone
Information in neural networks is represented as weighted connections, or synapses, between neurons.
no code implementations • 12 Dec 2016 • Timothy J. Draelos, Nadine E. Miner, Christopher C. Lamb, Jonathan A. Cox, Craig M. Vineyard, Kristofor D. Carlson, William M. Severa, Conrad D. James, James B. Aimone
Neural machine learning methods, such as deep neural networks (DNN), have achieved remarkable success in a number of complex data processing tasks.