no code implementations • 18 Jun 2024 • Zhaoxian Wu, Tayfun Gokmen, Malte J. Rasch, Tianyi Chen
Given the high economic and environmental costs of using large vision or language models, analog in-memory accelerators present a promising solution for energy-efficient AI.
1 code implementation • 18 Jul 2023 • Manuel Le Gallo, Corey Lammie, Julian Buechel, Fabio Carta, Omobayode Fagbohungbe, Charles Mackin, Hsinyu Tsai, Vijay Narayanan, Abu Sebastian, Kaoutar El Maghraoui, Malte J. Rasch
In this tutorial, we provide a deep dive into how such adaptations can be achieved and evaluated using the recently released IBM Analog Hardware Acceleration Kit (AIHWKit), freely available at https://github.com/IBM/aihwkit.
no code implementations • 8 Mar 2023 • Malte J. Rasch, Fabio Carta, Omobayode Fagbohungbe, Tayfun Gokmen
In-memory computing with resistive crossbar arrays has been suggested as a way to accelerate deep-learning workloads in a highly efficient manner.
no code implementations • 16 Feb 2023 • Malte J. Rasch, Charles Mackin, Manuel Le Gallo, An Chen, Andrea Fasoli, Frederic Odermatt, Ning Li, S. R. Nandakumar, Pritish Narayanan, Hsinyu Tsai, Geoffrey W. Burr, Abu Sebastian, Vijay Narayanan
Analog in-memory computing (AIMC) -- a promising approach for energy-efficient acceleration of deep learning workloads -- computes matrix-vector multiplications (MVMs) but only approximately, due to nonidealities that often are non-deterministic or nonlinear.
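The approximate nature of AIMC matrix-vector multiplications can be illustrated with a minimal NumPy sketch. The specific noise model below (additive Gaussian read noise plus a tanh output saturation) is a hypothetical stand-in for the non-deterministic and nonlinear nonidealities the abstract mentions, not the model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ideal weights and input vector for a matrix-vector multiplication (MVM).
W = rng.standard_normal((4, 8))
x = rng.standard_normal(8)

ideal = W @ x

# Hypothetical nonidealities: additive read noise (non-deterministic)
# and a tanh saturation of the output (nonlinear).
read_noise = 0.05 * rng.standard_normal(4)
analog = np.tanh(W @ x + read_noise)

# The analog result only approximates the ideal MVM.
error = np.linalg.norm(ideal - analog)
print(error)
```

Because the noise term changes on every read, two evaluations of the same MVM on such hardware generally return different results, which is why training and inference methods must be made robust to these deviations.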
1 code implementation • 5 Apr 2021 • Malte J. Rasch, Diego Moreda, Tayfun Gokmen, Manuel Le Gallo, Fabio Carta, Cindy Goldberg, Kaoutar El Maghraoui, Abu Sebastian, Vijay Narayanan
We introduce the IBM Analog Hardware Acceleration Kit, a new, first-of-a-kind open-source toolkit for conveniently simulating analog crossbar arrays from within PyTorch (freely available at https://github.com/IBM/aihwkit).
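The core idea of simulating an analog crossbar array inside PyTorch can be sketched with a toy layer that injects weight read noise on every forward pass. This is a simplified illustration of the concept only; the class name and Gaussian noise model below are hypothetical and do not reproduce AIHWKit's actual device configurations.

```python
import torch


class NoisyAnalogLinear(torch.nn.Linear):
    """Toy stand-in for an analog crossbar layer: perturbs the weights
    with Gaussian read noise on every forward pass (hypothetical noise
    model, not AIHWKit's simulator)."""

    def __init__(self, in_features, out_features, noise_std=0.02):
        super().__init__(in_features, out_features)
        self.noise_std = noise_std

    def forward(self, x):
        # Each read of the crossbar sees a slightly different weight matrix.
        noisy_w = self.weight + self.noise_std * torch.randn_like(self.weight)
        return torch.nn.functional.linear(x, noisy_w, self.bias)


layer = NoisyAnalogLinear(8, 4)
out = layer(torch.randn(2, 8))
print(out.shape)  # torch.Size([2, 4])
```

In the actual toolkit, such analog layers are drop-in replacements for their digital counterparts, so existing PyTorch models can be evaluated under configurable hardware nonidealities.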
no code implementations • 6 Jun 2019 • Malte J. Rasch, Tayfun Gokmen, Wilfried Haensch
Accelerating the training of artificial neural networks (ANNs) with analog resistive crossbar arrays is a promising idea.
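The appeal of crossbar-based training comes from the array performing a rank-one (outer-product) weight update in parallel. The toy regression below sketches that update rule in NumPy; the task, learning rate, and convergence check are illustrative assumptions, not the experimental setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy task: learn y = W_true @ x using only rank-one outer-product
# updates, the operation a resistive crossbar applies in parallel.
W_true = rng.standard_normal((3, 5))
W = np.zeros((3, 5))
lr = 0.1

initial_error = np.abs(W - W_true).max()

for _ in range(500):
    x = rng.standard_normal(5)
    err = W @ x - W_true @ x        # output error for this sample
    W -= lr * np.outer(err, x)      # in-array outer-product update

final_error = np.abs(W - W_true).max()
print(initial_error, final_error)
```

On ideal hardware this update is exact stochastic gradient descent; the challenge addressed in this line of work is that real devices apply it asymmetrically and nonlinearly.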
no code implementations • 3 Jul 2018 • Malte J. Rasch, Tayfun Gokmen, Mattia Rigotti, Wilfried Haensch
Analog arrays are a promising upcoming hardware technology with the potential to drastically speed up deep learning.