Search Results for author: Darius Bunandar

Found 9 papers, 1 paper with code

Photonics for Sustainable Computing

no code implementations • 10 Jan 2024 • Farbin Fayza, Satyavolu Papa Rao, Darius Bunandar, Udit Gupta, Ajay Joshi

Our analysis shows that photonics can reduce both operational and embodied carbon footprints with its high energy efficiency and at least 4$\times$ less fabrication carbon cost per unit area than 28 nm CMOS.

Towards Efficient Hyperdimensional Computing Using Photonics

no code implementations • 29 Nov 2023 • Farbin Fayza, Cansu Demirkiran, Hanning Chen, Che-Kai Liu, Avi Mohan, Hamza Errahmouni, Sanggeon Yun, Mohsen Imani, David Zhang, Darius Bunandar, Ajay Joshi

Over the past few years, silicon photonics-based computing has emerged as a promising alternative to CMOS-based computing for Deep Neural Networks (DNNs).

Accelerating DNN Training With Photonics: A Residue Number System-Based Design

no code implementations • 29 Nov 2023 • Cansu Demirkiran, Guowei Yang, Darius Bunandar, Ajay Joshi

Photonic computing is a compelling avenue for performing highly efficient matrix multiplication, a crucial operation in Deep Neural Networks (DNNs).

A Blueprint for Precise and Fault-Tolerant Analog Neural Networks

no code implementations • 19 Sep 2023 • Cansu Demirkiran, Lakshmi Nair, Darius Bunandar, Ajay Joshi

Our study demonstrates that analog accelerators utilizing the RNS-based approach can achieve ${\geq}99\%$ of FP32 accuracy for state-of-the-art DNN inference using data converters with only $6$-bit precision, whereas a conventional analog core requires more than $8$-bit precision to achieve the same accuracy in the same DNNs.
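The core idea behind the RNS-based approach above is that a high-precision integer can be represented by several low-precision residues, so each analog operation only needs to cover a small dynamic range. A minimal sketch, with illustrative moduli (the paper's actual moduli and bit widths may differ):

```python
# Residue number system (RNS) sketch: represent a wide integer with
# several narrow residues, compute on each residue independently, then
# recover the result with the Chinese Remainder Theorem (CRT).
from math import prod

MODULI = (63, 62, 61)  # pairwise coprime; each residue fits in 6 bits
M = prod(MODULI)       # total dynamic range covered by the residues

def to_rns(x):
    """Encode integer x as a tuple of low-precision residues."""
    return tuple(x % m for m in MODULI)

def from_rns(residues):
    """Decode residues back to an integer via the CRT."""
    x = 0
    for r, m in zip(residues, MODULI):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)  # modular inverse (Python 3.8+)
    return x % M

a, b = 1234, 5678
# A multiply is carried out independently on each small residue:
prod_rns = tuple((ra * rb) % m
                 for ra, rb, m in zip(to_rns(a), to_rns(b), MODULI))
assert from_rns(prod_rns) == (a * b) % M
```

Because each residue channel only ever holds 6-bit values, the data converters feeding an analog core never need more than 6 bits, which is the mechanism behind the accuracy claim above.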

INT-FP-QSim: Mixed Precision and Formats For Large Language Models and Vision Transformers

1 code implementation • 7 Jul 2023 • Lakshmi Nair, Mikhail Bernadskiy, Arulselvan Madhavan, Craig Chan, Ayon Basumallik, Darius Bunandar

To supplement this ongoing effort, we propose INT-FP-QSim: an open-source simulator that enables flexible evaluation of LLMs and vision transformers at various numerical precisions and formats.

Quantization

Leveraging Residue Number System for Designing High-Precision Analog Deep Neural Network Accelerators

no code implementations • 15 Jun 2023 • Cansu Demirkiran, Rashmi Agrawal, Vijay Janapa Reddi, Darius Bunandar, Ajay Joshi

In addition, we show that RNS can reduce the energy consumption of the data converters within an analog accelerator by several orders of magnitude compared to a regular fixed-point approach.

Sensitivity-Aware Finetuning for Accuracy Recovery on Deep Learning Hardware

no code implementations • 5 Jun 2023 • Lakshmi Nair, Darius Bunandar

Existing methods to recover model accuracy on analog-digital hardware in the presence of quantization and analog noise include noise-injection training.
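Noise-injection training, the baseline mentioned above, perturbs weights during the forward pass so the model learns to tolerate analog noise. A minimal sketch on a least-squares toy problem; the noise model and hyperparameters here are illustrative assumptions, not the paper's setup:

```python
# Noise-injection training sketch: multiplicative Gaussian weight noise
# mimics analog device variation; gradients update the clean weights.
import numpy as np

rng = np.random.default_rng(0)

def noisy_forward(w, x, noise_std=0.05):
    """Forward pass with noise injected into the weights."""
    w_noisy = w * (1.0 + rng.normal(0.0, noise_std, size=w.shape))
    return x @ w_noisy

# Toy regression: recover w = all-ones despite the injected noise.
w = rng.normal(size=(4, 1))
x = rng.normal(size=(8, 4))
y = x @ np.ones((4, 1))

lr = 0.1
for _ in range(200):
    pred = noisy_forward(w, x)
    grad = x.T @ (pred - y) / len(x)  # gradient w.r.t. the clean weights
    w -= lr * grad
```

The sensitivity-aware finetuning in this paper targets the same accuracy-recovery goal, reportedly more efficiently than retraining every layer under noise.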

Quantization

Adaptive Block Floating-Point for Analog Deep Learning Hardware

no code implementations • 12 May 2022 • Ayon Basumallik, Darius Bunandar, Nicholas Dronen, Nicholas Harris, Ludmila Levkova, Calvin Mccarter, Lakshmi Nair, David Walter, David Widemann

Analog mixed-signal (AMS) devices promise faster, more energy-efficient deep neural network (DNN) inference than their digital counterparts.
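Block floating-point, the numerical format in this paper's title, shares one exponent (here, a per-block scale) across a block of values so that only integer mantissas need to pass through the analog datapath. A minimal sketch with illustrative block size and mantissa width; the paper's adaptive scheme is more involved:

```python
# Block floating-point (BFP) quantization sketch: each block of values
# shares one scale, and individual values keep only an integer mantissa.
import numpy as np

def bfp_quantize(x, block_size=16, mantissa_bits=8):
    """Quantize a 1-D array block-by-block with a shared per-block scale."""
    out = np.empty_like(x, dtype=np.float64)
    qmax = 2 ** (mantissa_bits - 1) - 1  # e.g. 127 for 8-bit mantissas
    for i in range(0, len(x), block_size):
        block = x[i:i + block_size]
        scale = np.max(np.abs(block)) / qmax  # shared "exponent" for block
        if scale == 0.0:
            scale = 1.0                       # all-zero block: any scale works
        mant = np.round(block / scale)        # integer mantissas in [-qmax, qmax]
        out[i:i + block_size] = mant * scale  # dequantized representation
    return out

x = np.linspace(-1.0, 1.0, 64)
xq = bfp_quantize(x)
```

Sharing the scale across a block keeps per-value storage and converter precision low while tracking the block's dynamic range, which is what makes the format attractive for AMS inference.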

Quantization
