no code implementations • 29 Sep 2021 • Sarada Krithivasan, Swagath Venkataramani, Sanchari Sen, Anand Raghunathan
This is because the efficacy of learning on interpolated inputs is reduced by the interference between the forward/backward propagation of their constituent inputs.
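For context, the sketch below shows what learning on interpolated inputs typically looks like: two examples and their labels are blended so that a single forward/backward pass carries gradient signal for both constituents, which is where both the savings and the interference mentioned above come from. The function name and blend factor are illustrative, not the paper's exact formulation.

    import numpy as np

    def interpolate_batch(x1, y1, x2, y2, lam=0.5):
        # Blend two training examples into a single input (mixup-style).
        # One forward/backward pass on x_mix then serves both constituents.
        x_mix = lam * x1 + (1.0 - lam) * x2
        y_mix = lam * y1 + (1.0 - lam) * y2   # soft label for the blended input
        return x_mix, y_mix

    # Toy usage with one-hot labels for a 3-class problem.
    x_a, x_b = np.random.rand(32, 32), np.random.rand(32, 32)
    y_a, y_b = np.eye(3)[0], np.eye(3)[2]
    x_mix, y_mix = interpolate_batch(x_a, y_a, x_b, y_b, lam=0.7)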
no code implementations • 29 Sep 2021 • Amrit Nagarajan, Sanchari Sen, Jacob R. Stevens, Anand Raghunathan
We propose a Specialization framework to create optimized transformer models for a given downstream task.
no code implementations • 1 Jan 2021 • Sarada Krithivasan, Sanchari Sen, Swagath Venkataramani, Anand Raghunathan
The trend in the weight updates made to the transition layer across epochs is used to determine how the boundary between SGD and localized updates is shifted in future epochs.
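A minimal sketch of how such a trend-driven boundary shift could be implemented, assuming the boundary moves one layer deeper once the transition layer's weight updates plateau; the plateau test, patience, and shift direction below are assumptions, not the paper's exact rule.

    def update_boundary(update_norms, boundary, patience=3, tol=1e-3):
        # update_norms: per-epoch L2 norms of the transition layer's weight
        #               updates (most recent last).
        # boundary:     index of the current transition layer.
        # Assumed heuristic: if the update magnitudes have changed by less than
        # `tol` for `patience` consecutive epochs, move the boundary one layer
        # deeper, enlarging the region trained with localized updates.
        if len(update_norms) < patience + 1:
            return boundary
        recent = update_norms[-(patience + 1):]
        deltas = [abs(recent[i + 1] - recent[i]) for i in range(patience)]
        if max(deltas) < tol:
            return boundary + 1
        return boundary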
1 code implementation • 7 Oct 2020 • Amrit Nagarajan, Sanchari Sen, Jacob R. Stevens, Anand Raghunathan
We propose AxFormer, a systematic framework that applies accuracy-driven approximations to create optimized transformer models for a given downstream task.
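An accuracy-driven approximation framework of this kind can be pictured as a greedy loop that accepts only those approximations whose accuracy cost stays within a tolerance. The sketch below is generic: the candidate set, acceptance threshold, and helper names are assumptions, not AxFormer's actual procedure.

    def accuracy_driven_prune(model, candidates, evaluate, apply_fn, tol=0.005):
        # candidates: candidate approximations (e.g., heads or blocks to drop);
        #             how AxFormer enumerates these is not shown here.
        # evaluate:   callable returning downstream-task accuracy of a model.
        # apply_fn:   callable(model, candidate) -> approximated model.
        # tol:        maximum accuracy drop allowed per accepted approximation.
        baseline = evaluate(model)
        for cand in candidates:
            trial = apply_fn(model, cand)
            acc = evaluate(trial)
            if baseline - acc <= tol:
                model, baseline = trial, acc   # keep the approximation
        return model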
no code implementations • 14 Jun 2020 • Sarada Krithivasan, Sanchari Sen, Anand Raghunathan
We also evaluate the impact of the attack on a sparsity-optimized DNN accelerator, demonstrating latency degradations of up to 1.59x, and study the performance of the attack on a sparsity-optimized general-purpose processor.
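To see why lower activation sparsity translates into higher latency on a zero-skipping accelerator, consider the first-order toy model below; the cycle model and the densifying perturbation are purely illustrative and are not the paper's attack or accelerator model.

    import numpy as np

    def zero_skip_cycles(activations, cycles_per_nonzero=1):
        # First-order cycle count for hardware that skips zero activations:
        # work scales with the number of nonzero activations.
        return int(np.count_nonzero(activations)) * cycles_per_nonzero

    # Toy illustration: a perturbation that densifies activations inflates latency.
    clean = np.random.rand(1000) * (np.random.rand(1000) > 0.7)   # ~70% zeros
    attacked = clean + 0.01 * (clean == 0)                        # zeros nudged to small nonzeros
    slowdown = zero_skip_cycles(attacked) / zero_skip_cycles(clean)
    print(f"latency degradation ~{slowdown:.2f}x")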
1 code implementation • ICLR 2020 • Sanchari Sen, Balaraman Ravindran, Anand Raghunathan
Our results indicate that EMPIR boosts the average adversarial accuracies by 42.6%, 15.2% and 10.5% for the DNN models trained on the MNIST, CIFAR-10 and ImageNet datasets respectively, when compared to single full-precision models, without sacrificing accuracy on the unperturbed inputs.
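EMPIR combines the predictions of full-precision and low-precision models; a simple way to picture such an ensemble is a majority vote over member predictions, as sketched below. The vote is only one possible combination rule and may differ from EMPIR's actual scheme.

    import numpy as np

    def majority_vote(logits_list):
        # logits_list: per-model logits, each of shape (batch, num_classes).
        votes = np.stack([np.argmax(l, axis=-1) for l in logits_list])   # (models, batch)
        # Majority class per example across ensemble members.
        return np.array([np.bincount(votes[:, i]).argmax() for i in range(votes.shape[1])])

    # Toy usage: one full-precision and two low-precision members on a batch of 4.
    full_prec = np.random.rand(4, 10)
    low_prec_a = np.random.rand(4, 10)
    low_prec_b = np.random.rand(4, 10)
    preds = majority_vote([full_prec, low_prec_a, low_prec_b])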
no code implementations • 7 Nov 2017 • Sanchari Sen, Shubham Jain, Swagath Venkataramani, Anand Raghunathan
SparCE consists of two key micro-architectural enhancements: a Sparsity Register File (SpRF) that tracks zero registers, and a Sparsity-aware Skip Address (SASA) table that indicates instructions to be skipped.
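The sketch below mimics this behavior in software: when a source register holds zero, a table entry (standing in for the SASA table) tells the "core" how many instructions to skip, while the zero test on the register file stands in for the SpRF. Register names, the table encoding, and the skip granularity are illustrative, not the hardware design.

    def run_with_skips(program, regs, sasa):
        # program: list of (dst, src1, src2) multiply-accumulate triples.
        # regs:    dict register -> value; regs[r] == 0 plays the role of the SpRF bit.
        # sasa:    dict mapping a trigger register to the number of instructions
        #          (including the current one) to skip when that register is zero.
        pc, executed = 0, 0
        while pc < len(program):
            dst, src1, src2 = program[pc]
            if regs.get(src1, 0) == 0 and src1 in sasa:
                pc += sasa[src1]          # skip the dependent instruction block
                continue
            regs[dst] = regs.get(dst, 0) + regs[src1] * regs[src2]
            executed += 1
            pc += 1
        return executed

    # Toy usage: the first MAC is skipped because r1 is zero.
    regs = {"r1": 0.0, "r2": 3.0, "r3": 2.0, "acc": 0.0}
    program = [("acc", "r1", "r2"),
               ("acc", "r3", "r2")]
    sasa = {"r1": 1}
    print(run_with_skips(program, regs, sasa), regs["acc"])   # 1 instruction executed, acc == 6.0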