Search Results for author: Venkata Prabhakara Sarath Nookala

Found 2 papers, 1 paper with code

DynaQuant: Compressing Deep Learning Training Checkpoints via Dynamic Quantization

no code implementations • 20 Jun 2023 • Amey Agrawal, Sameer Reddy, Satwik Bhattamishra, Venkata Prabhakara Sarath Nookala, Vidushi Vashishth, Kexin Rong, Alexey Tumanov

With the increase in the scale of Deep Learning (DL) training workloads in terms of compute resources and time consumption, the likelihood of encountering in-training failures rises substantially, leading to lost work and resource wastage.

Model Compression • Quantization • +1
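
The snippet excerpted above gives the motivation (checkpoint size vs. in-training failures); the sketch below only illustrates the general idea of quantizing checkpoint tensors to lower precision before saving. It is not DynaQuant's actual dynamic scheme, and the function names here are hypothetical.

```python
# Minimal sketch (assumption: simple uniform per-tensor int8 quantization,
# NOT DynaQuant's adaptive scheme) of shrinking a training checkpoint.
import torch

def quantize_state_dict(state_dict, num_bits=8):
    """Store each float tensor as int8 values plus a per-tensor scale."""
    qmax = 2 ** (num_bits - 1) - 1
    compressed = {}
    for name, tensor in state_dict.items():
        t = tensor.float()
        scale = t.abs().max().clamp(min=1e-8) / qmax  # per-tensor scale factor
        q = torch.clamp((t / scale).round(), -qmax, qmax).to(torch.int8)
        compressed[name] = (q, scale)
    return compressed

def dequantize_state_dict(compressed):
    """Recover approximate float32 tensors when restoring from the checkpoint."""
    return {name: q.float() * scale for name, (q, scale) in compressed.items()}
```

Storing int8 values plus one scale per tensor cuts checkpoint size roughly 4x relative to float32, at the cost of quantization error in the restored weights.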

Adversarial Robustness of Prompt-based Few-Shot Learning for Natural Language Understanding

1 code implementation • 19 Jun 2023 • Venkata Prabhakara Sarath Nookala, Gaurav Verma, Subhabrata Mukherjee, Srijan Kumar

Our results on six GLUE tasks indicate that compared to fully fine-tuned models, vanilla FSL methods lead to a notable relative drop in task performance (i.e., are less robust) in the face of adversarial perturbations.

Adversarial Robustness • Few-Shot Learning • +1
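
To make the "relative drop under adversarial perturbations" measurement concrete, here is a minimal sketch of the evaluation idea: score a classifier on clean and perturbed inputs and report the relative accuracy drop. The character-swap perturbation and the `classify` callable are stand-ins for illustration only; the paper evaluates prompt-based few-shot models with standard NLP adversarial attacks.

```python
# Minimal sketch (assumptions: `classify` is any text classifier, e.g. a
# prompt-based few-shot model; the perturbation is a toy character swap,
# not the attacks used in the paper).
import random

def perturb(text, swaps=2):
    """Apply a few adjacent character swaps as a stand-in perturbation."""
    chars = list(text)
    if len(chars) < 2:
        return text
    for _ in range(swaps):
        i = random.randrange(len(chars) - 1)
        chars[i], chars[i + 1] = chars[i + 1], chars[i]
    return "".join(chars)

def relative_drop(classify, dataset):
    """Relative accuracy drop between clean and perturbed inputs."""
    clean = sum(classify(x) == y for x, y in dataset) / len(dataset)
    adv = sum(classify(perturb(x)) == y for x, y in dataset) / len(dataset)
    return (clean - adv) / max(clean, 1e-8)  # larger drop => less robust
```

A larger relative drop indicates a less robust model, which is the comparison the abstract draws between vanilla few-shot learning methods and fully fine-tuned models.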
