1 code implementation • ICLR 2022 • Divyam Madaan, Jaehong Yoon, Yuanchun Li, Yunxin Liu, Sung Ju Hwang
Continual learning (CL) aims to learn a sequence of tasks without forgetting the previously acquired knowledge.
no code implementations • ICLR 2022 • Jaehong Yoon, Divyam Madaan, Eunho Yang, Sung Ju Hwang
We validate the effectiveness of our coreset selection mechanism over various standard, imbalanced, and noisy datasets against strong continual learning baselines, demonstrating that it improves task adaptation and prevents catastrophic forgetting in a sample-efficient manner.
1 code implementation • 22 Jun 2020 • Divyam Madaan, Jinwoo Shin, Sung Ju Hwang
Adversarial learning has emerged as one of the successful techniques for circumventing the susceptibility of existing methods to adversarial perturbations.
1 code implementation • ICML 2020 • Divyam Madaan, Jinwoo Shin, Sung Ju Hwang
Despite the remarkable performance of deep neural networks on various computer vision tasks, they are known to be susceptible to adversarial perturbations, which makes it challenging to deploy them in real-world safety-critical applications.
2 code implementations • 31 May 2019 • Aidan N. Gomez, Ivan Zhang, Siddhartha Rao Kamalakara, Divyam Madaan, Kevin Swersky, Yarin Gal, Geoffrey E. Hinton
Before computing the gradients for each weight update, targeted dropout stochastically selects a set of units or weights to be dropped using a simple self-reinforcing sparsity criterion and then computes the gradients for the remaining weights.
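The sentence above summarizes the targeted dropout procedure. The snippet below is a minimal PyTorch sketch of a weight-level variant, assuming illustrative parameter names (`targ_rate`, `drop_rate`) and a per-unit magnitude criterion; it is not the authors' implementation.

```python
import torch

def targeted_weight_dropout(weight, targ_rate=0.5, drop_rate=0.5, training=True):
    # Sketch of weight-level targeted dropout: within each output unit (row),
    # mark the lowest-magnitude `targ_rate` fraction of weights as candidates,
    # then zero each candidate independently with probability `drop_rate`.
    if not training:
        return weight
    out_features, in_features = weight.shape
    k = max(1, int(targ_rate * in_features))           # candidates per unit
    thresh = weight.abs().kthvalue(k, dim=1, keepdim=True).values
    candidates = weight.abs() <= thresh                 # low-magnitude weights
    drop_mask = candidates & (torch.rand_like(weight) < drop_rate)
    return weight * (~drop_mask).float()

# Example: mask a linear layer's weights before the forward pass, so this
# update's gradients only flow to the weights that were kept.
layer = torch.nn.Linear(16, 4)
masked_w = targeted_weight_dropout(layer.weight)
out = torch.nn.functional.linear(torch.randn(2, 16), masked_w, layer.bias)
```

Because dropped weights do not contribute to the forward pass, they receive zero gradient for that update, consistent with the description of computing gradients only for the remaining weights.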
no code implementations • 8 Apr 2019 • Divyam Madaan, Radhika Dua, Prerana Mukherjee, Brejesh Lall
Extensive experiments on data sources obtained in Delhi demonstrate that the proposed adaptive attention-based Bidirectional LSTM network outperforms several baseline classification and regression models.
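The entry above names an adaptive attention-based Bidirectional LSTM. The following is a hypothetical PyTorch sketch of a BiLSTM with additive attention pooling over time steps; the class name, layer sizes, and input dimensions are made up for illustration and are not the paper's model.

```python
import torch
import torch.nn as nn

class AttentiveBiLSTM(nn.Module):
    # Bidirectional LSTM whose hidden states are pooled with a learned
    # additive-attention layer, followed by a linear head that can serve
    # either a classification or a regression objective.
    def __init__(self, n_features, hidden_size=64, n_outputs=1):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size,
                            batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_size, 1)     # attention scorer
        self.head = nn.Linear(2 * hidden_size, n_outputs)

    def forward(self, x):                             # x: (batch, time, features)
        h, _ = self.lstm(x)                           # (batch, time, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # attention over time
        context = (weights * h).sum(dim=1)            # weighted sum of states
        return self.head(context)

# e.g. a batch of 8 sequences with 24 hourly steps and 5 sensor readings
model = AttentiveBiLSTM(n_features=5)
print(model(torch.randn(8, 24, 5)).shape)             # torch.Size([8, 1])
```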