1 code implementation • CVPR 2023 • A. Tuan Nguyen, Thanh Nguyen-Tang, Ser-Nam Lim, Philip H.S. Torr
Test-time adaptation offers a means to combat this problem: it allows the model to adapt to the new data distribution at test time, using only unlabeled batches of test data.
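To make the general idea concrete, here is a minimal, illustrative sketch of one common test-time adaptation objective — minimizing the prediction entropy of a classifier on an unlabeled test batch (in the spirit of entropy-minimization TTA methods, not the specific method of this paper). The linear softmax model and all names here are hypothetical.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def entropy_tta_step(W, x, lr=0.1):
    """One gradient step that lowers the mean prediction entropy on an
    unlabeled test batch, for a toy linear softmax classifier z = x @ W.
    Illustrative sketch only; real TTA methods adapt a deep network,
    often only its normalization-layer parameters."""
    p = softmax(x @ W)
    logp = np.log(p + 1e-12)
    H = -(p * logp).sum(axis=1, keepdims=True)   # per-sample entropy
    grad_z = -p * (logp + H)                     # d(mean entropy)/d(logits)
    W -= lr * x.T @ grad_z / len(x)              # descend on entropy
    return W, H.mean()
```

Calling `entropy_tta_step` repeatedly on the same unlabeled batch drives the model toward more confident predictions on that batch — the core mechanic the excerpt above describes, with no labels involved.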
no code implementations • 15 Mar 2022 • A. Tuan Nguyen, Ser Nam Lim, Philip Torr
To tackle this problem, a great deal of research has studied the training procedure of a network in order to improve its robustness.
1 code implementation • ICLR 2022 • Thanh Nguyen-Tang, Sunil Gupta, A. Tuan Nguyen, Svetha Venkatesh
Moreover, we show that our method is more computationally efficient and has a better dependence on the effective dimension of the neural network than an online counterpart.
no code implementations • 29 Sep 2021 • Bruno Andreis, Seanie Lee, A. Tuan Nguyen, Juho Lee, Eunho Yang, Sung Ju Hwang
Deep Learning algorithms are designed to operate on huge volumes of high dimensional data such as images.
1 code implementation • ICLR 2022 • A. Tuan Nguyen, Toan Tran, Yarin Gal, Philip H. S. Torr, Atılım Güneş Baydin
A common approach in the domain adaptation literature is to learn a representation of the input that has the same (marginal) distribution over the source and the target domain.
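A standard way to encourage that marginal alignment is to penalize a distribution distance between source and target features during training. Below is a small, self-contained sketch using the squared maximum mean discrepancy (MMD) with an RBF kernel — one common instance of the "common approach" the excerpt describes, not this paper's own method. All names are illustrative.

```python
import numpy as np

def rbf_mmd2(a, b, gamma=1.0):
    """Squared MMD between two feature batches under an RBF kernel.
    Added to the task loss (e.g. loss = task_loss + lam * rbf_mmd2(f_s, f_t)),
    it pushes source and target feature distributions to match."""
    def k(u, v):
        # pairwise squared distances via broadcasting, then RBF kernel
        d2 = ((u[:, None, :] - v[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    return k(a, a).mean() + k(b, b).mean() - 2.0 * k(a, b).mean()
```

When source and target features are drawn from the same distribution the penalty is near zero; a shift between them makes it grow, which is exactly the signal used to learn a domain-invariant representation.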
1 code implementation • NeurIPS 2021 • A. Tuan Nguyen, Toan Tran, Yarin Gal, Atılım Güneş Baydin
Domain generalization is the problem of training a model on data from a set of source domains so that it generalizes to unseen target domains.
no code implementations • 25 Jun 2020 • Bruno Andreis, Seanie Lee, A. Tuan Nguyen, Juho Lee, Eunho Yang, Sung Ju Hwang
Deep models are designed to operate on huge volumes of high dimensional data such as images.
2 code implementations • 23 Jun 2020 • A. Tuan Nguyen, Hyewon Jeong, Eunho Yang, Sung Ju Hwang
Existing asymmetric multi-task learning methods tackle this negative transfer problem by performing knowledge transfer from tasks with low loss to tasks with high loss.
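The asymmetry described above — reliable (low-loss) tasks acting as knowledge sources for struggling (high-loss) tasks — can be sketched with a simple loss-dependent weighting. This is an illustrative toy, not the paper's actual formulation; the function and parameter names are hypothetical.

```python
import numpy as np

def transfer_weights(task_losses, temperature=1.0):
    """Asymmetric source weights for knowledge transfer: tasks with LOWER
    training loss (more reliable) get HIGHER weight as teachers, via a
    softmax over negative losses. Illustrative sketch only."""
    losses = np.asarray(task_losses, dtype=float)
    scores = np.exp(-losses / temperature)  # low loss -> high score
    return scores / scores.sum()            # normalize to sum to 1
```

A struggling task's representation or predictions would then be regularized toward a weighted combination of the other tasks', with these weights deciding who teaches whom.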