no code implementations • ICLR Workshop LLD 2019 • Tyler Lee, Ting Gong, Suchismita Padhy, Andrew Rouditchenko, Anthony Ndirango
We demonstrate that, in scenarios with limited labeled training data, simultaneous training with additional self-supervised tasks can significantly improve the performance of each of three different supervised classification tasks, by up to 6%.
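The multitask setup described here, a shared encoder feeding both a supervised head and a self-supervised head, can be sketched with a combined objective. This is a minimal illustrative sketch, not the paper's implementation: the network sizes, the task-weighting coefficient `lam`, and the choice of a 4-way self-supervised label (e.g. rotation prediction) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax_xent(logits, label):
    # numerically stable cross-entropy for a single example
    z = logits - logits.max()
    logp = z - np.log(np.exp(z).sum())
    return -logp[label]

# hypothetical tiny setup: a shared 8-d encoder feeding two heads
x = rng.normal(size=4)           # input features
W_enc = rng.normal(size=(8, 4))  # shared encoder weights
h = np.tanh(W_enc @ x)           # shared representation

W_sup = rng.normal(size=(3, 8))  # supervised head (3 classes)
W_ssl = rng.normal(size=(4, 8))  # self-supervised head (e.g. 4 rotations)

sup_loss = softmax_xent(W_sup @ h, label=1)
ssl_loss = softmax_xent(W_ssl @ h, label=2)

lam = 0.5                        # assumed task-weighting coefficient
total_loss = sup_loss + lam * ssl_loss
```

Because both heads read the same representation `h`, gradients from the self-supervised loss shape the encoder even when supervised labels are scarce, which is the mechanism the abstract appeals to.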
no code implementations • NeurIPS 2019 • Tyler Lee, Anthony Ndirango
There has also been recent interest in extending these analyses to understand how multitask learning can further improve the generalization capacity of deep neural nets.
no code implementations • 1 Jan 2021 • Anthony Ndirango
It is all but certain that machine learning models based on deep neural networks will soon feature ubiquitously in a wide variety of critical products and services that people rely on.
no code implementations • 26 Aug 2021 • Landan Seguin, Anthony Ndirango, Neeli Mishra, SueYeon Chung, Tyler Lee
Motivated by a recent study that learned robustness without input perturbations by distilling an adversarially trained (AT) model, we explore what is learned during adversarial training by analyzing the distribution of logits in AT models.
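Analyzing the distribution of logits might look like the sketch below: collect a batch of logit vectors from each model and compare summary statistics. The specific statistics (max logit, top-1 vs top-2 margin, overall spread) and the stand-in Gaussian batches are illustrative assumptions, not the paper's actual analysis; a real study would use logits collected from standard and adversarially trained networks.

```python
import numpy as np

rng = np.random.default_rng(1)

def logit_summary(logits):
    """Summary statistics over a batch of logit vectors (n_examples, n_classes)."""
    top2 = np.sort(logits, axis=1)[:, -2:]
    return {
        "mean_max_logit": float(logits.max(axis=1).mean()),
        "mean_top1_top2_margin": float((top2[:, 1] - top2[:, 0]).mean()),
        "logit_std": float(logits.std()),
    }

# stand-in logit batches (1000 examples, 10 classes); the smaller scale
# for the AT batch encodes the assumption that AT models produce
# lower-magnitude, less confident logits
standard_logits = rng.normal(scale=4.0, size=(1000, 10))
at_logits = rng.normal(scale=1.5, size=(1000, 10))

summary_std = logit_summary(standard_logits)
summary_at = logit_summary(at_logits)
```

Comparing such summaries across models gives a perturbation-free window into what adversarial training changes about a network's output behavior.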