no code implementations • 30 Apr 2019 • Gokce Keskin, Tyler Lee, Cory Stephenson, Oguz H. Elibol
We present a CycleGAN-based many-to-many voice conversion method that can convert between speakers that are not in the training set.
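The core ingredient of CycleGAN-style conversion is a cycle-consistency penalty: mapping a sample to the target domain and back should recover the original. The paper's actual model and losses are not given here; the sketch below is a minimal, hypothetical illustration of an L1 cycle loss, with toy stand-in "generators" in place of real neural networks.

```python
import numpy as np

def cycle_consistency_loss(x, g_ab, g_ba):
    """L1 cycle loss: mean |g_ba(g_ab(x)) - x| over all elements.

    g_ab maps domain A -> B, g_ba maps B -> A. In a real CycleGAN-style
    system these would be learned networks; here they are placeholders.
    """
    x_reconstructed = g_ba(g_ab(x))
    return float(np.mean(np.abs(x_reconstructed - x)))

# Toy generators that happen to invert each other (hypothetical):
g_ab = lambda x: 2.0 * x
g_ba = lambda x: 0.5 * x

x = np.ones((4, 3))
loss = cycle_consistency_loss(x, g_ab, g_ba)  # perfect inversion gives 0.0
```

In training, this term is added to the adversarial losses so that conversion preserves content while changing speaker identity.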
no code implementations • ICLR Workshop LLD 2019 • Tyler Lee, Ting Gong, Suchismita Padhy, Andrew Rouditchenko, Anthony Ndirango
We demonstrate that, in scenarios with limited labeled training data, one can significantly improve the performance of three different supervised classification tasks individually by up to 6% through simultaneous training with these additional self-supervised tasks.
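Simultaneous training on supervised and self-supervised tasks is typically implemented as a weighted sum of per-task losses over a shared representation. The snippet below is a hedged sketch of that idea, not the paper's actual objective: the weighting `alpha` and the single-example cross-entropy helper are assumptions for illustration.

```python
import numpy as np

def cross_entropy(logits, label):
    """Numerically stable softmax cross-entropy for one example."""
    z = logits - logits.max()
    return float(-(z[label] - np.log(np.exp(z).sum())))

def joint_loss(sup_logits, sup_label, ssl_logits, ssl_label, alpha=0.5):
    """Supervised loss plus a weighted self-supervised auxiliary loss.

    alpha controls how strongly the self-supervised task shapes the
    shared features (hypothetical weighting, not from the paper).
    """
    return (cross_entropy(sup_logits, sup_label)
            + alpha * cross_entropy(ssl_logits, ssl_label))
```

With `alpha=0` this reduces to ordinary supervised training; increasing `alpha` lets the auxiliary tasks regularize the shared encoder when labeled data is scarce.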
no code implementations • NeurIPS 2019 • Tyler Lee, Anthony Ndirango
There has also been recent interest in extending these analyses to understand how multitask learning can further improve the generalization capacity of deep neural nets.
no code implementations • 26 Aug 2021 • Landan Seguin, Anthony Ndirango, Neeli Mishra, SueYeon Chung, Tyler Lee
Motivated by a recent study on learning robustness without input perturbations by distilling an AT model, we explore what is learned during adversarial training by analyzing the distribution of logits in AT models.
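Distilling an adversarially trained (AT) model means matching a student's output distribution to the teacher's softened logits rather than to hard labels. As a hedged illustration only (the paper's specific analysis and any temperature choice are not given here), the sketch below computes a temperature-scaled KL divergence between teacher and student logits, the standard distillation objective.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax, stabilized by subtracting the max."""
    z = logits / temperature
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def distillation_kl(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) over softened output distributions.

    The temperature value is a hypothetical choice for illustration.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return float(np.sum(p * (np.log(p) - np.log(q))))
```

Comparing such softened logit distributions between AT and standardly trained models is one concrete way to ask what adversarial training changes in a network's outputs.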
no code implementations • 26 Aug 2021 • Cory Stephenson, Tyler Lee
This model is based on the hypothesis that the training data contains features that are slow to learn but informative.