no code implementations • 29 Nov 2022 • Sushil Thapa
Deep learning has been the subject of growing interest in recent years.
no code implementations • 18 Oct 2022 • Sushil Thapa
In this work, we show that by using guidance and knowledge from a larger teacher network during fine-tuning, we can improve the student network to achieve better validation performance on metrics such as accuracy.
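The entry describes teacher-guided fine-tuning, which resembles standard knowledge distillation. Since no code accompanies the paper, the following is only a minimal sketch of a Hinton-style distillation loss; the function names and the temperature value are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as in the common distillation formulation.
    # (Illustrative sketch; the paper may combine this with a
    # hard-label cross-entropy term.)
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # student predictions
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T ** 2)
```

In practice this term would be minimized alongside the usual task loss, so the student matches both the ground-truth labels and the teacher's softened output distribution.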
1 code implementation • 15 May 2021 • Sunil Thulasidasan, Sushil Thapa, Sayera Dhaubhadel, Gopinath Chennupati, Tanmoy Bhattacharya, Jeff Bilmes
In this work, we present a simple but highly effective approach to out-of-distribution detection based on the principle of abstention: when encountering a sample from an unseen class, the desired behavior is to abstain from predicting.
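The abstract describes abstention-based out-of-distribution detection. A common realization (as in the deep-abstaining-classifier line of work) trains the network with one extra output unit for abstention; at inference time the model declines to predict when that unit's probability is high. The sketch below assumes that setup; the threshold value and function name are illustrative, not taken from the paper.

```python
import numpy as np

def predict_with_abstention(logits, abstain_index, threshold=0.5):
    # Softmax over K real classes plus one abstention class.
    # If the abstention class's probability exceeds the threshold,
    # the model abstains instead of predicting a class.
    z = np.asarray(logits, dtype=float)
    z = z - z.max()  # for numerical stability
    p = np.exp(z) / np.exp(z).sum()
    if p[abstain_index] > threshold:
        return "abstain"
    real = np.delete(p, abstain_index)  # drop the abstention class
    return int(np.argmax(real))
```

On an out-of-distribution input the trained network would route probability mass to the abstention unit, so the call returns "abstain" rather than a forced (and likely wrong) class label.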
Ranked #1 on Out-of-Distribution Detection on CIFAR-100 (using extra training data)
no code implementations • 1 Jan 2021 • Sunil Thulasidasan, Sushil Thapa, Sayera Dhaubhadel, Gopinath Chennupati, Tanmoy Bhattacharya, Jeff Bilmes