Search Results for author: Mitch Paul Mithun

Found 2 papers, 0 papers with code

Students Who Study Together Learn Better: On the Importance of Collective Knowledge Distillation for Domain Transfer in Fact Verification

no code implementations EMNLP 2021 Mitch Paul Mithun, Sandeep Suntwal, Mihai Surdeanu

While neural networks produce state-of-the-art performance in several NLP tasks, they generally depend heavily on lexicalized information, which transfers poorly between domains.

Fact Verification · Knowledge Distillation

Data and Model Distillation as a Solution for Domain-transferable Fact Verification

no code implementations NAACL 2021 Mitch Paul Mithun, Sandeep Suntwal, Mihai Surdeanu

While neural networks produce state-of-the-art performance in several NLP tasks, they generally depend heavily on lexicalized information, which transfers poorly between domains.

Fact Verification
