Search Results for author: Sandeep Suntwal

Found 3 papers, 0 papers with code

Students Who Study Together Learn Better: On the Importance of Collective Knowledge Distillation for Domain Transfer in Fact Verification

no code implementations EMNLP 2021 Mitch Paul Mithun, Sandeep Suntwal, Mihai Surdeanu

While neural networks produce state-of-the-art performance in several NLP tasks, they generally depend heavily on lexicalized information, which transfers poorly between domains.

Fact Verification, Knowledge Distillation

Data and Model Distillation as a Solution for Domain-transferable Fact Verification

no code implementations NAACL 2021 Mitch Paul Mithun, Sandeep Suntwal, Mihai Surdeanu

While neural networks produce state-of-the-art performance in several NLP tasks, they generally depend heavily on lexicalized information, which transfers poorly between domains.

Fact Verification

On the Importance of Delexicalization for Fact Verification

no code implementations IJCNLP 2019 Sandeep Suntwal, Mithun Paul, Rebecca Sharp, Mihai Surdeanu

As expected, even though this method achieves high accuracy when evaluated in the same domain, the performance in the target domain is poor, marginally above chance. To mitigate this dependence on lexicalized information, we experiment with several strategies for masking out names by replacing them with their semantic category, coupled with a unique identifier to mark that the same or new entities are referenced between claim and evidence.

Fact Verification, Natural Language Inference +2
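The masking strategy described in the abstract above can be sketched in a few lines: each named entity is replaced with its semantic category plus an index, and the same index is reused when the same entity recurs across claim and evidence. This is a minimal illustration only; the entity list, categories, and function names here are hypothetical, and the paper itself relies on NER output rather than a hand-written lookup.

```python
def delexicalize_pair(claim, evidence, entities):
    """Replace known entities in a claim/evidence pair with CATEGORY-k
    placeholders, reusing k when the same entity appears in both texts.

    `entities` maps entity surface forms to semantic categories
    (hypothetical example data, standing in for an NER system).
    """
    seen = {}      # entity -> assigned placeholder, shared across both texts
    counters = {}  # category -> next free index

    def substitute(text):
        for entity, category in entities.items():
            if entity in text:
                if entity not in seen:
                    counters[category] = counters.get(category, 0) + 1
                    seen[entity] = f"{category}-{counters[category]}"
                text = text.replace(entity, seen[entity])
        return text

    return substitute(claim), substitute(evidence)


entities = {"Marie Curie": "PERSON", "Warsaw": "LOCATION", "Paris": "LOCATION"}
claim = "Marie Curie was born in Warsaw."
evidence = "Marie Curie moved to Paris in 1891."
print(delexicalize_pair(claim, evidence, entities))
# ('PERSON-1 was born in LOCATION-1.', 'PERSON-1 moved to LOCATION-2 in 1891.')
```

Because the `seen` dictionary is shared between the two texts, a model trained on the masked pair can still tell whether the claim and evidence refer to the same entities, without memorizing domain-specific names.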
