TANDA: Transfer and Adapt Pre-Trained Transformer Models for Answer Sentence Selection

AAAI 2020 · Siddhant Garg, Thuy Vu, Alessandro Moschitti

We propose TANDA, an effective technique for fine-tuning pre-trained Transformer models for natural language tasks. Specifically, we first transfer a pre-trained model into a model for a general task by fine-tuning it with a large and high-quality dataset...
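The abstract describes a two-step procedure: first fine-tune ("transfer") on a large general dataset, then fine-tune again ("adapt") on the smaller target dataset, starting from the transferred weights. A minimal sketch of that structure is below, using a toy logistic-regression scorer in place of a real Transformer (an assumption made so the example runs without GPUs, downloads, or the paper's actual datasets; the synthetic data, function names, and hyperparameters are all illustrative):

```python
import math
import random

def sgd_finetune(weights, data, lr=0.1, epochs=20):
    """One fine-tuning stage: logistic regression trained with plain SGD.

    Stands in for a Transformer fine-tuning pass; the key point is that
    each stage starts from the weights produced by the previous stage.
    """
    w = list(weights)
    for _ in range(epochs):
        for x, y in data:
            z = sum(wi * xi for wi, xi in zip(w, x))
            p = 1.0 / (1.0 + math.exp(-z))          # sigmoid score
            for i, xi in enumerate(x):
                w[i] -= lr * (p - y) * xi           # logistic-loss gradient
    return w

random.seed(0)
# Step 1 (transfer): large, general relevance data (synthetic here).
general = [([1.0, random.gauss(y, 0.3)], y) for y in (0, 1) for _ in range(200)]
# Step 2 (adapt): small target-domain data with a shifted decision boundary.
target = [([1.0, random.gauss(0.5 + y, 0.2)], y) for y in (0, 1) for _ in range(20)]

w0 = [0.0, 0.0]                                  # "pre-trained" starting point
w_transfer = sgd_finetune(w0, general)           # TANDA step 1: transfer
w_tanda = sgd_finetune(w_transfer, target)       # TANDA step 2: adapt

def accuracy(w, data):
    return sum((sum(wi * xi for wi, xi in zip(w, x)) > 0) == bool(y)
               for x, y in data) / len(data)

print(round(accuracy(w_tanda, target), 2))
```

The design point being illustrated is only the sequencing: the adapt stage does not restart from scratch but from the weights the transfer stage produced, which is what lets a small target dataset suffice.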


Evaluation Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Question Answering | TrecQA | TANDA-RoBERTa (ASNQ, TREC-QA) | MAP | 0.943 | #1 |
| Question Answering | TrecQA | TANDA-RoBERTa (ASNQ, TREC-QA) | MRR | 0.974 | #1 |
| Question Answering | WikiQA | TANDA-RoBERTa (ASNQ, WikiQA) | MAP | 0.920 | #1 |
| Question Answering | WikiQA | TANDA-RoBERTa (ASNQ, WikiQA) | MRR | 0.933 | #1 |