Search Results for author: Mingye Gao

Found 2 papers, 2 papers with code

Cooperative Self-training of Machine Reading Comprehension

1 code implementation NAACL 2022 Hongyin Luo, Shang-Wen Li, Mingye Gao, Seunghak Yu, James Glass

Pretrained language models have significantly improved the performance of downstream language understanding tasks, including extractive question answering, by providing high-quality contextualized word embeddings.

Tasks: Machine Reading Comprehension, Pretrained Language Models, +5 more
