28 papers with code • 2 benchmarks • 1 dataset
The cloze task refers to infilling individual words that have been removed from a passage. These leaderboards are used to track progress on the Cloze Test task.
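The cloze setup above can be made concrete with a minimal sketch: an item pairs a passage containing a blank with candidate fillers, and a system is scored by how often it picks the gold filler. The `ClozeItem` class, the example passages, and the first-option baseline below are illustrative assumptions, not drawn from any of the datasets listed here.

```python
from dataclasses import dataclass

@dataclass
class ClozeItem:
    passage: str        # passage with a "___" blank
    options: list[str]  # candidate fillers for the blank
    answer: str         # gold filler

def evaluate(predict, items):
    """Accuracy of a predict(passage, options) -> str function."""
    correct = sum(predict(it.passage, it.options) == it.answer for it in items)
    return correct / len(items)

items = [
    ClozeItem("The cat sat on the ___.", ["mat", "sky", "idea"], "mat"),
    ClozeItem("She poured a glass of ___.", ["gravel", "water", "music"], "water"),
]

# Trivial baseline: always pick the first option. A real system would
# instead score each option with a language model and take the argmax.
baseline = lambda passage, options: options[0]
print(evaluate(baseline, items))  # → 0.5
```

The same interface covers word-level cloze (as defined above) and variants like idiom or story-ending selection, since only the granularity of the blank and the candidate set change.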
Most implemented papers
Bidirectional Attention Flow for Machine Comprehension
Machine comprehension (MC), answering a query about a given context paragraph, requires modeling complex interactions between the context and the query.
Improving Language Understanding by Generative Pre-Training
We demonstrate that large gains on these tasks can be realized by generative pre-training of a language model on a diverse corpus of unlabeled text, followed by discriminative fine-tuning on each specific task.
ERNIE: Enhanced Representation through Knowledge Integration
We present a novel language representation model enhanced by knowledge called ERNIE (Enhanced Representation through kNowledge IntEgration).
CPM: A Large-scale Generative Chinese Pre-trained Language Model
However, applying GPT-3 to address Chinese NLP tasks is still challenging, as the training corpus of GPT-3 is primarily English, and the parameters are not publicly available.
CodeXGLUE: A Machine Learning Benchmark Dataset for Code Understanding and Generation
Benchmark datasets have a significant impact on accelerating research in programming language tasks.
Large-scale Cloze Test Dataset Created by Teachers
Cloze tests are widely adopted in language exams to evaluate students' language proficiency.
ChID: A Large-scale Chinese IDiom Dataset for Cloze Test
Cloze-style reading comprehension in Chinese remains limited by the lack of diverse corpora.
LSDSem 2017: Exploring Data Generation Methods for the Story Cloze Test
The Story Cloze test is a recent effort to provide a common test scenario for text understanding systems.
Narrative Modeling with Memory Chains and Semantic Supervision
Story comprehension requires a deep semantic understanding of the narrative, making it a challenging task.
Chengyu Cloze Test
We present a neural recommendation model for Chengyu, which is a special type of Chinese idiom.