Sentence Completion
16 papers with code • 1 benchmark • 1 dataset
Most implemented papers
Language Models are Few-Shot Learners
By contrast, humans can generally perform a new language task from only a few examples or from simple instructions - something which current NLP systems still largely struggle to do.
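As an illustration of the few-shot setup on this task (a minimal sketch, not taken from the paper; build_few_shot_prompt and the commented-out generate call are hypothetical), a Sentence Completion item can be posed by packing a handful of solved examples into the prompt and letting the model continue the text:

def build_few_shot_prompt(examples, query):
    """Concatenate solved (sentence, answer) pairs before the unsolved query."""
    blocks = [f"Sentence: {s}\nMissing word: {a}" for s, a in examples]
    blocks.append(f"Sentence: {query}\nMissing word:")
    return "\n\n".join(blocks)

examples = [
    ("The chef tasted the soup and added more ___.", "salt"),
    ("She opened her umbrella because it started to ___.", "rain"),
]
prompt = build_few_shot_prompt(examples, "He unlocked the door with his ___.")
# completion = generate(prompt)  # hypothetical call to any large language model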
Finetuned Language Models Are Zero-Shot Learners
We show that instruction tuning -- finetuning language models on a collection of tasks described via instructions -- substantially improves zero-shot performance on unseen tasks.
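A minimal sketch of what an instruction-formatted training example for this task could look like (my own illustration, not the paper's data pipeline; the field names are assumptions):

def to_instruction_example(sentence_with_blank, options, answer):
    """Rephrase a multiple-choice completion item as a natural-language instruction."""
    prompt = (
        "Fill in the blank with the most plausible option.\n"
        f"Sentence: {sentence_with_blank}\n"
        f"Options: {', '.join(options)}\n"
        "Answer:"
    )
    return {"input": prompt, "target": answer}

example = to_instruction_example(
    "He unlocked the door with his ___.",
    ["key", "spoon", "cloud"],
    "key",
)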
Recurrent Memory Networks for Language Modeling
In this paper, we propose the Recurrent Memory Network (RMN), a novel RNN architecture that not only amplifies the power of the RNN but also facilitates our understanding of its internal functioning and allows us to discover underlying patterns in data.
Muppet: Massive Multi-task Representations with Pre-Finetuning
We propose pre-finetuning, an additional large-scale learning stage between language model pre-training and fine-tuning.
Dependency Recurrent Neural Language Models for Sentence Completion
Recent work on language modelling has shifted focus from count-based models to neural models.
Top-down Tree Long Short-Term Memory Networks
Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have been successfully applied to a variety of sequence modeling tasks.
A Deep Architecture for Semantic Matching with Multiple Positional Sentence Representations
Our model has several advantages: (1) by using a Bi-LSTM, rich context from the whole sentence is leveraged to capture the contextualized local information in each positional sentence representation; (2) by matching against multiple positional sentence representations, the model can flexibly aggregate the important contextualized local information in a sentence to support the matching; (3) experiments on tasks such as question answering and sentence completion demonstrate the superiority of our model.
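A rough sketch of the matching idea described above, assuming a cosine interaction and k-max pooling (a simplification of the paper's interaction layer; class name, dimensions, and hyperparameters are illustrative):

import torch
import torch.nn as nn
import torch.nn.functional as F

class PositionalMatcher(nn.Module):
    """Score a sentence pair by comparing their positional Bi-LSTM representations."""
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=64, top_k=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.top_k = top_k
        self.scorer = nn.Linear(top_k, 1)

    def forward(self, sent_a, sent_b):
        # One positional representation per token, with whole-sentence context from the Bi-LSTM.
        ha, _ = self.bilstm(self.embed(sent_a))   # (batch, len_a, 2 * hidden_dim)
        hb, _ = self.bilstm(self.embed(sent_b))   # (batch, len_b, 2 * hidden_dim)
        # Cosine interaction between every pair of positions.
        sim = torch.einsum('bid,bjd->bij', F.normalize(ha, dim=-1), F.normalize(hb, dim=-1))
        # Keep the strongest local matches (k-max pooling) and reduce them to a score.
        topk, _ = sim.flatten(1).topk(self.top_k, dim=1)
        return self.scorer(topk).squeeze(-1)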
Learning Semantically and Additively Compositional Distributional Representations
This paper connects a vector-based composition model to a formal semantics, the Dependency-based Compositional Semantics (DCS).
CODAH: An Adversarially-Authored Question Answering Dataset for Common Sense
To produce a more difficult dataset, we introduce a novel procedure for question acquisition in which workers author questions designed to target weaknesses of state-of-the-art neural question answering systems.