Grammatical Error Correction
134 papers with code • 13 benchmarks • 16 datasets
Grammatical Error Correction (GEC) is the task of correcting different kinds of errors in text such as spelling, punctuation, grammatical, and word choice errors.
GEC is typically formulated as a sentence correction task. A GEC system takes a potentially erroneous sentence as input and is expected to transform it into its corrected version. See the example given below:
| Input (Erroneous) | Output (Corrected) |
|---|---|
| She see Tom is catched by policeman in park at last night. | She saw Tom caught by a policeman in the park last night. |
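In code, this sentence-in, sentence-out formulation maps directly onto a sequence-to-sequence interface. The snippet below is a minimal inference sketch using the Hugging Face `transformers` text2text pipeline; the model name is a placeholder for whatever fine-tuned GEC checkpoint you have available, not a specific recommendation.

```python
from transformers import pipeline

# Placeholder checkpoint: substitute any seq2seq model fine-tuned for GEC.
corrector = pipeline("text2text-generation", model="your-org/your-gec-model")

result = corrector("She see Tom is catched by policeman in park at last night.")
print(result[0]["generated_text"])  # expected: the corrected sentence
```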
Most implemented papers
Improving Grammatical Error Correction via Pre-Training a Copy-Augmented Architecture with Unlabeled Data
This is the first time that copying words from the source context and fully pre-training a sequence-to-sequence model have been explored together for the GEC task.
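The copy mechanism referenced here belongs to the pointer-generator family: the decoder mixes its vocabulary distribution with an attention-derived distribution over source tokens. The sketch below is not the paper's exact architecture, only a minimal illustration of that mixing step; `p_gen` stands for the learned copy/generate gate.

```python
import torch
import torch.nn.functional as F

def mix_generate_and_copy(vocab_logits, attn_weights, src_token_ids, p_gen):
    """Blend the decoder's vocabulary distribution with a copy distribution
    projected from attention over the source tokens (pointer-generator style).

    vocab_logits:  (batch, vocab_size) raw decoder scores
    attn_weights:  (batch, src_len)    attention over source positions
    src_token_ids: (batch, src_len)    vocabulary ids of the source tokens
    p_gen:         (batch, 1)          learned gate in [0, 1]
    """
    vocab_dist = F.softmax(vocab_logits, dim=-1)
    copy_dist = torch.zeros_like(vocab_dist)
    # Add the attention mass onto the vocabulary ids of the source tokens.
    copy_dist.scatter_add_(1, src_token_ids, attn_weights)
    return p_gen * vocab_dist + (1.0 - p_gen) * copy_dist
```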
A Multilayer Convolutional Encoder-Decoder Neural Network for Grammatical Error Correction
We improve automatic correction of grammatical, orthographic, and collocation errors in text using a multilayer convolutional encoder-decoder neural network.
GECToR -- Grammatical Error Correction: Tag, Not Rewrite
In this paper, we present a simple and efficient GEC sequence tagger using a Transformer encoder.
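The tagging formulation replaces full rewriting with per-token edit tags that are applied in a post-processing step. The helper below is a simplified sketch of that application step; GECToR's actual tag vocabulary (e.g. `$KEEP`, `$DELETE`, `$APPEND_{t}`, plus grammatical transformations) is richer than the plain tags assumed here.

```python
def apply_edit_tags(tokens, tags):
    """Apply simplified per-token edit tags (KEEP / DELETE / REPLACE_x / APPEND_x)
    predicted by a sequence tagger to reconstruct the corrected sentence."""
    out = []
    for token, tag in zip(tokens, tags):
        if tag == "DELETE":
            continue
        if tag.startswith("REPLACE_"):
            out.append(tag[len("REPLACE_"):])
        elif tag.startswith("APPEND_"):
            out.extend([token, tag[len("APPEND_"):]])
        else:  # KEEP or any unrecognized tag
            out.append(token)
    return " ".join(out)

print(apply_edit_tags(
    ["She", "see", "Tom", "in", "park", "."],
    ["KEEP", "REPLACE_saw", "KEEP", "APPEND_the", "KEEP", "KEEP"]))
# -> She saw Tom in the park .
```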
Neural Network Translation Models for Grammatical Error Correction
Phrase-based statistical machine translation (SMT) systems have previously been used for the task of grammatical error correction (GEC) to achieve state-of-the-art accuracy.
The Unreasonable Effectiveness of Transformer Language Models in Grammatical Error Correction
Recent work on Grammatical Error Correction (GEC) has highlighted the importance of language modeling: good performance can be achieved simply by comparing the probabilities of proposed edits.
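A minimal sketch of the underlying idea: score each candidate correction with an off-the-shelf language model and keep the most probable one. This uses GPT-2 via `transformers` as a stand-in scorer, not the models or edit-proposal machinery used in the paper.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def sentence_log_prob(text):
    """Total log-probability of a sentence under GPT-2 (higher = more fluent)."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        loss = model(ids, labels=ids).loss  # mean negative log-likelihood per predicted token
    return -loss.item() * (ids.shape[1] - 1)

candidates = [
    "She see Tom in the park.",   # original sentence
    "She saw Tom in the park.",   # proposed edit
]
print(max(candidates, key=sentence_log_prob))
```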
A Neural Grammatical Error Correction System Built On Better Pre-training and Sequential Transfer Learning
The resulting parallel corpora are subsequently used to pre-train Transformer models.
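The parallel corpora mentioned here are synthetic: clean text is corrupted to produce (noisy, clean) pairs for pre-training. The paper has its own noising scheme; the function below is only a generic sketch of the idea, using random deletions, adjacent swaps, and duplications.

```python
import random

def corrupt(tokens, p_drop=0.05, p_swap=0.05, p_dup=0.05):
    """Inject simple synthetic noise into a clean token sequence so that
    (corrupt(sentence), sentence) can serve as a pseudo-parallel training pair."""
    out = list(tokens)
    i = 0
    while i < len(out):
        r = random.random()
        if r < p_drop and len(out) > 1:
            del out[i]                                   # drop a token
            continue
        if r < p_drop + p_swap and i + 1 < len(out):
            out[i], out[i + 1] = out[i + 1], out[i]      # swap with the next token
            i += 1                                       # skip past the swapped pair
        elif r < p_drop + p_swap + p_dup:
            out.insert(i, out[i])                        # duplicate the token
            i += 1
        i += 1
    return out

print(corrupt("she saw tom in the park".split()))
```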
Stronger Baselines for Grammatical Error Correction Using Pretrained Encoder-Decoder Model
In this study, we explore the utility of bidirectional and auto-regressive transformers (BART) as a generic pretrained encoder-decoder model for GEC.
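For a concrete picture of what "BART as a generic pretrained encoder-decoder" means in code, the snippet below loads `facebook/bart-base` through `transformers` and runs beam-search generation. Without GEC fine-tuning the output is not a correction; the point is only the encoder-decoder interface such a system builds on.

```python
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# After fine-tuning on (erroneous, corrected) pairs, generate() would produce corrections.
inputs = tokenizer("She see Tom is catched by policeman in park at last night.",
                   return_tensors="pt")
output_ids = model.generate(**inputs, num_beams=4, max_length=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```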
A Simple Recipe for Multilingual Grammatical Error Correction
This paper presents a simple recipe to train state-of-the-art multilingual Grammatical Error Correction (GEC) models.
LM-Critic: Language Models for Unsupervised Grammatical Error Correction
Training a model for grammatical error correction (GEC) requires a set of labeled ungrammatical / grammatical sentence pairs, but manually annotating such pairs can be expensive.
Automatic Error Type Annotation for Arabic
We present ARETA, an automatic error type annotation system for Modern Standard Arabic.