Grammatical Error Detection
17 papers with code • 4 benchmarks • 4 datasets
Grammatical Error Detection (GED) is the task of detecting different kinds of errors in text, such as spelling, punctuation, grammatical, and word-choice errors. GED is a key component of grammatical error correction (GEC) systems.
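GED is commonly framed as binary token-level sequence labeling: each token in a learner sentence is labeled correct or incorrect. A minimal illustrative sketch (not from any listed paper) derives such gold labels by position-wise comparison against a corrected sentence; real systems must predict the labels from the learner sentence alone.

```python
# Illustrative sketch: GED as binary token-level sequence labeling.
# Gold labels are derived here by comparing a learner sentence to its
# correction; this toy alignment assumes equal-length token sequences
# (i.e. substitution errors only).

def ged_labels(learner_tokens, corrected_tokens):
    """Label each learner token 'c' (correct) or 'i' (incorrect)."""
    if len(learner_tokens) != len(corrected_tokens):
        raise ValueError("this sketch handles substitution errors only")
    return ["c" if a == b else "i"
            for a, b in zip(learner_tokens, corrected_tokens)]

learner = "She go to school yesterday".split()
correct = "She went to school yesterday".split()
print(ged_labels(learner, correct))  # marks 'go' as the error token
```

A trained GED model replaces the comparison step with a per-token classifier over the learner sentence, but the label scheme is the same.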
Most implemented papers
Semi-supervised Multitask Learning for Sequence Labeling
We propose a sequence labeling framework with a secondary training objective, learning to predict surrounding words for every word in the dataset.
Jointly Learning to Label Sentences and Tokens
Learning to construct text representations in end-to-end systems can be difficult, as natural languages are highly compositional and task-specific annotated datasets are often limited in size.
FCGEC: Fine-Grained Corpus for Chinese Grammatical Error Correction
Grammatical Error Correction (GEC) has recently been broadly applied in automatic correction and proofreading systems.
Bangla Grammatical Error Detection Using T5 Transformer Model
This paper presents a method for detecting grammatical errors in Bangla with a Text-to-Text Transfer Transformer (T5) language model: the small variant of BanglaT5, fine-tuned on a corpus of 9,385 sentences in which errors are bracketed by a dedicated demarcation symbol.
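In this text-to-text formulation, the model's target output is the input sentence with erroneous spans wrapped in the demarcation symbol. The sketch below shows how such targets could be constructed; the `$` marker, the `mark_errors` helper, and the span indices are assumptions for illustration, since the paper's actual symbol and preprocessing are not detailed here.

```python
# Hypothetical sketch of a bracketed GED target string: erroneous token
# spans are wrapped in a demarcation symbol ('$' is an assumed placeholder,
# not necessarily the symbol used in the paper).

def mark_errors(tokens, error_spans, marker="$"):
    """Wrap each half-open (start, end) token span in the marker."""
    out, i = [], 0
    for start, end in sorted(error_spans):
        out.extend(tokens[i:start])
        out.append(marker + " ".join(tokens[start:end]) + marker)
        i = end
    out.extend(tokens[i:])
    return " ".join(out)

print(mark_errors("She go to school".split(), [(1, 2)]))
# → "She $go$ to school"
```

A seq2seq model like T5 is then fine-tuned to map the plain sentence to this bracketed form, so decoding the marked spans recovers the detected errors.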
Grammatical Error Detection Using Error- and Grammaticality-Specific Word Embeddings
In this study, we improve grammatical error detection by learning word embeddings that consider grammaticality and error patterns.
Wronging a Right: Generating Better Errors to Improve Grammatical Error Detection
Grammatical error correction, like other machine learning tasks, greatly benefits from large quantities of high quality training data, which is typically expensive to produce.
Sequence Classification with Human Attention
Learning attention functions requires large volumes of data, but many NLP tasks simulate human behavior; in this paper, we show that human attention provides a good inductive bias for attention functions across a range of NLP tasks.
Detecting Local Insights from Global Labels: Supervised & Zero-Shot Sequence Labeling via a Convolutional Decomposition
From this sequence-labeling layer we derive dense representations of the input that can then be matched to instances from training, or a support set with known labels.
Context is Key: Grammatical Error Detection with Contextual Word Representations
Grammatical error detection (GED) in non-native writing requires systems to identify a wide range of errors in text written by language learners.