Improving Seq2Seq Grammatical Error Correction via Decoding Interventions

23 Oct 2023  ·  Houquan Zhou, Yumeng Liu, Zhenghua Li, Min Zhang, Bo Zhang, Chen Li, Ji Zhang, Fei Huang ·

The sequence-to-sequence (Seq2Seq) approach has recently been widely used in grammatical error correction (GEC) and shows promising performance. However, the Seq2Seq GEC approach still suffers from two issues. First, a Seq2Seq GEC model can only be trained on parallel data, which, in the GEC task, is often noisy and limited in quantity. Second, the decoder of a Seq2Seq GEC model lacks an explicit awareness of the correctness of the token being generated. In this paper, we propose a unified decoding intervention framework that employs an external critic to incrementally assess the appropriateness of each token to be generated, and then dynamically influences the choice of the next token. We discover and investigate two types of critics: a pre-trained left-to-right language model critic and an incremental target-side grammatical error detector critic. Through extensive experiments on English and Chinese datasets, our framework consistently outperforms strong baselines and achieves results competitive with state-of-the-art methods.
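The core idea of a decoding intervention can be sketched as combining the Seq2Seq model's per-token scores with an external critic's scores at each decoding step. The snippet below is a minimal illustration, not the authors' implementation: the toy log-probabilities and the interpolation weight `alpha` are assumptions chosen for demonstration.

```python
import numpy as np

# Hypothetical per-token log-probabilities at one decoding step.
# Token 2 is the Seq2Seq model's top choice, but the critic
# (e.g., a left-to-right LM) considers it implausible.
model_logp = np.array([-2.0, -1.5, -0.5, -3.0])
critic_logp = np.array([-1.0, -0.8, -4.0, -2.5])

def intervened_choice(model_logp, critic_logp, alpha=0.5):
    # Interpolate model and critic scores in log space; alpha is an
    # assumed knob controlling how strongly the critic steers decoding.
    return int(np.argmax(model_logp + alpha * critic_logp))

# Without intervention, the model would pick token 2; with the critic's
# input, decoding shifts to token 1.
print(intervened_choice(model_logp, critic_logp))
```

In practice, the critic (a language model or an incremental error detector) would be run on the partial hypothesis at every step, rather than on fixed scores as here.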


Results from the Paper


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Grammatical Error Correction | BEA-2019 (test) | GEC-DI (LM+GED) | F0.5 | 73.1 | #9 |
| Grammatical Error Correction | CoNLL-2014 Shared Task | GEC-DI (LM+GED) | F0.5 | 69.6 | #4 |
| Grammatical Error Correction | CoNLL-2014 Shared Task | GEC-DI (LM+GED) | Precision | 79.2 | #5 |
| Grammatical Error Correction | CoNLL-2014 Shared Task | GEC-DI (LM+GED) | Recall | 46.8 | #4 |
| Grammatical Error Correction | MuCGEC | GEC-DI (LM+GED) | F0.5 | 48.61 | #1 |
