1 code implementation • COLING 2022 • Zhongyuan Wang, YiXuan Wang, Shaolei Wang, Wanxiang Che
Supervised methods have achieved remarkable results in disfluency detection.
1 code implementation • 14 Nov 2023 • Kunting Li, Yong Hu, Shaolei Wang, Hanhan Ma, Liang He, Fandong Meng, Jie Zhou
However, in the Chinese Spelling Correction (CSC) task, we observe a discrepancy: while ChatGPT performs well under human evaluation, it scores poorly according to traditional metrics.
1 code implementation • EMNLP 2020 • Shaolei Wang, Zhongyuan Wang, Wanxiang Che, Ting Liu
Most existing approaches to disfluency detection heavily rely on human-annotated corpora, which are expensive to obtain in practice.
no code implementations • 1 Oct 2020 • Shaolei Wang, Baoxin Wang, Jiefu Gong, Zhongyuan Wang, Xiao Hu, Xingyi Duan, Zizhuo Shen, Gang Yue, Ruiji Fu, Dayong Wu, Wanxiang Che, Shijin Wang, Guoping Hu, Ting Liu
Grammatical error diagnosis is an important task in natural language processing.
no code implementations • 15 Aug 2019 • Shaolei Wang, Wanxiang Che, Qi Liu, Pengda Qin, Ting Liu, William Yang Wang
The pre-trained network is then fine-tuned using human-annotated disfluency detection training data.
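The snippet below is a minimal sketch (not the authors' released code) of the general fine-tuning step described here: a pre-trained encoder is adapted to token-level disfluency detection framed as binary sequence labeling. The model name, example sentence, and labels are illustrative placeholders.

```python
# Hypothetical sketch: fine-tune a pre-trained encoder for token-level
# disfluency detection (0 = fluent, 1 = disfluent). Not the authors' code.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

words  = ["i", "want", "a", "flight", "to", "boston", "uh", "to", "denver"]
labels = [0,   0,      0,   0,        1,    1,        1,    0,    0]

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Align word-level labels to sub-word tokens; special tokens get -100 (ignored).
word_ids = enc.word_ids(batch_index=0)
token_labels = [labels[w] if w is not None else -100 for w in word_ids]
enc["labels"] = torch.tensor([token_labels])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
out = model(**enc)      # token-level cross-entropy loss
out.loss.backward()
optimizer.step()
```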
1 code implementation • EMNLP 2017 • Shaolei Wang, Wanxiang Che, Yue Zhang, Meishan Zhang, Ting Liu
In this paper, we model the problem of disfluency detection using a transition-based framework, which incrementally constructs and labels the disfluency chunks of input sentences with a new transition system that requires no syntax information.
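For intuition only, here is a toy illustration of processing a sentence left to right and assigning each token a transition action. The action set and the hand-written decision rule are assumptions made for demonstration; in the paper, actions are predicted by a learned model.

```python
# Toy sketch of incremental, transition-style disfluency labeling.
# FILLERS and the decision rule are hypothetical stand-ins for a classifier.
FILLERS = {"uh", "um"}

def detect_disfluencies(tokens):
    """Scan tokens left to right, choosing OUT (keep) or DEL (disfluent)."""
    output, trace = [], []
    for i, tok in enumerate(tokens):
        repeated = i + 1 < len(tokens) and tokens[i + 1] == tok
        action = "DEL" if tok in FILLERS or repeated else "OUT"
        trace.append((tok, action))
        if action == "OUT":
            output.append(tok)
    return output, trace

fluent, trace = detect_disfluencies(
    "i want a a flight to boston uh to denver".split())
print(trace)
print(" ".join(fluent))  # "i want a flight to boston to denver"
```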
no code implementations • COLING 2016 • Shaolei Wang, Wanxiang Che, Ting Liu
We treat disfluency detection as a sequence-to-sequence problem and propose a neural attention-based model which can efficiently model the long-range dependencies between words and make the resulting sentence more likely to be grammatically correct.
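As a rough sketch of the sequence-to-sequence formulation (with assumed dimensions and architecture choices, not the paper's exact model), an attention-based encoder-decoder reads the disfluent sentence and generates its fluent version token by token:

```python
# Minimal attention-based encoder-decoder sketch; hyperparameters, GRU cells,
# and the additive attention here are illustrative assumptions.
import torch
import torch.nn as nn

class Seq2SeqAttention(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRUCell(emb_dim + hid_dim, hid_dim)
        self.attn = nn.Linear(hid_dim * 2, 1)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src, tgt):
        enc_out, h = self.encoder(self.embed(src))   # enc_out: (B, S, H)
        h = h.squeeze(0)                              # decoder state: (B, H)
        logits = []
        for t in range(tgt.size(1)):
            # Additive attention: score every source position against the state.
            scores = self.attn(torch.cat(
                [enc_out, h.unsqueeze(1).expand_as(enc_out)], dim=-1)).squeeze(-1)
            ctx = (scores.softmax(dim=-1).unsqueeze(-1) * enc_out).sum(dim=1)
            h = self.decoder(torch.cat([self.embed(tgt[:, t]), ctx], dim=-1), h)
            logits.append(self.out(h))
        return torch.stack(logits, dim=1)             # (B, T, vocab)

model = Seq2SeqAttention(vocab_size=1000)
src = torch.randint(0, 1000, (2, 9))   # disfluent input tokens
tgt = torch.randint(0, 1000, (2, 7))   # fluent targets (teacher forcing)
print(model(src, tgt).shape)           # torch.Size([2, 7, 1000])
```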