1 code implementation • 3 Nov 2023 • Nan Zhang, Yusen Zhang, Wu Guo, Prasenjit Mitra, Rui Zhang
In this paper, we investigate and improve faithfulness in summarization on a broad range of medical summarization tasks.
no code implementations • 6 May 2023 • Beiduo Chen, Shaohan Huang, Zihan Zhang, Wu Guo, ZhenHua Ling, Haizhen Huang, Furu Wei, Weiwei Deng, Qi Zhang
In addition, two self-correction courses are proposed to bridge the gap between the two encoders by creating a "correction notebook" for secondary supervision.
1 code implementation • 28 Feb 2023 • Zhijie Shen, Wu Guo, Bin Gu
In this paper, we propose a language-universal adapter learning framework based on a pre-trained model for end-to-end multilingual automatic speech recognition (ASR).
Automatic Speech Recognition (ASR) +2
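A language-universal adapter of the kind described above is commonly realized as a small bottleneck module inserted into a frozen pre-trained encoder. The sketch below is a minimal, illustrative version (the layer sizes, weight names, and ReLU choice are assumptions, not the paper's actual architecture):

```python
import numpy as np

def adapter(x, w_down, w_up):
    """Bottleneck adapter: project down, apply ReLU, project up, add residual."""
    h = np.maximum(x @ w_down, 0.0)   # down-projection + ReLU nonlinearity
    return x + h @ w_up               # up-projection + residual connection

rng = np.random.default_rng(0)
d_model, d_bottleneck = 256, 32
x = rng.standard_normal((4, d_model))                  # a batch of encoder states
w_down = rng.standard_normal((d_model, d_bottleneck)) * 0.02
w_up = rng.standard_normal((d_bottleneck, d_model)) * 0.02

y = adapter(x, w_down, w_up)
print(y.shape)  # the adapter preserves the feature dimension
```

Because of the residual connection, a near-zero-initialized adapter leaves the frozen encoder's representations almost unchanged at the start of training, which is why adapters can be added per language without disturbing the shared model.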
1 code implementation • 7 Dec 2022 • Jun-Yu Ma, Beiduo Chen, Jia-Chen Gu, Zhen-Hua Ling, Wu Guo, Quan Liu, Zhigang Chen, Cong Liu
In this study, a mixture of short-channel distillers (MSD) method is proposed to fully exploit the rich hierarchical information in the teacher model and to transfer knowledge to the student model sufficiently and efficiently.
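At the core of any distillation scheme like this is a loss that pulls the student's predictive distribution toward the teacher's. A minimal sketch of a standard temperature-softened KL distillation loss (the temperature value and function names are illustrative, not the paper's exact objective):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax, numerically stabilized."""
    z = z / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Mean KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as is conventional in knowledge distillation."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=-1)
    return float(kl.mean()) * T * T
```

When the student's logits match the teacher's exactly, the loss is zero; otherwise it is strictly positive, so minimizing it drives the student toward the teacher's soft targets.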
no code implementations • 17 May 2022 • Beiduo Chen, Wu Guo, Quan Liu, Kun Tao
Multilingual BERT (mBERT), a language model pre-trained on large multilingual corpora, has impressive zero-shot cross-lingual transfer capabilities and performs surprisingly well on zero-shot POS tagging and Named Entity Recognition (NER), as well as on cross-lingual model transfer.
1 code implementation • SemEval (NAACL) 2022 • Beiduo Chen, Jun-Yu Ma, Jiajun Qi, Wu Guo, Zhen-Hua Ling, Quan Liu
The proposed method is applied to several state-of-the-art Transformer-based NER models with a gazetteer built from Wikidata, and shows great generalization ability across them.
no code implementations • 26 Feb 2022 • Beiduo Chen, Wu Guo, Bin Gu, Quan Liu, Yongchao Wang
Cross-language pre-trained models such as multilingual BERT (mBERT) have achieved significant performance in various cross-lingual downstream NLP tasks.
no code implementations • 16 Jun 2021 • Tan Liu, Wu Guo, Bin Gu
In this paper, instead of using the ASR transcripts, the fusion of deep acoustic and linguistic features is used for topic classification on spoken documents.
Automatic Speech Recognition (ASR) +3
no code implementations • 1 Apr 2021 • Jiajun Qi, Wu Guo, Bin Gu
In this paper, we propose a novel bidirectional multiscale feature aggregation (BMFA) network with attentional fusion modules for text-independent speaker verification.
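An attentional fusion module of the kind mentioned above typically scores each incoming feature stream and combines the streams as a softmax-weighted sum. A toy sketch, assuming a shared scoring vector `w` (in the actual model the scoring function is learned; this is only illustrative):

```python
import numpy as np

def attentional_fusion(f1, f2, w):
    """Fuse two feature vectors with softmax-normalized attention scores.

    `w` is a hypothetical shared scoring vector; the real module learns
    its scoring function, this sketch just shows the fusion mechanics.
    """
    s1, s2 = f1 @ w, f2 @ w                    # one scalar score per stream
    e1, e2 = np.exp(s1), np.exp(s2)
    a1, a2 = e1 / (e1 + e2), e2 / (e1 + e2)    # softmax over the two streams
    return a1 * f1 + a2 * f2                   # convex combination of features

f1, f2 = np.ones(8), np.zeros(8)
w = np.zeros(8)                                # equal scores -> equal weights
fused = attentional_fusion(f1, f2, w)
print(fused[:3])  # -> [0.5 0.5 0.5]
```

Because the weights are a softmax, the fused output is always a convex combination of the inputs, letting the network emphasize whichever scale's features are more discriminative for a given utterance.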
no code implementations • 29 Mar 2021 • Yafeng Chen, Wu Guo, Bin Gu
By combining these two methods, we can observe further improvements on these two databases.
no code implementations • 21 Oct 2020 • Yafeng Chen, Wu Guo, Jingjing Shi, Jiajun Qi, Tan Liu
To evaluate the proposed method, we conduct experiments on the Speaker in the Wild (SITW) dataset.
no code implementations • 13 Oct 2020 • Jiajun Qi, Wu Guo, Jingjing Shi, Yafeng Chen, Tan Liu
The universal speech attributes for x-vector based speaker verification (SV) are addressed in this paper.
no code implementations • 28 Mar 2019 • Lanhua You, Wu Guo, Li-Rong Dai, Jun Du
X-vector based deep neural network (DNN) embedding systems have demonstrated effectiveness for text-independent speaker verification.
no code implementations • 28 Mar 2019 • Lanhua You, Wu Guo, Li-Rong Dai, Jun Du
In this paper, gating mechanisms are applied in deep neural network (DNN) training for x-vector-based text-independent speaker verification.
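A gating mechanism in a DNN layer usually means a sigmoid "gate" that modulates, element-wise, how much of a transformed input passes through (in the spirit of GLU or highway gating). A minimal sketch with illustrative shapes and weight names (not the paper's exact formulation):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_layer(x, w, w_gate):
    """Gated hidden layer: a sigmoid gate in (0, 1) scales each unit of
    the tanh-transformed input, so the network learns per-unit pass-through."""
    return sigmoid(x @ w_gate) * np.tanh(x @ w)

rng = np.random.default_rng(1)
x = rng.standard_normal((2, 16))           # a small batch of frame-level features
w = rng.standard_normal((16, 16)) * 0.1
w_gate = rng.standard_normal((16, 16)) * 0.1
h = gated_layer(x, w, w_gate)
print(h.shape)
```

Since the gate lies in (0, 1) and tanh in (-1, 1), every activation is bounded, and the gate lets the network suppress uninformative frames or channels during x-vector training.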
no code implementations • IWSLT (EMNLP) 2018 • Dan Liu, Junhua Liu, Wu Guo, Shifu Xiong, Zhiqiang Ma, Rui Song, Chongliang Wu, Quan Liu
This paper describes the USTC-NEL system for the speech translation task of the IWSLT Evaluation 2018.
no code implementations • 7 Sep 2015 • Quan Liu, Wu Guo, Zhen-Hua Ling
The confidence measure of each term occurrence is then re-estimated through linear interpolation with the calculated document ranking weight to improve its reliability by integrating document-level information.
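The re-estimation step described above is a plain linear interpolation between a term-level confidence score and a document-level ranking weight. A one-line sketch (the mixing factor `alpha` here is illustrative, not the value used in the paper):

```python
def reestimate_confidence(term_conf, doc_weight, alpha=0.3):
    """Linearly interpolate a term-occurrence confidence with a
    document-level ranking weight; alpha is a hypothetical mixing factor."""
    return (1.0 - alpha) * term_conf + alpha * doc_weight

# A high-confidence term in a low-ranked document is pulled down,
# integrating document-level evidence into the term-level score.
print(reestimate_confidence(0.8, 0.4, alpha=0.5))  # -> 0.6
```

With `alpha = 0` the original confidence is kept unchanged, and with `alpha = 1` the document weight replaces it entirely, so `alpha` controls how much document-level information is trusted.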