Search Results for author: Jingming Liu

Found 8 papers, 4 papers with code

Aligning Cross-lingual Sentence Representations with Dual Momentum Contrast

no code implementations EMNLP 2021 Liang Wang, Wei Zhao, Jingming Liu

In this paper, we propose to align sentence representations from different languages into a unified embedding space, where semantic similarities (both cross-lingual and monolingual) can be computed with a simple dot product.

Semantic Textual Similarity Sentence +1
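The abstract's core idea — that once sentences from different languages share one embedding space, both cross-lingual and monolingual similarity reduce to a dot product — can be sketched with toy vectors (random stand-ins for a real encoder's output; the encoder itself is not reproduced here):

```python
import numpy as np

# Toy sentence embeddings standing in for a cross-lingual encoder's output.
rng = np.random.default_rng(0)
emb_en = rng.normal(size=(3, 8))   # 3 "English" sentence vectors
emb_zh = rng.normal(size=(3, 8))   # 3 "Chinese" sentence vectors

# L2-normalize so the dot product equals cosine similarity.
emb_en /= np.linalg.norm(emb_en, axis=1, keepdims=True)
emb_zh /= np.linalg.norm(emb_zh, axis=1, keepdims=True)

# Cross-lingual similarity matrix: one dot product per sentence pair.
sim = emb_en @ emb_zh.T
print(sim.shape)  # (3, 3)
```

With unit-norm vectors every entry of `sim` lies in [-1, 1], so the same operation serves for monolingual pairs as well.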

Ape210K: A Large-Scale and Template-Rich Dataset of Math Word Problems

1 code implementation 24 Sep 2020 Wei Zhao, Mingyue Shang, Yang Liu, Liang Wang, Jingming Liu


We propose a copy-augmented and feature-enriched sequence-to-sequence (seq2seq) model, which outperforms existing models by 3.2% on the Math23K dataset and serves as a strong baseline for the Ape210K dataset.

Math Math Word Problem Solving +1

Investigating Label Bias in Beam Search for Open-ended Text Generation

no code implementations 22 May 2020 Liang Wang, Jinlong Liu, Jingming Liu

However, in open-ended text generation, beam search is often found to produce repetitive and generic text, so sampling-based decoding algorithms like top-k sampling and nucleus sampling are generally preferred.

Response Generation Text Generation
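Nucleus (top-p) sampling, mentioned in the abstract as an alternative to beam search, samples from the smallest set of tokens whose cumulative probability exceeds a threshold p. A minimal NumPy sketch (illustrative; not the paper's implementation):

```python
import numpy as np

def nucleus_sample(logits, p=0.9, rng=None):
    """Sample a token id from the smallest set of tokens whose
    cumulative probability exceeds p (top-p / nucleus sampling)."""
    rng = rng or np.random.default_rng()
    probs = np.exp(logits - logits.max())    # stable softmax
    probs /= probs.sum()
    order = np.argsort(probs)[::-1]          # tokens by descending prob
    cum = np.cumsum(probs[order])
    cutoff = np.searchsorted(cum, p) + 1     # minimal nucleus size
    keep = order[:cutoff]
    kept = probs[keep] / probs[keep].sum()   # renormalize inside nucleus
    return int(rng.choice(keep, p=kept))
```

Unlike beam search, which deterministically maximizes sequence probability, this truncates the unreliable low-probability tail and then samples, which is what reduces the repetitive, generic output the abstract describes.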

Denoising based Sequence-to-Sequence Pre-training for Text Generation

no code implementations IJCNLP 2019 Liang Wang, Wei Zhao, Ruoyu Jia, Sujian Li, Jingming Liu

This paper presents a new sequence-to-sequence (seq2seq) pre-training method PoDA (Pre-training of Denoising Autoencoders), which learns representations suitable for text generation tasks.

Abstractive Text Summarization Denoising +2
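The denoising-autoencoder idea behind PoDA — corrupt text with a noising function and train a seq2seq model to reconstruct the clean sequence — can be illustrated with a generic noising function. The operations and parameters below are hypothetical examples, not the paper's exact corruption scheme:

```python
import random

def corrupt(tokens, drop_prob=0.1, shuffle_window=3, seed=0):
    """Illustrative denoising-style corruption: randomly drop tokens,
    then lightly shuffle within a small local window. The clean sequence
    is the reconstruction target for the seq2seq denoising autoencoder.
    (drop_prob and shuffle_window are hypothetical parameters.)"""
    rng = random.Random(seed)
    kept = [t for t in tokens if rng.random() > drop_prob]
    # Local shuffle: add a small random offset to each position, re-sort.
    keys = [i + rng.uniform(0, shuffle_window) for i in range(len(kept))]
    return [t for _, t in sorted(zip(keys, kept), key=lambda x: x[0])]
```

Training pairs are then (`corrupt(sentence)`, `sentence`), which is why representations learned this way transfer to generation tasks like summarization: the decoder must produce fluent text conditioned on imperfect input.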
