Search Results for author: Xiaobing Li

Found 6 papers, 2 papers with code

Beyond Language Models: Byte Models are Digital World Simulators

no code implementations • 29 Feb 2024 • Shangda Wu, Xu Tan, Zili Wang, Rui Wang, Xiaobing Li, Maosong Sun

Traditional deep learning often overlooks bytes, the basic units of the digital world, where all forms of information and operations are encoded and manipulated in binary format.

Symphony Generation with Permutation Invariant Language Model

1 code implementation • 10 May 2022 • Jiafeng Liu, Yuanliang Dong, Zehua Cheng, Xinran Zhang, Xiaobing Li, Feng Yu, Maosong Sun

In this work, we propose a permutation invariant language model, SymphonyNet, as a solution for symbolic symphony music generation.

Audio Generation • Language Modelling • +2

Chord-Conditioned Melody Harmonization with Controllable Harmonicity

1 code implementation • 17 Feb 2022 • Shangda Wu, Xiaobing Li, Maosong Sun

Melody harmonization has long been closely associated with chorales composed by Johann Sebastian Bach.

Lingxi: A Diversity-aware Chinese Modern Poetry Generation System

no code implementations • 27 Aug 2021 • Xinran Zhang, Maosong Sun, Jiafeng Liu, Xiaobing Li

We propose the nucleus sampling with randomized head (NS-RH) algorithm, which randomizes the high-frequency part (the "head") of the predicted distribution in order to emphasize comparatively low-frequency words.

Semantic Similarity • Semantic Textual Similarity • +2
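
The NS-RH idea in the abstract above lends itself to a short sketch. The following is a minimal, hedged reading of that one sentence, assuming a standard top-p nucleus plus a cumulative-mass threshold (`head_p`) defining the head; the function name and both parameters are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def ns_rh_sample(probs, top_p=0.9, head_p=0.5, rng=None):
    """Sample a token index, shuffling probabilities within the high-frequency
    'head' of the distribution before standard nucleus (top-p) truncation.

    probs  : 1-D array of next-token probabilities (sums to 1).
    top_p  : nucleus threshold; tokens outside it form the discarded 'tail'.
    head_p : cumulative-mass threshold defining the 'head' to randomize
             (an assumed parameterization, not from the paper).
    """
    rng = rng or np.random.default_rng()
    order = np.argsort(probs)[::-1]          # tokens from most to least likely
    sorted_p = probs[order]
    cum = np.cumsum(sorted_p)

    nucleus = cum <= top_p
    nucleus[0] = True                        # always keep the top token
    head = (cum <= head_p) & nucleus         # high-frequency 'head' inside the nucleus

    # Randomize the head: permute its probabilities so lower-frequency head
    # tokens can receive the largest mass, emphasizing less frequent words.
    shuffled = sorted_p.copy()
    head_idx = np.flatnonzero(head)
    shuffled[head_idx] = rng.permutation(sorted_p[head_idx])

    shuffled[~nucleus] = 0.0                 # truncate the unreliable tail
    shuffled /= shuffled.sum()               # renormalize over the nucleus
    return rng.choice(order, p=shuffled)
```

With `head_p=0`, this collapses to ordinary nucleus sampling, so the shuffle acts purely as a diversity knob on the head.
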

Optimal Embedding Calibration for Symbolic Music Similarity

no code implementations • 13 Mar 2021 • Xinran Zhang, Maosong Sun, Jiafeng Liu, Xiaobing Li

In natural language processing (NLP), the semantic similarity task requires large-scale, high-quality human-annotated labels for fine-tuning or evaluation.

Language Modelling • Representation Learning • +2

Improving Diversity of Neural Text Generation via Inverse Probability Weighting

no code implementations • 13 Mar 2021 • Xinran Zhang, Maosong Sun, Jiafeng Liu, Xiaobing Li

Traditional stochastic sampling methods focus only on truncating the unreliable "tail" of the distribution and do not address the "head", which we show may contain tedious or even repetitive high-probability candidates that lead to repetition loops.

Language Modelling • Text Generation
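
Read alongside the title, the abstract above suggests reweighting the retained head by inverse probability rather than only cutting the tail. Below is a minimal sketch under that assumption; `ipw_sample`, the interpolation scheme, and the `alpha` knob are all illustrative guesses, not the paper's exact formulation.

```python
import numpy as np

def ipw_sample(probs, top_p=0.9, alpha=0.5, rng=None):
    """Sample after blending the truncated distribution with its inverse,
    flattening the over-confident 'head' instead of only cutting the tail.

    alpha : 0 -> ordinary nucleus sampling; 1 -> fully inverse-weighted head.
    """
    rng = rng or np.random.default_rng()
    order = np.argsort(probs)[::-1]          # tokens from most to least likely
    sorted_p = probs[order]
    cum = np.cumsum(sorted_p)
    keep = cum <= top_p
    keep[0] = True                           # standard top-p truncation first

    p = np.where(keep, sorted_p, 0.0)
    inv = np.where(keep, 1.0 / np.maximum(p, 1e-12), 0.0)
    inv /= inv.sum()                         # inverse-probability weights
    p /= p.sum()

    mixed = (1 - alpha) * p + alpha * inv    # down-weight repetitive head tokens
    return rng.choice(order, p=mixed)
```

The interpolation keeps the sampler well defined at both extremes: the inverse term redistributes mass from the most frequent candidates toward the rest of the kept set, which is one plausible way to counter the repetition loops the abstract describes.
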
