Search Results for author: Zaixiang Zheng

Found 20 papers, 10 papers with code

Structure-informed Language Models Are Protein Designers

1 code implementation • 3 Feb 2023 • Zaixiang Zheng, Yifan Deng, Dongyu Xue, Yi Zhou, Fei Ye, Quanquan Gu

This paper demonstrates that language models are strong structure-based protein designers.

Diffusion Language Models Can Perform Many Tasks with Scaling and Instruction-Finetuning

1 code implementation • 23 Aug 2023 • Jiasheng Ye, Zaixiang Zheng, Yu Bao, Lihua Qian, Quanquan Gu

We then reprogram pretrained masked language models into diffusion language models via diffusive adaptation, wherein task-specific finetuning and instruction finetuning are explored to unlock their versatility in solving general language tasks.

In-Context Learning • Language Modelling • +1
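The diffusive adaptation recipe above amounts to continuing masked-LM training under a discrete absorbing-diffusion objective, where the mask ratio is sampled rather than fixed. Below is a minimal hedged sketch of one such training step, assuming a Hugging-Face-style masked LM interface (`mlm(...).logits`); the time-dependent loss weighting of the full diffusion objective is omitted.

```python
import torch
import torch.nn.functional as F

def diffusive_adaptation_step(mlm, input_ids, pad_mask, mask_token_id):
    """One hedged training step: corrupt tokens at a random ratio t ~ U(0, 1)
    and train the pretrained masked LM to recover them (the absorbing
    discrete-diffusion view of MLM training). Names are illustrative."""
    batch, seq_len = input_ids.shape
    t = torch.rand(batch, 1)                          # per-sequence diffusion "time"
    corrupt = (torch.rand(batch, seq_len) < t) & pad_mask.bool()
    noisy = input_ids.masked_fill(corrupt, mask_token_id)
    logits = mlm(noisy).logits                        # reuse the pretrained MLM head
    # denoising loss only on the corrupted positions
    return F.cross_entropy(logits[corrupt], input_ids[corrupt])
```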

DINOISER: Diffused Conditional Sequence Learning by Manipulating Noises

1 code implementation • 20 Feb 2023 • Jiasheng Ye, Zaixiang Zheng, Yu Bao, Lihua Qian, Mingxuan Wang

In this paper, we introduce DINOISER to facilitate diffusion models for sequence generation by manipulating noises.

Towards Making the Most of Context in Neural Machine Translation

1 code implementation • 19 Feb 2020 • Zaixiang Zheng, Xiang Yue, Shu-Jian Huang, Jia-Jun Chen, Alexandra Birch

Document-level machine translation manages to outperform sentence-level models by a small margin, but has failed to be widely adopted.

Document Level Machine Translation • Machine Translation • +3

Modeling Past and Future for Neural Machine Translation

1 code implementation • TACL 2018 • Zaixiang Zheng, Hao Zhou, Shu-Jian Huang, Lili Mou, Xin-yu Dai, Jia-Jun Chen, Zhaopeng Tu

The Past and Future contents are fed to both the attention model and the decoder states, which offers NMT systems the knowledge of translated and untranslated contents.

Machine Translation • NMT • +1
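The Past/Future mechanism above maintains two extra recurrent states alongside the decoder, both driven by each step's attended source context. A hedged sketch of the idea; layer choices and sizes here are illustrative rather than the paper's exact parameterization.

```python
import torch.nn as nn

class PastFuture(nn.Module):
    """Two auxiliary recurrent states: Past accumulates what has been
    translated so far, Future tracks what remains to be translated."""
    def __init__(self, dim):
        super().__init__()
        self.past_rnn = nn.GRUCell(dim, dim)    # grows with each attended context
        self.future_rnn = nn.GRUCell(dim, dim)  # shrinks from the source summary

    def step(self, c_t, past, future):
        past = self.past_rnn(c_t, past)         # fold in newly translated content
        future = self.future_rnn(c_t, future)   # remove it from the remainder
        # both states are then fed to the attention model and the decoder state
        return past, future
```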

Dynamic Past and Future for Neural Machine Translation

1 code implementation • IJCNLP 2019 • Zaixiang Zheng, Shu-Jian Huang, Zhaopeng Tu, Xin-yu Dai, Jia-Jun Chen

Previous studies have shown that neural machine translation (NMT) models can benefit from explicitly modeling translated (Past) and untranslated (Future) source contents; this work explicitly separates source words into groups of translated and untranslated contents through parts-to-wholes assignment.

Machine Translation • NMT • +1

Helping the Weak Makes You Strong: Simple Multi-Task Learning Improves Non-Autoregressive Translators

1 code implementation • 11 Nov 2022 • Xinyou Wang, Zaixiang Zheng, ShuJian Huang

Recently, non-autoregressive (NAR) neural machine translation models have received increasing attention due to their efficient parallel decoding.

Machine Translation • Multi-Task Learning

Neural Machine Translation with Word Predictions

no code implementations • EMNLP 2017 • Rongxiang Weng, Shu-Jian Huang, Zaixiang Zheng, Xin-yu Dai, Jia-Jun Chen

In the encoder-decoder architecture for neural machine translation (NMT), the hidden states of the recurrent structures in the encoder and decoder carry the crucial information about the sentence. These vectors are generated by parameters which are updated by back-propagation of translation errors through time.

Machine Translation • NMT • +2
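One way to picture the word-prediction idea: an auxiliary head encourages a hidden state to predict the target words it should still produce, pushing translation-relevant information into the state. The multi-label objective below is a simple hedged instantiation, not the paper's exact formulation.

```python
import torch.nn as nn
import torch.nn.functional as F

class WordPredictor(nn.Module):
    """Auxiliary word-prediction head attached to a decoder (or encoder) state."""
    def __init__(self, dim, vocab_size):
        super().__init__()
        self.head = nn.Linear(dim, vocab_size)

    def loss(self, state, target_bag):
        # target_bag: multi-hot (batch, vocab_size) of words yet to be generated
        return F.binary_cross_entropy_with_logits(self.head(state), target_bag)
```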

Learning to Discriminate Noises for Incorporating External Information in Neural Machine Translation

no code implementations • 24 Oct 2018 • Zaixiang Zheng, Shu-Jian Huang, Zewei Sun, Rongxiang Weng, Xin-yu Dai, Jia-Jun Chen

Previous studies show that incorporating external information could improve the translation quality of Neural Machine Translation (NMT) systems.

Machine Translation • NMT • +2

RPD: A Distance Function Between Word Embeddings

no code implementations • ACL 2020 • Xuhui Zhou, Zaixiang Zheng, Shu-Jian Huang

Based on the properties of RPD, we study the relations of word embeddings of different algorithms systematically and investigate the influence of different training processes and corpora.

Word Embeddings
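RPD's core idea can be pictured as comparing two embedding spaces through their pairwise inner-product matrices, which makes the comparison invariant to rotating either space. A hedged numpy sketch; the exact normalization used by RPD in the paper may differ from the one shown.

```python
import numpy as np

def relative_pairwise_distance(E1, E2):
    """E1, E2: (vocab, dim) embedding matrices with rows aligned by word.
    Compares the two spaces via their pairwise inner-product matrices."""
    G1 = E1 @ E1.T                      # pairwise relations in space 1
    G2 = E2 @ E2.T                      # pairwise relations in space 2
    num = np.linalg.norm(G1 - G2)       # Frobenius norm of the discrepancy
    return num / np.sqrt(np.linalg.norm(G1) * np.linalg.norm(G2))
```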

Mirror-Generative Neural Machine Translation

no code implementations • ICLR 2020 • Zaixiang Zheng, Hao Zhou, Shu-Jian Huang, Lei Li, Xin-yu Dai, Jia-Jun Chen

Training neural machine translation (NMT) models requires large amounts of parallel data, which is scarce for many language pairs.

Machine Translation • NMT • +1

Information-theoretic Vocabularization via Optimal Transport

no code implementations • 1 Jan 2021 • Jingjing Xu, Hao Zhou, Chun Gan, Zaixiang Zheng, Lei Li

In this paper, we find an exciting relation between an information-theoretic feature and the performance of NLP tasks such as machine translation with a given vocabulary.

Machine Translation • Translation
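As one concrete example of an information-theoretic feature of a vocabulary (illustrative, not necessarily the paper's exact definition): the corpus token entropy normalized by average token length, which makes vocabularies of different granularity comparable.

```python
import math
from collections import Counter

def entropy_per_char(tokens):
    """tokens: the corpus segmented under some vocabulary, as a list of strings.
    Returns token-level entropy in bits, normalized by average token length."""
    counts = Counter(tokens)
    total = sum(counts.values())
    entropy = -sum(c / total * math.log2(c / total) for c in counts.values())
    avg_len = sum(len(tok) * c for tok, c in counts.items()) / total
    return entropy / avg_len
```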

DirectQE: Direct Pretraining for Machine Translation Quality Estimation

no code implementations • 15 May 2021 • Qu Cui, ShuJian Huang, Jiahuan Li, Xiang Geng, Zaixiang Zheng, Guoping Huang, Jiajun Chen

However, we argue that there are gaps between the predictor and the estimator in both data quality and training objectives, which prevent QE models from benefiting more directly from large parallel corpora.

Machine Translation • Translation

Diffusion Language Models Are Versatile Protein Learners

no code implementations • 28 Feb 2024 • Xinyou Wang, Zaixiang Zheng, Fei Ye, Dongyu Xue, ShuJian Huang, Quanquan Gu

This paper introduces the diffusion protein language model (DPLM), a versatile protein language model that demonstrates strong generative and predictive capabilities for protein sequences.

Protein Language Model
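Generation with a masked-diffusion language model such as DPLM is typically iterative: start from an all-mask sequence and progressively commit the most confident predictions. The sampler below is a hedged illustration assuming a Hugging-Face-style model interface, not DPLM's exact procedure.

```python
import torch

@torch.no_grad()
def iterative_unmask_sample(model, length, mask_id, steps=8):
    """Confidence-based iterative unmasking for a masked-diffusion LM."""
    seq = torch.full((1, length), mask_id)
    for step in range(steps):
        logits = model(seq).logits                  # (1, length, vocab)
        probs, pred = logits.softmax(-1).max(-1)    # best token and its confidence
        still_masked = seq.eq(mask_id)
        # commit a growing share of the most confident masked positions
        k = int(length * (step + 1) / steps) - int(length * step / steps)
        conf = probs.masked_fill(~still_masked, -1.0)
        idx = conf.topk(k, dim=-1).indices
        seq.scatter_(1, idx, pred.gather(1, idx))
    return seq
```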

Antigen-Specific Antibody Design via Direct Energy-based Preference Optimization

no code implementations • 25 Mar 2024 • Xiangxin Zhou, Dongyu Xue, Ruizhe Chen, Zaixiang Zheng, Liang Wang, Quanquan Gu

Antibody design, a crucial task with significant implications across various disciplines such as therapeutics and biology, presents considerable challenges due to its intricate nature.

Total Energy
