Search Results for author: Pei Zhang

Found 23 papers, 3 papers with code

Alibaba Speech Translation Systems for IWSLT 2018

no code implementations IWSLT (EMNLP) 2018 Nguyen Bach, Hongjie Chen, Kai Fan, Cheung-Chi Leung, Bo Li, Chongjia Ni, Rong Tong, Pei Zhang, Boxing Chen, Bin Ma, Fei Huang

This work describes the En→De Alibaba speech translation system developed for the evaluation campaign of the International Workshop on Spoken Language Translation (IWSLT) 2018.

Translation

PigV$^2$: Monitoring Pig Vital Signs through Ground Vibrations Induced by Heartbeat and Respiration

no code implementations7 Dec 2022 Yiwen Dong, Jesse R Codling, Gary Rohrer, Jeremy Miles, Sudhendu Sharma, Tami Brown-Brandl, Pei Zhang, Hae Young Noh

In this paper, we introduce PigV$^2$, the first system to monitor pig heart rate and respiratory rate through ground vibrations.

Graph Anomaly Detection via Multi-Scale Contrastive Learning Networks with Augmented View

no code implementations1 Dec 2022 Jingcan Duan, Siwei Wang, Pei Zhang, En Zhu, Jingtao Hu, Hu Jin, Yue Liu, Zhibin Dong

However, they neglect the subgraph-subgraph comparison information, in which normal and abnormal subgraph pairs behave differently in terms of embeddings and structures in GAD, resulting in sub-optimal task performance.

Anomaly Detection · Contrastive Learning

Scalable training of graph convolutional neural networks for fast and accurate predictions of HOMO-LUMO gap in molecules

no code implementations22 Jul 2022 Jong Youl Choi, Pei Zhang, Kshitij Mehta, Andrew Blanchard, Massimiliano Lupo Pasini

Graph Convolutional Neural Network (GCNN) is a popular class of deep learning (DL) models in materials science for predicting material properties from the graph representation of molecular structures.

Distributed Computing · Management

Multi-task graph neural networks for simultaneous prediction of global and atomic properties in ferromagnetic systems

no code implementations4 Feb 2022 Massimiliano Lupo Pasini, Pei Zhang, Samuel Temple Reeve, Jong Youl Choi

We train HydraGNN on an open-source ab initio density functional theory (DFT) dataset for iron-platinum (FePt) with a fixed body centered tetragonal (BCT) lattice structure and fixed volume to simultaneously predict the mixing enthalpy (a global feature of the system), the atomic charge transfer, and the atomic magnetic moment across configurations that span the entire compositional range.

Multi-Task Learning

Sharp Attention for Sequence to Sequence Learning

no code implementations29 Sep 2021 Pei Zhang, Hua Liu

The attention mechanism has been widely applied to tasks that generate an output sequence from an input image.

Hard Attention · Scene Text Recognition

PI3NN: Out-of-distribution-aware prediction intervals from three neural networks

1 code implementation ICLR 2022 Siyan Liu, Pei Zhang, Dan Lu, Guannan Zhang

First, existing PI methods require retraining of neural networks (NNs) for every given confidence level and suffer from the crossing issue in calculating multiple PIs.

Prediction Intervals

A Data-driven feature selection and machine-learning model benchmark for the prediction of longitudinal dispersion coefficient

no code implementations16 Jul 2021 Yifeng Zhao, Pei Zhang, S. A. Galindo-Torres, Stan Z. Li

Then, a globally optimal feature set (channel width, flow velocity, channel slope, and cross-sectional area) was proposed through numerical comparison of the distilled local optima in performance with representative ML models.

Ensemble Learning

Context-Interactive Pre-Training for Document Machine Translation

no code implementations NAACL 2021 Pengcheng Yang, Pei Zhang, Boxing Chen, Jun Xie, Weihua Luo

Document machine translation aims to translate the source sentence into the target language in the presence of additional contextual information.

Machine Translation · Translation

Multi-view Clustering with Deep Matrix Factorization and Global Graph Refinement

no code implementations1 May 2021 Chen Zhang, Siwei Wang, Wenxuan Tu, Pei Zhang, Xinwang Liu, Changwang Zhang, Bo Yuan

Multi-view clustering is an important yet challenging task in the machine learning and data mining communities.

Retrieving High-Dimensional Quantum Steering From a Noisy Environment with N Measurement Settings

no code implementations12 Jan 2021 Rui Qu, Yunlong Wang, Min An, Feiran Wang, Hongrong Li, Hong Gao, Fuli Li, Pei Zhang

One of the most often implied benefits of high-dimensional (HD) quantum systems is to lead to stronger forms of correlations, featuring increased robustness to noise.

Quantum Physics

Long-Short Term Masking Transformer: A Simple but Effective Baseline for Document-level Neural Machine Translation

no code implementations EMNLP 2020 Pei Zhang, Boxing Chen, Niyu Ge, Kai Fan

In this paper, we extensively investigate the pros and cons of the standard transformer in document-level translation, and find that the auto-regressive property simultaneously brings both the advantage of consistency and the disadvantage of error accumulation.

Machine Translation · NMT +1

Extending Label Smoothing Regularization with Self-Knowledge Distillation

no code implementations11 Sep 2020 Ji-Yue Wang, Pei Zhang, Wen-feng Pang, Jie Li

The experimental results confirm that TC can help LsrKD and MrKD boost training, especially on the networks where they otherwise fail.

Self-Knowledge Distillation

Learning Contextualized Sentence Representations for Document-Level Neural Machine Translation

no code implementations30 Mar 2020 Pei Zhang, Xu Zhang, Wei Chen, Jian Yu, Yan-Feng Wang, Deyi Xiong

In this paper, we propose a new framework to model cross-sentence dependencies by training neural machine translation (NMT) to predict both the target translation and surrounding sentences of a source sentence.

Document Level Machine Translation · Machine Translation +3

Visual Agreement Regularized Training for Multi-Modal Machine Translation

no code implementations27 Dec 2019 Pengcheng Yang, Boxing Chen, Pei Zhang, Xu Sun

Further analysis demonstrates that the proposed regularized training can effectively improve the agreement of attention on the image, leading to better use of visual information.

Machine Translation · Translation

Lattice Transformer for Speech Translation

no code implementations ACL 2019 Pei Zhang, Boxing Chen, Niyu Ge, Kai Fan

Recent advances in sequence modeling have highlighted the strengths of the transformer architecture, especially in achieving state-of-the-art machine translation results.

Automatic Speech Recognition · Machine Translation +2

PSA: A novel optimization algorithm based on survival rules of Porcellio scaber

no code implementations28 Sep 2017 Yinyan Zhang, Pei Zhang, Shuai Li

Bio-inspired algorithms such as neural network algorithms and genetic algorithms have received a significant amount of attention in both the academic and engineering communities.
