Search Results for author: Jiaxin Guo

Found 25 papers, 1 paper with code

Make the Blind Translator See The World: A Novel Transfer Learning Solution for Multimodal Machine Translation

no code implementations MTSummit 2021 Minghan Wang, Jiaxin Guo, Yimeng Chen, Chang Su, Min Zhang, Shimin Tao, Hao Yang

Reliance on large-scale pretrained networks and the tendency to overfit on the limited labelled training data available for multimodal machine translation (MMT) are critical issues in MMT.

Multimodal Machine Translation Transfer Learning +1

HW-TSC’s Submissions to the WMT21 Biomedical Translation Task

no code implementations WMT (EMNLP) 2021 Hao Yang, Zhanglin Wu, Zhengzhe Yu, Xiaoyu Chen, Daimeng Wei, Zongyao Li, Hengchao Shang, Minghan Wang, Jiaxin Guo, Lizhi Lei, Chuanfei Xu, Min Zhang, Ying Qin

This paper describes the submission of Huawei Translation Service Center (HW-TSC) to WMT21 biomedical translation task in two language pairs: Chinese↔English and German↔English (Our registered team name is HuaweiTSC).

Translation

A Visual Navigation Perspective for Category-Level Object Pose Estimation

no code implementations 25 Mar 2022 Jiaxin Guo, Fangxun Zhong, Rong Xiong, Yunhui Liu, Yue Wang, Yiyi Liao

In this paper, we take a deeper look at the inference of analysis-by-synthesis from the perspective of visual navigation, and investigate what is a good navigation policy for this specific task.

Imitation Learning Pose Estimation +1

Self-Distillation Mixup Training for Non-autoregressive Neural Machine Translation

no code implementations 22 Dec 2021 Jiaxin Guo, Minghan Wang, Daimeng Wei, Hengchao Shang, Yuxia Wang, Zongyao Li, Zhengzhe Yu, Zhanglin Wu, Yimeng Chen, Chang Su, Min Zhang, Lizhi Lei, Shimin Tao, Hao Yang

An effective training strategy for improving the performance of AT models is Self-Distillation Mixup (SDM) Training, which pre-trains a model on raw data, generates distilled data with the pre-trained model itself, and finally re-trains the model on the combination of raw and distilled data.

Knowledge Distillation Machine Translation +1
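The three-stage SDM pipeline described in the excerpt can be sketched as follows. This is a minimal illustration of the pre-train / self-distill / re-train structure only; the toy lookup-table "model" and the `train`/`translate` helpers are stand-ins invented here, not the paper's actual AT translation model.

```python
def train(pairs):
    # Toy "training": memorise source -> target pairs in a lookup table.
    return dict(pairs)

def translate(model, src):
    # Toy "decoding": look the source up in the table.
    return model.get(src, "")

def sdm_train(raw_pairs):
    # 1. Pre-train a model on the raw parallel data.
    pre_model = train(raw_pairs)
    # 2. Self-distill: re-decode the source side with the pre-trained model.
    distilled = [(src, translate(pre_model, src)) for src, _ in raw_pairs]
    # 3. Re-train on the mixup of raw and distilled data.
    return train(raw_pairs + distilled)

raw = [("guten Tag", "good day"), ("danke", "thanks")]
final_model = sdm_train(raw)
```

The point of the sketch is that distillation data comes from the model's own pre-trained checkpoint rather than an external teacher, and both data sources are mixed for the final training pass.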

Joint-training on Symbiosis Networks for Deep Neural Machine Translation models

no code implementations 22 Dec 2021 Zhengzhe Yu, Jiaxin Guo, Minghan Wang, Daimeng Wei, Hengchao Shang, Zongyao Li, Zhanglin Wu, Yuxia Wang, Yimeng Chen, Chang Su, Min Zhang, Lizhi Lei, Shimin Tao, Hao Yang

Deep encoders have been proven effective in improving neural machine translation (NMT) systems, but translation quality reaches an upper bound once the number of encoder layers exceeds 18.

Machine Translation +1

Asymptotic in a class of network models with an increasing sub-Gamma degree sequence

no code implementations 2 Nov 2021 Jing Luo, Haoyu Wei, Xiaoyu Lei, Jiaxin Guo

For differential privacy under sub-Gamma noise, we derive the asymptotic properties of a class of binary-valued network models with a general link function.

Priority prediction of Asian Hornet sighting report using machine learning methods

no code implementations 28 Jun 2021 Yixin Liu, Jiaxin Guo, Jieyang Dong, Luoqian Jiang, Haoyuan Ouyang

In this paper, we propose a method to predict the priority of sighting reports based on machine learning.

3D Point-to-Keypoint Voting Network for 6D Pose Estimation

no code implementations 22 Dec 2020 Weitong Hua, Jiaxin Guo, Yue Wang, Rong Xiong

In this paper, we propose a framework for 6D pose estimation from RGB-D data based on spatial structure characteristics of 3D keypoints.

6D Pose Estimation

PREGAN: Pose Randomization and Estimation for Weakly Paired Image Style Translation

1 code implementation 31 Oct 2020 Zexi Chen, Jiaxin Guo, Xuecheng Xu, Yunkai Wang, Yue Wang, Rong Xiong

Utilizing the trained model under different conditions without data annotation is attractive for robot applications.

Object Detection Pose Estimation +1

SEA: A Combined Model for Heat Demand Prediction

no code implementations 28 Jul 2018 Jiyang Xie, Jiaxin Guo, Zhanyu Ma, Jing-Hao Xue, Qie Sun, Hailong Li, Jun Guo

ENN and ARIMA are used to predict seasonal and trend components, respectively.
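The decompose-then-combine structure of SEA can be sketched as below. This is only an illustration of the combination scheme under assumed simplifications: the paper's ENN (Elman neural network) and ARIMA models are replaced here by naive stand-ins (last-cycle repetition for the seasonal part, linear extrapolation for the trend), and the `decompose`/`forecast` helpers are invented for this sketch.

```python
def decompose(series, period):
    # Trailing moving-average trend; seasonal = residual around the trend.
    trend = []
    for i in range(len(series)):
        lo, hi = max(0, i - period + 1), i + 1
        trend.append(sum(series[lo:hi]) / (hi - lo))
    seasonal = [x - t for x, t in zip(series, trend)]
    return seasonal, trend

def forecast(series, period, steps):
    seasonal, trend = decompose(series, period)
    # Seasonal stand-in for the ENN: repeat the last observed cycle.
    season_fc = [seasonal[-period + (i % period)] for i in range(steps)]
    # Trend stand-in for ARIMA: extrapolate the last local slope.
    slope = trend[-1] - trend[-2]
    trend_fc = [trend[-1] + slope * (i + 1) for i in range(steps)]
    # Combined SEA-style forecast: seasonal + trend predictions summed.
    return [s + t for s, t in zip(season_fc, trend_fc)]
```

The design point being illustrated: each component model only has to fit the pattern it is suited for (a periodic signal vs. a slowly varying trend), and the final prediction is their sum.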
