no code implementations • WMT (EMNLP) 2021 • Hengchao Shang, Ting Hu, Daimeng Wei, Zongyao Li, Jianfei Feng, Zhengzhe Yu, Jiaxin Guo, Shaojun Li, Lizhi Lei, Shimin Tao, Hao Yang, Jun Yao, Ying Qin
This paper presents the submission of Huawei Translation Services Center (HW-TSC) to the WMT 2021 Efficiency Shared Task.
no code implementations • 24 Feb 2022 • Zengfu Hou, Siyuan Cheng, Ting Hu
In hyperspectral imagery, high-quality spectral signals convey the subtle spectral differences that distinguish similar materials, providing a unique advantage for anomaly detection.
no code implementations • 27 Jan 2022 • Ting Hu
To learn the spatial response function and the point spread function from the image pairs to be fused, we propose a Dirichlet network, where both functions are properly constrained.
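Purely as an illustrative sketch (not the paper's architecture): one common way to enforce the kind of constraint the abstract alludes to, namely a point spread function whose entries are non-negative and sum to one (a simplex/Dirichlet-type constraint), is to parameterize the kernel through a softmax. The module name `SimplexPSF`, the kernel size, and the PyTorch framing are assumptions for illustration.

```python
# Minimal sketch (assumed, not the paper's code): a learnable PSF kept on the
# probability simplex (entries >= 0, summing to 1) via a softmax parameterization.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimplexPSF(nn.Module):
    """Hypothetical learnable blur kernel with a simplex (Dirichlet-type) constraint."""
    def __init__(self, size: int = 7):
        super().__init__()
        self.size = size
        # Unconstrained logits; the constraint is applied in forward().
        self.logits = nn.Parameter(torch.zeros(size * size))

    def forward(self, hr_image: torch.Tensor) -> torch.Tensor:
        # hr_image: (batch, channels, H, W); the same PSF blurs every channel.
        kernel = F.softmax(self.logits, dim=0).view(1, 1, self.size, self.size)
        c = hr_image.shape[1]
        kernel = kernel.repeat(c, 1, 1, 1)          # one copy of the kernel per channel
        return F.conv2d(hr_image, kernel, padding=self.size // 2, groups=c)
```

The spatial response function (the spectral downsampling) could be constrained analogously, e.g. with a per-band softmax over the spectral dimension, though how the paper actually imposes its constraints is not specified in this snippet.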
no code implementations • 29 Sep 2021 • Ryan Zhou, Christian Muise, Ting Hu
A key property of this representation is that a given network admits multiple equivalent representations, obtained by permuting the order of its neurons.
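That symmetry is easy to verify directly. The following NumPy snippet (an illustration, not code from the paper) permutes the hidden neurons of a small two-layer MLP, together with the matching rows and columns of its weights, and checks that the network's outputs are unchanged.

```python
# Permuting hidden neurons (and the corresponding weight rows/columns)
# leaves the function computed by the network unchanged.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)   # 3 inputs, 4 hidden units
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)   # 2 outputs

def mlp(x, W1, b1, W2, b2):
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

perm = rng.permutation(4)          # reorder the hidden neurons
W1p, b1p = W1[perm], b1[perm]      # permute rows of layer 1
W2p = W2[:, perm]                  # permute columns of layer 2

x = rng.normal(size=3)
assert np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1p, b1p, W2p, b2))
```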
no code implementations • LREC 2020 • Jonathan Sauder, Ting Hu, Xiaoyin Che, Goncalo Mordido, Haojin Yang, Christoph Meinel
Recently, various approaches with Generative Adversarial Nets (GANs) have also been proposed.
no code implementations • 3 Feb 2019 • Yunwen Lei, Ting Hu, Guiying Li, Ke Tang
While the behavior of SGD is well understood in the convex learning setting, the existing theoretical results for SGD applied to nonconvex objective functions are far from mature.
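For context (a standard textbook formulation, not necessarily the exact setting analyzed in this paper): with samples $z_t$ drawn at each step and step sizes $\eta_t$, SGD iterates

$$ w_{t+1} = w_t - \eta_t \, \nabla_w \ell(w_t; z_t), $$

and nonconvex analyses typically bound a stationarity measure such as $\min_{t \le T} \mathbb{E}\|\nabla F(w_t)\|^2$ rather than an optimality gap, since global optimality is out of reach without convexity.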
no code implementations • 17 Dec 2014 • Jun Fan, Ting Hu, Qiang Wu, Ding-Xuan Zhou
The error entropy consistency, which requires the error entropy of the learned function to approximate the minimum error entropy, is shown to be always true if the bandwidth parameter tends to 0 at an appropriate rate.
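As background (the standard minimum error entropy formulation in this literature; the notation here is assumed and may differ from the paper): with errors $e_i = y_i - f(x_i)$ on a sample of size $m$, Rényi's quadratic entropy of the error is estimated by Parzen windowing with a kernel $G_h$ of bandwidth $h$,

$$ \hat H_h(f) = -\log\!\Big(\frac{1}{m^2}\sum_{i=1}^{m}\sum_{j=1}^{m} G_h(e_i - e_j)\Big), $$

so the consistency statement above concerns how well minimizing $\hat H_h$ tracks the minimum error entropy as $h \to 0$ at a suitable rate while $m$ grows.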