1 code implementation • 10 Nov 2021 • Xiangru Lian, Binhang Yuan, XueFeng Zhu, Yulong Wang, Yongjun He, Honghuan Wu, Lei Sun, Haodong Lyu, Chengjun Liu, Xing Dong, Yiqiao Liao, Mingnan Luo, Congfei Zhang, Jingru Xie, Haonan Li, Lei Chen, Renjie Huang, Jianying Lin, Chengchun Shu, Xuezhong Qiu, Zhishan Liu, Dongying Kong, Lei Yuan, Hai Yu, Sen yang, Ce Zhang, Ji Liu
Specifically, to ensure both training efficiency and training accuracy, we design a novel hybrid training algorithm in which the embedding layer and the dense neural network are handled by different synchronization mechanisms; we then build a system called Persia (short for parallel recommendation training system with hybrid acceleration) to support this hybrid training algorithm.
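A minimal sketch of the hybrid synchronization idea, not Persia's actual API: per-worker embedding gradients are applied immediately (asynchronous, parameter-server style), while dense-network gradients are accumulated and applied once as an averaged, synchronous update. All names below are illustrative.

# Hypothetical sketch of hybrid synchronization: async updates for the
# sparse embedding table, sync (averaged) updates for the dense network.
import torch

def hybrid_step(embedding, dense, batches, lr=0.01):
    """embedding : torch.nn.Embedding, updated asynchronously per worker
       dense     : torch.nn.Module, updated synchronously from averaged grads
       batches   : list of (ids, features, labels), one entry per worker"""
    dense_grads = None
    for ids, feats, labels in batches:
        emb = embedding(ids).mean(dim=1)                       # (B, dim)
        out = dense(torch.cat([emb, feats], dim=1)).squeeze(1)  # (B,)
        loss = torch.nn.functional.binary_cross_entropy_with_logits(out, labels)
        grads = torch.autograd.grad(
            loss, list(embedding.parameters()) + list(dense.parameters()))
        emb_grad, *d_grads = grads
        # Asynchronous flavor: apply this worker's embedding gradient at once.
        with torch.no_grad():
            embedding.weight -= lr * emb_grad
        # Synchronous flavor: accumulate dense gradients for one averaged step.
        if dense_grads is None:
            dense_grads = [g.clone() for g in d_grads]
        else:
            for acc, g in zip(dense_grads, d_grads):
                acc += g
    with torch.no_grad():
        for p, g in zip(dense.parameters(), dense_grads):
            p -= lr * g / len(batches)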
no code implementations • 8 Jun 2020 • Yang Hu, Guihua Wen, Adriane Chapman, Pei Yang, Mingnan Luo, Yingxue Xu, Dan Dai, Wendy Hall
Zero-shot learning uses semantic attributes to connect the search space of unseen objects.
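As a generic illustration of how semantic attributes connect seen and unseen classes (not the specific method of this paper): an image is mapped to an attribute vector and assigned to the class whose attribute prototype is most similar.

# Illustrative attribute-based zero-shot classification.
import numpy as np

def zero_shot_classify(image_attributes, class_attribute_table):
    """image_attributes      : (d,) predicted attribute vector for one image
       class_attribute_table : dict class_name -> (d,) attribute prototype"""
    names = list(class_attribute_table)
    protos = np.stack([class_attribute_table[n] for n in names])
    # Cosine similarity between predicted attributes and each class prototype.
    sims = protos @ image_attributes / (
        np.linalg.norm(protos, axis=1) * np.linalg.norm(image_attributes) + 1e-12)
    return names[int(np.argmax(sims))]

# Example: 'zebra' is unseen at training time but described by attributes.
table = {"horse": np.array([1.0, 0.0, 1.0]),   # [four_legs, stripes, tail]
         "zebra": np.array([1.0, 1.0, 1.0])}
print(zero_shot_classify(np.array([0.9, 0.8, 1.0]), table))  # -> "zebra"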
no code implementations • 13 May 2020 • Yingxue Xu, Guihua Wen, Yang Hu, Mingnan Luo, Dan Dai, Yishan Zhuang, Wendy Hall
Finally, a new framework for Chinese herbal recognition is proposed as a new application of APN.
no code implementations • 22 Apr 2019 • Mingnan Luo, Guihua Wen, Yang Hu, Dan Dai, Yingxue Xu
Global Average Pooling (GAP) is used by default on the channel-wise attention mechanism to extract channel descriptors.
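A minimal PyTorch sketch of the default scheme this sentence refers to: global average pooling produces one descriptor per channel, and a small gating network turns the descriptors into channel weights (standard squeeze-and-excitation style gating, not the paper's proposed alternative).

# Minimal GAP-based channel-wise attention (standard SE-style gating).
import torch
import torch.nn as nn

class GAPChannelAttention(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                      # x: (N, C, H, W)
        desc = x.mean(dim=(2, 3))              # global average pooling -> (N, C)
        weights = self.fc(desc)                # per-channel gates in (0, 1)
        return x * weights.unsqueeze(-1).unsqueeze(-1)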
1 code implementation • 22 Apr 2019 • Yang Hu, Guihua Wen, Mingnan Luo, Dan Dai, Wenming Cao, Zhiwen Yu, Wendy Hall
To deal with these problems, a novel Inner-Imaging architecture is proposed in this paper, which allows relationships between channels to meet the above requirement.
no code implementations • 23 Dec 2018 • Yingxue Xu, Guihua Wen, Yang Hu, Mingnan Luo, Dan Dai, Yishan Zhuang
According to the characteristics of herbal images, we propose competitive attentional fusion pyramid networks to model the features of herbal images; the network models the relationships among feature maps from different levels and re-weights multi-level channels with a channel-wise attention mechanism.
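A rough sketch of this multi-level re-weighting, with illustrative names and not the paper's exact architecture: channel descriptors pooled from each pyramid level are concatenated, a gating network produces per-level channel weights, and the weighted maps are fused at a common resolution.

# Illustrative channel-wise re-weighting of multi-level feature maps before fusion.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiLevelChannelFusion(nn.Module):
    def __init__(self, channels, num_levels, reduction=8):
        super().__init__()
        total = channels * num_levels
        self.gate = nn.Sequential(
            nn.Linear(total, total // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(total // reduction, total),
            nn.Sigmoid(),
        )
        self.num_levels = num_levels

    def forward(self, feats):                  # list of (N, C, Hi, Wi) maps
        # One descriptor per channel per level, concatenated so the gates
        # can model relationships across levels.
        desc = torch.cat([f.mean(dim=(2, 3)) for f in feats], dim=1)
        gates = self.gate(desc).chunk(self.num_levels, dim=1)
        target = feats[-1].shape[-2:]
        fused = 0
        for f, g in zip(feats, gates):
            f = F.interpolate(f, size=target, mode="nearest")
            fused = fused + f * g.unsqueeze(-1).unsqueeze(-1)
        return fused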
no code implementations • 4 Sep 2018 • Dan Dai, Zhiwen Yu, Yang Hu, Wenming Cao, Mingnan Luo
The significance of the metabolize neuronal network (MetaNet) in model construction is self-evident.
1 code implementation • 24 Jul 2018 • Yang Hu, Guihua Wen, Mingnan Luo, Dan Dai, Jiajiong Ma, Zhiwen Yu
In this work, we propose a competitive squeeze-excitation (SE) mechanism for the residual network.
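A minimal sketch of the competitive idea, assuming the gate sees channel descriptors from both the identity path and the residual path so the two branches compete for the channel weights (illustrative names, not the paper's exact formulation).

# Minimal sketch of a competitive SE gate over a residual block.
import torch
import torch.nn as nn

class CompetitiveSEResidual(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.gate = nn.Sequential(
            nn.Linear(2 * channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        res = self.body(x)
        # Descriptors from both paths "compete" for the channel gates.
        desc = torch.cat([x.mean(dim=(2, 3)), res.mean(dim=(2, 3))], dim=1)
        w = self.gate(desc).unsqueeze(-1).unsqueeze(-1)
        return torch.relu(x + res * w)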