1 code implementation • 11 Oct 2019 • Shuai Li, Wanqing Li, Chris Cook, Yanbo Gao
Recurrent neural networks (RNNs) are known to be difficult to train due to the vanishing and exploding gradient problems, which makes it hard for them to learn long-term patterns and to be stacked into deep networks.
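The gradient problem mentioned above can be illustrated with a short sketch (my own illustration, not code from the paper): backpropagated RNN gradients are repeatedly multiplied by the recurrent weight matrix, so their norm shrinks or grows roughly geometrically with the matrix's spectral radius.

```python
import numpy as np

rng = np.random.default_rng(0)

def gradient_norm_after(steps, scale):
    # Random recurrent matrix, rescaled so its spectral radius equals `scale`.
    W = rng.standard_normal((32, 32))
    W *= scale / max(abs(np.linalg.eigvals(W)))
    g = np.ones(32)  # gradient arriving from the loss at the last time step
    for _ in range(steps):
        g = W.T @ g  # one backward step through time
    return np.linalg.norm(g)

print(gradient_norm_after(100, 0.9))  # shrinks toward 0 (vanishing)
print(gradient_norm_after(100, 1.1))  # blows up (exploding)
```

After 100 steps the norm scales roughly as 0.9^100 or 1.1^100, which is why plain RNNs struggle with long-term dependencies.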
no code implementations • 16 Apr 2018 • Shuai Li, Dinei Florencio, Wanqing Li, Yaqin Zhao, Chris Cook
Conventional methods cannot distinguish the foreground from the background because the differences between them are small, and thus they under-detect camouflaged foreground objects.
11 code implementations • CVPR 2018 • Shuai Li, Wanqing Li, Chris Cook, Ce Zhu, Yanbo Gao
Experimental results have shown that the proposed IndRNN is able to process very long sequences (over 5,000 time steps) and can be used to construct very deep networks (21 layers in the experiments) while still being trained robustly.
Ranked #10 on Language Modelling on Penn Treebank (Character Level)
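A minimal sketch of the IndRNN idea described above (names and shapes are my own, not the paper's code): each neuron has a single scalar recurrent weight, so the recurrence is element-wise rather than a full matrix multiply and neurons within a layer are independent across time.

```python
import numpy as np

def indrnn_step(x, h_prev, W, u, b):
    # Element-wise recurrence (u * h_prev, not a matrix product) keeps the
    # gradient flow per-neuron, which is what enables very long sequences
    # and very deep stacks to train robustly.
    return np.maximum(0.0, x @ W + u * h_prev + b)  # ReLU activation

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 16)) * 0.1  # input weights
u = rng.uniform(0.0, 1.0, 16)           # per-neuron recurrent weights, |u| <= 1 for stability
b = np.zeros(16)

h = np.zeros(16)
for t in range(5000):  # a very long sequence, as in the paper's experiments
    x = rng.standard_normal(8)
    h = indrnn_step(x, h, W, u, b)
print(h.shape)  # (16,)
```

Keeping each |u| at or below 1 bounds the hidden state, so the state stays finite even over thousands of steps.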
no code implementations • 11 Jul 2017 • Shuai Li, Dinei Florencio, Yaqin Zhao, Chris Cook, Wanqing Li
This paper proposes a texture guided weighted voting (TGWV) method which can efficiently detect foreground objects in camouflaged scenes.
no code implementations • 16 Jun 2017 • Shuai Li, Wanqing Li, Chris Cook, Ce Zhu, Yanbo Gao
Such a network with a learnable pooling function is referred to as a fully trainable network (FTN).
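One way a learnable pooling function can be realized (my illustration under assumptions, not necessarily the FTN paper's exact formulation) is a softmax-weighted pooling with a trainable temperature `beta` that interpolates between average pooling (`beta -> 0`) and max pooling (`beta -> inf`):

```python
import numpy as np

def learnable_pool(window, beta):
    # Softmax-weighted average over the pooling window; `beta` would be
    # learned jointly with the rest of the network by backpropagation.
    w = np.exp(beta * window - np.max(beta * window))  # stable softmax
    w /= w.sum()
    return float(np.sum(w * window))

window = np.array([0.1, 0.9, 0.4, 0.2])
print(learnable_pool(window, 0.0))   # ~ mean of the window (0.4)
print(learnable_pool(window, 50.0))  # ~ max of the window (0.9)
```

Because the pooling operator itself has a gradient with respect to `beta`, the network can learn how sharply to pool at each layer instead of fixing max or average pooling by hand.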