1 code implementation • 2 Apr 2022 • Jing-Xiao Liao, Bo-Jian Hou, Hang-Cheng Dong, Hao Zhang, Xiaoge Zhang, Jinwei Sun, Shiping Zhang, Feng-Lei Fan
Encouraged by this inspiring theoretical result on heterogeneous networks, we directly integrate conventional and quadratic neurons in an autoencoder to build a new type of heterogeneous autoencoder.
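As a rough illustration of the idea, not the paper's exact architecture, the sketch below mixes ordinary linear layers with a commonly used quadratic-neuron form, (W_r x + b_r) ⊙ (W_g x + b_g) + W_b(x ⊙ x), inside a small PyTorch autoencoder; the layer and class names are our own.

```python
import torch
import torch.nn as nn

class QuadraticLayer(nn.Module):
    """One common quadratic-neuron form: (W_r x + b_r) * (W_g x + b_g) + W_b (x * x)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear_r = nn.Linear(in_dim, out_dim)
        self.linear_g = nn.Linear(in_dim, out_dim)
        self.linear_b = nn.Linear(in_dim, out_dim)

    def forward(self, x):
        return self.linear_r(x) * self.linear_g(x) + self.linear_b(x * x)

class HeterogeneousAutoencoder(nn.Module):
    """Encoder mixes a conventional and a quadratic layer; decoder stays conventional."""
    def __init__(self, in_dim=64, hidden=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 32), nn.ReLU(),
            QuadraticLayer(32, hidden), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(hidden, 32), nn.ReLU(),
            nn.Linear(32, in_dim),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

x = torch.randn(8, 64)
print(HeterogeneousAutoencoder()(x).shape)  # torch.Size([8, 64])
```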
2 code implementations • 14 Jan 2022 • Dayang Wang, Feng-Lei Fan, Bo-Jian Hou, Hao Zhang, Zhen Jia, Boce Zhou, Rongjie Lai, Hengyong Yu, Fei Wang
A neural network with the widely-used ReLU activation has been shown to partition the sample space into many convex polytopes for prediction.
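The sketch below illustrates that claim under standard assumptions: for a small ReLU MLP, each input's on/off pattern of hidden ReLUs indexes the convex polytope (linear region) it falls in, so counting distinct patterns over random samples counts the regions those samples hit.

```python
import torch
import torch.nn as nn

# A small ReLU MLP; every input lies in a linear region (convex polytope)
# identified by its on/off pattern of hidden ReLUs.
torch.manual_seed(0)
net = nn.Sequential(nn.Linear(2, 16), nn.ReLU(),
                    nn.Linear(16, 16), nn.ReLU(),
                    nn.Linear(16, 1))

def activation_pattern(x):
    """Return the binary ReLU pattern that indexes x's polytope."""
    pattern, h = [], x
    for layer in net:
        h = layer(h)
        if isinstance(layer, nn.ReLU):
            pattern.append((h > 0).flatten())
    return tuple(torch.cat(pattern).tolist())

points = torch.rand(2000, 2) * 4 - 2  # samples in [-2, 2]^2
regions = {activation_pattern(p.unsqueeze(0)) for p in points}
print(f"{len(regions)} distinct polytopes hit by 2000 samples")
```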
no code implementations • 22 Jul 2020 • Bo-Jian Hou, Yu-Hu Yan, Peng Zhao, Zhi-Hua Zhou
Our framework is able to adapt its behavior to different storage budgets when learning from feature evolvable streams with unlabeled data.
no code implementations • 27 Apr 2019 • Bo-Jian Hou, Lijun Zhang, Zhi-Hua Zhou
Learning with feature evolution studies the scenario where the features of the data streams can evolve, i.e., old features vanish and new features emerge.
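The toy sketch below illustrates only this setting, not the paper's algorithm: during a brief overlap period both feature spaces are observed, so a least-squares map from the new features to the vanished ones can be learned and the old model reused afterwards; the linear-map assumption and all names are our own.

```python
import numpy as np

rng = np.random.default_rng(0)
d_old, d_new, T1, overlap = 5, 8, 200, 30

# Stage 1: only old features arrive; fit a simple least-squares model on them.
X_old = rng.normal(size=(T1, d_old))
w_true = rng.normal(size=d_old)
y = X_old @ w_true + 0.1 * rng.normal(size=T1)
w_old = np.linalg.lstsq(X_old, y, rcond=None)[0]

# Overlap period: both feature spaces are observed; learn a map new -> old.
M_true = rng.normal(size=(d_new, d_old))
X_new_ov = rng.normal(size=(overlap, d_new))
X_old_ov = X_new_ov @ M_true                       # paired observations
M_hat = np.linalg.lstsq(X_new_ov, X_old_ov, rcond=None)[0]

# Stage 2: old features vanish; recover them from new features, reuse the old model.
x_new = rng.normal(size=d_new)
print((x_new @ M_hat) @ w_old)
```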
no code implementations • 25 Oct 2018 • Bo-Jian Hou, Zhi-Hua Zhou
With the learned FSA and through experiments on artificial and real datasets, we find that the FSA is more trustworthy than the RNN from which it was learned, which gives the FSA a chance to substitute for RNNs in applications involving human lives or dangerous facilities.
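One common way to distill an FSA from an RNN (the paper's procedure may differ in detail) is to cluster hidden states into FSA states and read transitions off the trajectories, as in this hedged sketch; the alphabet, cluster count, and variable names are illustrative.

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

torch.manual_seed(0)
rnn = nn.RNN(input_size=3, hidden_size=8, batch_first=True)

# Run the RNN over a few symbolic sequences (one-hot inputs over alphabet {0, 1, 2}).
sequences = torch.randint(0, 3, (20, 10))
inputs = torch.eye(3)[sequences]                       # (20, 10, 3)
hidden, _ = rnn(inputs)                                # (20, 10, 8)

# Cluster hidden states; each cluster becomes one FSA state.
H = hidden.detach().reshape(-1, 8).numpy()
states = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(H).reshape(20, 10)

# Record (state, input symbol) -> next state as the FSA's transition table.
transitions = {}
for seq, path in zip(sequences.tolist(), states.tolist()):
    for t in range(len(path) - 1):
        transitions.setdefault((path[t], seq[t + 1]), set()).add(path[t + 1])
print(transitions)
```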
no code implementations • NeurIPS 2017 • Bo-Jian Hou, Lijun Zhang, Zhi-Hua Zhou
To benefit from the recovered features, we develop two ensemble methods.
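As a generic illustration only (the paper's two ensemble methods are not reproduced here), the sketch below combines a predictor on recovered features with a predictor on new features via exponentially weighted averaging; the function name and weighting scheme are our own choices.

```python
import numpy as np

def weighted_ensemble(losses1, losses2, preds1, preds2, eta=0.5):
    """Exponentially weighted combination of two prediction streams, e.g. a model
    on recovered old features and a model on newly emerged features."""
    w = np.array([0.5, 0.5])
    combined = []
    for l1, l2, p1, p2 in zip(losses1, losses2, preds1, preds2):
        combined.append(w[0] * p1 + w[1] * p2)
        w = w * np.exp(-eta * np.array([l1, l2]))   # down-weight the lossier model
        w /= w.sum()
    return np.array(combined)

# Toy usage: the second model is more accurate, so its weight should dominate over time.
rng = np.random.default_rng(0)
y = rng.normal(size=100)
p1 = y + rng.normal(scale=1.0, size=100)
p2 = y + rng.normal(scale=0.2, size=100)
out = weighted_ensemble((p1 - y) ** 2, (p2 - y) ** 2, p1, p2)
print(np.mean((out - y) ** 2))
```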