no code implementations • 6 Dec 2024 • Shuren Qi, Fei Wang, Tieyong Zeng, Fenglei Fan
Integrating invariance into data representations is a principled design in intelligent systems and web applications.
1 code implementation • 6 Oct 2024 • Yanrui Du, Sendong Zhao, Jiawei Cao, Ming Ma, Danyang Zhao, Fenglei Fan, Ting Liu, Bing Qin
During IFT, the ML-LR strategy applies differentiated learning rates to Mods$_{Robust}$ and the remaining modules.
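A minimal sketch of how such module-level differentiated learning rates might be wired up in PyTorch; the toy model, the choice of which parameters play the role of Mods$_{Robust}$, and the learning-rate values are illustrative assumptions, not the paper's configuration:

```python
import torch
import torch.nn as nn

# Toy stand-in for the fine-tuned model; in practice this would be the LLM.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

# Assumption: the first layer's parameters play the role of Mods_Robust.
robust_params = list(model[0].parameters())
other_params = [p for p in model.parameters()
                if not any(p is q for q in robust_params)]

# One optimizer, two parameter groups with differentiated learning rates.
optimizer = torch.optim.AdamW([
    {"params": robust_params, "lr": 1e-6},  # smaller updates for the robust modules
    {"params": other_params,  "lr": 2e-5},  # standard fine-tuning rate elsewhere
])
```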
1 code implementation • 1 Sep 2024 • Fenglei Fan, Juntong Fan, Dayang Wang, Jingbo Zhang, Zelin Dong, Shijun Zhang, Ge Wang, Tieyong Zeng
The rapid growth of large models' size has far outpaced that of GPU memory.
no code implementations • CVPR 2024 • Zelin Zhao, Fenglei Fan, Wenlong Liao, Junchi Yan
Many contemporary studies utilize grid-based models for neural field representation, but a systematic analysis of such models is still missing, which hinders their further improvement.
no code implementations • 29 Nov 2023 • Xiaoge Zhang, Xiao-Lin Wang, Fenglei Fan, Yiu-ming Cheung, Indranil Bose
Regarding the loss function, both intermediate and leaf nodes in the DAG are treated as target outputs during CINN training, so as to drive the co-learning of causal relationships among different types of nodes.
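A minimal PyTorch sketch of supervising both intermediate and leaf nodes; the toy architecture, the number of nodes, the MSE losses, and the equal weighting are illustrative assumptions, not the CINN design itself:

```python
import torch
import torch.nn as nn

class ToyDAGNet(nn.Module):
    """Toy network with separate heads for intermediate and leaf DAG nodes."""
    def __init__(self, in_dim=8, hid=16):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hid), nn.ReLU())
        self.intermediate_head = nn.Linear(hid, 3)  # e.g. 3 intermediate nodes
        self.leaf_head = nn.Linear(hid, 1)          # e.g. 1 leaf (final) node

    def forward(self, x):
        h = self.backbone(x)
        return self.intermediate_head(h), self.leaf_head(h)

net = ToyDAGNet()
x = torch.randn(32, 8)
y_intermediate = torch.randn(32, 3)   # targets for intermediate nodes
y_leaf = torch.randn(32, 1)           # targets for leaf nodes

pred_i, pred_l = net(x)
# Both node types are treated as target outputs, so the loss co-learns them;
# the equal weighting here is an assumption for illustration.
loss = nn.functional.mse_loss(pred_i, y_intermediate) + \
       nn.functional.mse_loss(pred_l, y_leaf)
loss.backward()
```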
2 code implementations • 28 Feb 2022 • Dayang Wang, Fenglei Fan, Zhan Wu, Rui Liu, Fei Wang, Hengyong Yu
Furthermore, an overlapped inference mechanism is introduced to effectively eliminate the boundary artifacts that are common in encoder-decoder-based denoising models.
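A minimal sketch of overlapped patch inference: the image is denoised in overlapping tiles and the overlaps are averaged so that patch boundaries do not show. The patch size, stride, and plain averaging below are illustrative choices, not the paper's exact mechanism:

```python
import torch

def overlapped_inference(img, denoise, patch=64, stride=32):
    """Denoise a 2-D image tile by tile with overlap and average the overlaps.

    `denoise` maps a (1, 1, patch, patch) tensor to a tensor of the same shape.
    """
    H, W = img.shape
    out = torch.zeros_like(img)
    weight = torch.zeros_like(img)
    ys = list(range(0, H - patch + 1, stride))
    xs = list(range(0, W - patch + 1, stride))
    if ys[-1] != H - patch:
        ys.append(H - patch)   # cover the bottom border
    if xs[-1] != W - patch:
        xs.append(W - patch)   # cover the right border
    for y in ys:
        for x in xs:
            tile = img[y:y + patch, x:x + patch].unsqueeze(0).unsqueeze(0)
            pred = denoise(tile).squeeze()
            out[y:y + patch, x:x + patch] += pred
            weight[y:y + patch, x:x + patch] += 1.0
    return out / weight.clamp(min=1.0)

# Usage with an identity "denoiser" as a stand-in for the trained model.
noisy = torch.randn(128, 128)
clean = overlapped_inference(noisy, denoise=lambda t: t)
```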
1 code implementation • 6 Nov 2020 • Chuang Niu, Mengzhou Li, Fenglei Fan, Weiwen Wu, Xiaodong Guo, Qing Lyu, Ge Wang
Limited by the independent-noise assumption, current unsupervised denoising methods cannot handle correlated noise, such as that in CT images.
no code implementations • 8 Jul 2020 • Chuang Niu, Wenxiang Cong, Fenglei Fan, Hongming Shan, Mengzhou Li, Jimin Liang, Ge Wang
Deep neural network based methods have achieved promising results for CT metal artifact reduction (MAR), most of which use many synthesized paired images for training.
1 code implementation • 8 Jan 2020 • Fenglei Fan, JinJun Xiong, Mengzhou Li, Ge Wang
Deep learning, as represented by deep neural networks (DNNs), has achieved great success in many important areas that deal with text, images, videos, graphs, and so on.
1 code implementation • 17 Jan 2019 • Fenglei Fan, Hongming Shan, Mannudeep K. Kalra, Ramandeep Singh, Guhan Qian, Matthew Getzin, Yueyang Teng, Juergen Hahn, Ge Wang
Inspired by the complexity and diversity of biological neurons, our group proposed quadratic neurons, which replace the inner product in conventional artificial neurons with a quadratic operation on the input data, thereby enhancing the capability of individual neurons.
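A minimal sketch of a layer of quadratic neurons in PyTorch. The particular low-rank parameterization below, $(x^\top w_r + b_r)(x^\top w_g + b_g) + (x \odot x)^\top w_b + c$, is shown as an illustrative assumption rather than the papers' precise definition:

```python
import torch
import torch.nn as nn

class QuadraticNeuronLayer(nn.Module):
    """Layer of quadratic neurons: the inner product of a conventional neuron
    is replaced by a quadratic function of the input vector (illustrative
    low-rank parameterization)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.r = nn.Linear(in_dim, out_dim)
        self.g = nn.Linear(in_dim, out_dim)
        self.b = nn.Linear(in_dim, out_dim)

    def forward(self, x):
        # (x·w_r + b_r) * (x·w_g + b_g) + (x*x)·w_b + c
        return self.r(x) * self.g(x) + self.b(x * x)

layer = QuadraticNeuronLayer(8, 4)
out = layer(torch.randn(32, 8))   # shape: (32, 4)
```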
no code implementations • 31 Dec 2018 • Fenglei Fan, Mengzhou Li, Yueyang Teng, Ge Wang
Recently, deep learning has become the main focus of machine learning research and has greatly impacted many important fields.
1 code implementation • 22 Nov 2018 • Fenglei Fan, Dayang Wang, Hengtao Guo, Qikui Zhu, Pingkun Yan, Ge Wang, Hengyong Yu
In this paper, we investigate the expressivity and generalizability of a novel sparse shortcut topology.
no code implementations • 31 Jul 2018 • Fenglei Fan, JinJun Xiong, Ge Wang
(4) To approximate the same class of functions with the same error bound, can a quantized quadratic network use fewer weights than a quantized conventional network?
no code implementations • 4 Jul 2018 • Fenglei Fan, Ge Wang
Since traditional neural networks and their second-order counterparts can represent each other, and fuzzy logic operations are naturally implemented in second-order neural networks, it is plausible to explain how a deep neural network works by using a second-order network as the system model.
no code implementations • 15 Feb 2018 • Fenglei Fan, Ziyu Su, Yueyang Teng, Ge Wang
In manifold learning, it is assumed that high-dimensional sample/data points lie on a low-dimensional manifold.
no code implementations • 5 Jan 2018 • Fenglei Fan, Ge Wang
Inspired by the success that neural networks, the mainstream approach in machine learning, have brought to many application areas, here we propose to use this approach to decode hidden correlations among pseudo-random data and predict events accordingly.
no code implementations • 17 Aug 2017 • Fenglei Fan, Wenxiang Cong, Ge Wang
The artificial neural network is a popular framework in machine learning.
no code implementations • 26 Apr 2017 • Fenglei Fan, Wenxiang Cong, Ge Wang
Here we investigate the possibility of replacing the inner product with a quadratic function of the input vector, thereby upgrading the 1st order neuron to the 2nd order neuron, empowering individual neurons, and facilitating the optimization of neural networks.