no code implementations • 17 Aug 2023 • Minzheng Li, Xiangzhong Fang, Haixin Yang
We envision a machine capable of solving mathematical problems.
1 code implementation • CVPR 2023 • Linglan Zhao, Jing Lu, Yunlu Xu, Zhanzhan Cheng, Dashan Guo, Yi Niu, Xiangzhong Fang
While knowledge distillation, a prevailing technique in CIL, can alleviate the catastrophic forgetting of older classes by regularizing outputs between the current and previous models, it fails to consider the overfitting risk of novel classes in FSCIL.
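The distillation regularizer mentioned here can be sketched as a soft-target loss that keeps the current model's old-class predictions close to those of the frozen previous model. The snippet below is a generic formulation under assumed temperature and weighting choices, not the paper's exact objective.

```python
import torch.nn.functional as F

def distillation_loss(old_logits, new_logits, temperature=2.0):
    """Generic knowledge-distillation regularizer for class-incremental learning.

    old_logits: outputs of the frozen previous-task model (teacher).
    new_logits: outputs of the current model restricted to the old classes.
    The KL term discourages drift on old classes, mitigating catastrophic forgetting.
    """
    old_soft = F.softmax(old_logits / temperature, dim=1)
    new_log_soft = F.log_softmax(new_logits / temperature, dim=1)
    return F.kl_div(new_log_soft, old_soft, reduction="batchmean") * temperature ** 2

# Illustrative total objective (lambda_kd is an assumed weight):
# loss = F.cross_entropy(new_logits_all, labels) + lambda_kd * distillation_loss(old_logits, new_logits_old)
```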
1 code implementation • 27 Sep 2022 • Zhengyan Tong, Xiaohang Wang, Shengchao Yuan, Xuanhong Chen, Junjie Wang, Xiangzhong Fang
Comparison with existing state-of-the-art oil painting techniques shows that our results have higher fidelity and more realistic textures.
no code implementations • 21 Oct 2021 • Linlan Zhao, Dashan Guo, Yunlu Xu, Liang Qiao, Zhanzhan Cheng, ShiLiang Pu, Yi Niu, Xiangzhong Fang
Few-shot learning (FSL) aims to learn models that generalize to novel classes with limited training samples.
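As a hedged illustration of the FSL setting (not the method proposed in this paper), the sketch below classifies query samples in an N-way K-shot episode by distance to class prototypes computed from the support set; the embedding dimensions and episode shapes are assumptions.

```python
import torch

def prototype_classify(support_feats, support_labels, query_feats, n_way):
    """Assign each query sample to the nearest class prototype.

    support_feats: (n_way * k_shot, d) embeddings of labeled support samples.
    support_labels: (n_way * k_shot,) integer labels in [0, n_way).
    query_feats: (n_query, d) embeddings of unlabeled query samples.
    """
    prototypes = torch.stack(
        [support_feats[support_labels == c].mean(dim=0) for c in range(n_way)]
    )                                             # (n_way, d)
    dists = torch.cdist(query_feats, prototypes)  # (n_query, n_way)
    return dists.argmin(dim=1)                    # predicted class per query
```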
no code implementations • 3 Jul 2019 • Wei Li, Zehuan Yuan, Dashan Guo, Lei Huang, Xiangzhong Fang, Changhu Wang
To perform action detection, we design a 3D convolutional network with skip connections for tube classification and regression.
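A minimal sketch of such an architecture is below: a small 3D convolutional backbone with an additive skip connection, followed by separate classification and regression heads over tube clips. The layer sizes, number of classes, and 4-coordinate box parameterization are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class TubeHead3D(nn.Module):
    """Toy 3D-conv network with a skip connection and two output heads."""

    def __init__(self, in_channels=3, num_classes=24):
        super().__init__()
        self.conv1 = nn.Conv3d(in_channels, 64, kernel_size=3, padding=1)
        self.conv2 = nn.Conv3d(64, 64, kernel_size=3, padding=1)
        self.relu = nn.ReLU(inplace=True)
        self.pool = nn.AdaptiveAvgPool3d(1)
        self.cls_head = nn.Linear(64, num_classes)  # tube classification
        self.reg_head = nn.Linear(64, 4)            # per-tube box regression

    def forward(self, x):                 # x: (batch, C, T, H, W) tube clip
        h = self.relu(self.conv1(x))
        h = h + self.relu(self.conv2(h))  # additive skip connection
        h = self.pool(h).flatten(1)       # (batch, 64)
        return self.cls_head(h), self.reg_head(h)

# Usage: logits, boxes = TubeHead3D()(torch.randn(2, 3, 8, 56, 56))
```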
no code implementations • 9 Oct 2018 • Wei Li, Zehuan Yuan, Xiangzhong Fang, Changhu Wang
Attention mechanisms have been widely used in Visual Question Answering (VQA) solutions due to their capacity to model deep cross-domain interactions.
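As a minimal, hedged sketch of question-guided visual attention (a generic formulation, not necessarily the mechanism used in this paper), the question embedding scores each image region and the regions are pooled by the resulting weights; the feature dimensions are assumptions.

```python
import torch
import torch.nn as nn

class QuestionGuidedAttention(nn.Module):
    """Pool image-region features with weights derived from the question."""

    def __init__(self, region_dim=2048, question_dim=1024, hidden_dim=512):
        super().__init__()
        self.proj_v = nn.Linear(region_dim, hidden_dim)
        self.proj_q = nn.Linear(question_dim, hidden_dim)
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, regions, question):
        # regions: (batch, num_regions, region_dim); question: (batch, question_dim)
        joint = torch.tanh(self.proj_v(regions) + self.proj_q(question).unsqueeze(1))
        weights = torch.softmax(self.score(joint), dim=1)  # (batch, num_regions, 1)
        return (weights * regions).sum(dim=1)              # attended visual feature
```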