1 code implementation • 23 Jan 2025 • Xuerui Qiu, Jieyuan Zhang, Wenjie Wei, Honglin Cao, Junsheng Guo, Rui-Jie Zhu, Yimeng Shan, Yang Yang, Malu Zhang, Haizhou Li
To mitigate this issue, we take inspiration from mutual information entropy and propose a bi-level optimization strategy to rectify the information distribution in Q-SDSA.
1 code implementation • 10 Dec 2024 • Xuerui Qiu, Man Yao, Jieyuan Zhang, Yuhong Chou, Ning Qiao, Shibo Zhou, Bo Xu, Guoqi Li
To address this issue, we first introduce the Spike Voxel Coding (SVC) scheme, which encodes the 3D point clouds into a sparse spike train space, reducing the storage requirements and saving time on point cloud preprocessing.
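The idea of encoding a point cloud into a sparse spike-train space can be sketched as follows. This is a minimal illustration only, assuming a simple occupancy-based rate code; the function name, grid size, and time-step count are hypothetical and not the paper's actual SVC formulation.

```python
import numpy as np

def spike_voxel_encode(points, grid=32, timesteps=4, rng_seed=0):
    """Hypothetical sketch: voxelize a 3D point cloud and emit a binary
    spike train per voxel, rate-coded by voxel occupancy density."""
    rng = np.random.default_rng(rng_seed)
    # Normalize points into the unit cube, then bin into a grid^3 lattice.
    p = (points - points.min(axis=0)) / (np.ptp(points, axis=0) + 1e-9)
    idx = np.clip((p * grid).astype(int), 0, grid - 1)
    occupancy = np.zeros((grid, grid, grid), dtype=np.float32)
    np.add.at(occupancy, tuple(idx.T), 1.0)
    density = occupancy / (occupancy.max() + 1e-9)
    # Rate-code density into T binary spike frames: denser voxels
    # spike more often; empty voxels never spike, so the train is sparse.
    spikes = (rng.random((timesteps, grid, grid, grid)) < density)
    return spikes.astype(np.uint8)

points = np.random.default_rng(1).random((1000, 3))
spikes = spike_voxel_encode(points)
print(spikes.shape)  # (4, 32, 32, 32)
```

Because at most one voxel per input point can be occupied, the resulting spike tensor is overwhelmingly zero, which is what makes spike-train storage cheaper than dense voxel features.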
1 code implementation • 25 Nov 2024 • Man Yao, Xuerui Qiu, Tianxiang Hu, Jiakui Hu, Yuhong Chou, Keyu Tian, Jianxing Liao, Luziwei Leng, Bo Xu, Guoqi Li
This work enables SNNs to match ANN performance while maintaining the low-power advantage, marking a significant step towards SNNs as a general visual backbone.
1 code implementation • 22 Aug 2024 • Haopeng Li, Jinyue Yang, Kexin Wang, Xuerui Qiu, Yuhong Chou, Xin Li, Guoqi Li
On the ImageNet1K 256×256 benchmark, our best AiM model achieves an FID of 2.21, surpassing all existing AR models of comparable parameter count and competing strongly with diffusion models, with 2 to 10 times faster inference.
1 code implementation • 29 Jul 2024 • Keming Wu, Man Yao, Yuhong Chou, Xuerui Qiu, Rui Yang, Bo Xu, Guoqi Li
Spiking Neural Networks (SNNs) have received widespread attention due to their unique neuronal dynamics and low-power nature.
1 code implementation • 5 Jun 2024 • Xuerui Qiu, Zheng Luan, Zhaorui Wang, Rui-Jie Zhu
Furthermore, our ALIF neuron model achieves remarkable classification accuracy on the MNIST (99.78%) and CIFAR-10 (93.89%) datasets, demonstrating the effectiveness of learning adaptive thresholds for spiking neurons.
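The adaptive-threshold mechanism behind an ALIF neuron can be sketched in a few lines. This is an illustrative discrete-time version with hand-picked constants; in the paper the threshold dynamics are learned, and the parameter names below (`tau_v`, `tau_theta`, `beta`) are assumptions, not the authors' notation.

```python
import numpy as np

def alif_step(v, theta, x, tau_v=0.9, tau_theta=0.95, theta0=1.0, beta=0.2):
    """One step of an adaptive-threshold LIF neuron (illustrative sketch).
    The effective threshold theta0 + theta rises after each spike and
    decays back toward theta0, so sustained input fires less over time."""
    v = tau_v * v + x                          # leaky membrane integration
    spike = (v >= theta0 + theta).astype(np.float32)
    v = v * (1.0 - spike)                      # hard reset on spike
    theta = tau_theta * theta + beta * spike   # threshold adaptation
    return v, theta, spike

v, theta = np.zeros(3), np.zeros(3)
spike_log = []
for _ in range(20):
    v, theta, s = alif_step(v, theta, x=np.full(3, 0.5))
    spike_log.append(s)
rate = np.stack(spike_log).mean(axis=0)
print(rate)  # firing rate drops as the threshold adapts upward
```

The key contrast with a plain LIF neuron is the extra state variable `theta`: each spike raises the firing bar, giving the neuron a built-in firing-rate homeostasis.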
1 code implementation • 26 May 2024 • Jiakui Hu, Man Yao, Xuerui Qiu, Yuhong Chou, Yuxuan Cai, Ning Qiao, Yonghong Tian, Bo Xu, Guoqi Li
This work is expected to break the technical bottleneck in which memory cost and training time grow significantly for large-scale SNNs, while maintaining high performance and low inference energy cost.
no code implementations • 22 May 2024 • Yimeng Shan, Malu Zhang, Rui-Jie Zhu, Xuerui Qiu, Jason K. Eshraghian, Haicheng Qu
To address this issue, we have designed a Spiking Multiscale Attention (SMA) module that captures multiscale spatiotemporal interaction information.
no code implementations • 1 Mar 2024 • Wenjie Wei, Malu Zhang, Jilin Zhang, Ammar Belatreche, Jibin Wu, Zijing Xu, Xuerui Qiu, Hong Chen, Yang Yang, Haizhou Li
Specifically, we introduce two novel event-driven learning methods: the spike-timing-dependent event-driven (STD-ED) and membrane-potential-dependent event-driven (MPD-ED) algorithms.
1 code implementation • 11 Nov 2023 • Yimeng Shan, Xuerui Qiu, Rui-Jie Zhu, Jason K. Eshraghian, Malu Zhang, Haicheng Qu
As demand for higher SNN performance grows, training deeper networks becomes essential, and residual learning remains the pivotal method for training deep neural networks.
1 code implementation • 23 Oct 2023 • Haoyu Deng, Ruijie Zhu, Xuerui Qiu, Yule Duan, Malu Zhang, LiangJian Deng
Then, in AMC, we exploit the inverse procedure of the tensor decomposition process to combine the three tensors into the attention map using a so-called connecting factor.
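Recombining three factor tensors into an attention map, in the spirit of inverting a CP-style tensor decomposition, can be sketched as below. The shapes, the function name, and the use of a rank-wise weight vector as the "connecting factor" are assumptions for illustration, not the paper's exact AMC formulation.

```python
import numpy as np

def combine_factors(U, V, W, lam):
    """Hypothetical sketch: rebuild a 3D attention map from three factor
    matrices via a rank-wise connecting factor lam, i.e. the inverse of a
    rank-R CP decomposition: A = sum_r lam[r] * U[:,r] x V[:,r] x W[:,r]."""
    # U: (I, R), V: (J, R), W: (K, R), lam: (R,) -> A: (I, J, K)
    return np.einsum('r,ir,jr,kr->ijk', lam, U, V, W)

rng = np.random.default_rng(0)
U, V, W = rng.random((5, 3)), rng.random((6, 3)), rng.random((4, 3))
lam = rng.random(3)
A = combine_factors(U, V, W, lam)
print(A.shape)  # (5, 6, 4)
```

The appeal of this construction is parameter efficiency: a full (5, 6, 4) map costs 120 entries, while the rank-3 factors plus connecting factor cost only 48.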
1 code implementation • 12 Aug 2023 • Xuerui Qiu, Rui-Jie Zhu, Yuhong Chou, Zhaorui Wang, Liang-Jian Deng, Guoqi Li
Experiments on CIFAR10/100 and ImageNet datasets demonstrate that GAC achieves state-of-the-art accuracy with remarkable efficiency.
Ranked #111 on Image Classification on CIFAR-10