no code implementations • 19 Sep 2024 • Xian Zhong, Shengwang Hu, Wenxuan Liu, Wenxin Huang, Jianhao Ding, Zhaofei Yu, Tiejun Huang
In this paper, we propose the Hybrid Step-wise Distillation (HSD) method, tailored for neuromorphic datasets, to mitigate the notable decline in performance at lower time steps.
1 code implementation • 31 May 2024 • Jianhao Ding, Zhiyu Pan, Yujia Liu, Zhaofei Yu, Tiejun Huang
We show that membrane potential perturbation dynamics can reliably convey the intensity of a perturbation.
no code implementations • 30 May 2024 • Yujia Liu, Tong Bu, Jianhao Ding, Zecheng Hao, Tiejun Huang, Zhaofei Yu
In this paper, we propose a novel approach to enhance the robustness of SNNs through gradient sparsity regularization.
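To make the idea of a gradient sparsity regularizer more concrete, below is a minimal PyTorch sketch of one plausible form: an $\ell_1$ penalty on the gradient of the task loss with respect to the input, added to the classification loss. The model, loss choice, and weighting factor `lambda_sparse` are illustrative assumptions and do not reproduce the paper's exact objective.

```python
import torch
import torch.nn.functional as F

def loss_with_gradient_sparsity(model, x, y, lambda_sparse=1e-3):
    """Task loss plus an L1 penalty on the input gradient.

    Illustrative sketch of a gradient-sparsity regularizer only;
    the paper's actual formulation may differ.
    """
    x = x.clone().detach().requires_grad_(True)
    logits = model(x)
    task_loss = F.cross_entropy(logits, y)

    # Input gradient, kept in the graph so the penalty remains
    # differentiable with respect to the model parameters.
    input_grad, = torch.autograd.grad(task_loss, x, create_graph=True)

    penalty = input_grad.abs().sum(dim=tuple(range(1, x.dim()))).mean()
    return task_loss + lambda_sparse * penalty
```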
no code implementations • 26 Apr 2024 • Zhipeng Huang, Jianhao Ding, Zhiyu Pan, Haoran Li, Ying Fang, Zhaofei Yu, Jian K. Liu
One of the mainstream approaches to implementing deep SNNs is ANN-SNN conversion, which integrates the efficient training strategy of ANNs with the energy-saving potential and fast inference capability of SNNs.
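As a rough illustration of the rate-coding intuition behind ANN-SNN conversion in general (not the specific method of the paper above), the sketch below simulates a single integrate-and-fire neuron with soft reset; its average firing rate over T time steps approximates a clipped ReLU of the input. The threshold and time horizon are assumed values.

```python
import numpy as np

def if_firing_rate(z, threshold=1.0, T=64):
    """Simulate an integrate-and-fire neuron driven by a constant input z.

    For 0 <= z <= threshold the returned value approximates ReLU(z),
    which is the basic intuition behind rate-based ANN-SNN conversion.
    """
    v = 0.0          # membrane potential
    spikes = 0
    for _ in range(T):
        v += z                       # integrate the constant input current
        if v >= threshold:           # fire and reset by subtraction
            spikes += 1
            v -= threshold
    return threshold * spikes / T    # ~ clip(ReLU(z), 0, threshold)

for z in [-0.2, 0.3, 0.75, 1.5]:
    print(z, if_firing_rate(z))
```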
1 code implementation • CVPR 2024 • Yujia Liu, Chenxi Yang, Dingquan Li, Jianhao Ding, Tingting Jiang
Specifically, we present theoretical evidence that the magnitude of score changes is related to the $\ell_1$ norm of the model's gradient with respect to the input image.
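The $\ell_1$ norm of the input gradient referenced above can be computed directly; the snippet below is a generic sketch in which `model` and `image` are placeholders (a scalar-output quality predictor and an image tensor), not the paper's actual NR-IQA setup.

```python
import torch

def input_gradient_l1(model, image):
    """Return the L1 norm of d(score)/d(image) for a scalar-output model.

    Purely illustrative: `model` is assumed to map an image tensor of
    shape (1, C, H, W) to a single quality score.
    """
    image = image.clone().detach().requires_grad_(True)
    score = model(image).sum()       # reduce to a scalar if shape is (1, 1)
    grad, = torch.autograd.grad(score, image)
    return grad.abs().sum().item()   # l1 norm of the gradient w.r.t. the input
```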
1 code implementation • 25 Oct 2023 • Wei Fang, Yanqi Chen, Jianhao Ding, Zhaofei Yu, Timothée Masquelier, Ding Chen, Liwei Huang, Huihui Zhou, Guoqi Li, Yonghong Tian
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency by introducing neural dynamics and spike properties.
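To make "neural dynamics and spike properties" concrete, here is a minimal leaky integrate-and-fire (LIF) neuron in plain Python; the decay constant, threshold, and hard-reset rule are generic textbook assumptions, not the API or exact neuron model of any particular library.

```python
import numpy as np

def lif_neuron(inputs, tau=2.0, v_threshold=1.0, v_reset=0.0):
    """Run a leaky integrate-and-fire neuron over a 1-D input current sequence.

    Returns the binary spike train. Decay, threshold, and hard reset are
    generic choices for illustration.
    """
    v = v_reset
    spikes = np.zeros(len(inputs), dtype=np.int8)
    for t, x in enumerate(inputs):
        # leaky integration: decay toward v_reset plus the input current
        v = v + (x - (v - v_reset)) / tau
        if v >= v_threshold:         # spike and hard reset
            spikes[t] = 1
            v = v_reset
    return spikes

print(lif_neuron(np.full(20, 1.5)))   # alternating spike pattern
```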
no code implementations • 9 Jun 2023 • Jianhao Ding, Zhaofei Yu, Tiejun Huang, Jian K. Liu
The success of deep learning in the past decade is partially shrouded in the shadow of adversarial attacks.
1 code implementation • 21 Mar 2023 • Yajing Zheng, Jiyuan Zhang, Rui Zhao, Jianhao Ding, Shiyan Chen, Ruiqin Xiong, Zhaofei Yu, Tiejun Huang
SpikeCV focuses on encapsulation for spike data, standardization for dataset interfaces, modularization for vision tasks, and real-time applications for challenging scenes.
2 code implementations • ICLR 2022 • Tong Bu, Wei Fang, Jianhao Ding, Penglin Dai, Zhaofei Yu, Tiejun Huang
In this paper, we theoretically analyze the ANN-SNN conversion error and derive the estimated activation function of SNNs.
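As a hedged sketch of what an "estimated activation function" for conversion can look like, the function below implements a quantization-clip-floor form with L quantization levels, threshold lam, and shift phi; it is written from general knowledge of this line of work and the specific constants are illustrative assumptions rather than the paper's exact formula.

```python
import numpy as np

def clip_floor_activation(z, lam=1.0, L=8, phi=0.5):
    """Quantization-clip-floor activation that mimics the step-wise
    input-output curve of an SNN layer simulated for L time steps.

    lam is the firing threshold, L the number of quantization levels,
    and phi a shift term; all values here are illustrative.
    """
    return lam * np.clip(np.floor(z * L / lam + phi) / L, 0.0, 1.0)

print(clip_floor_activation(np.linspace(-0.5, 1.5, 9)))
```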
2 code implementations • 21 Feb 2023 • Zecheng Hao, Jianhao Ding, Tong Bu, Tiejun Huang, Zhaofei Yu
The experimental results show that our proposed method achieves state-of-the-art performance on CIFAR-10, CIFAR-100, and ImageNet datasets.
2 code implementations • 4 Feb 2023 • Zecheng Hao, Tong Bu, Jianhao Ding, Tiejun Huang, Zhaofei Yu
Spiking Neural Networks (SNNs) have received extensive academic attention due to their unique properties of low power consumption and high-speed computation on neuromorphic chips.
1 code implementation • CVPR 2023 • Tong Bu, Jianhao Ding, Zecheng Hao, Zhaofei Yu
Spiking Neural Networks (SNNs) have attracted significant attention due to their energy-efficient properties and potential application on neuromorphic hardware.
no code implementations • 3 Feb 2022 • Tong Bu, Jianhao Ding, Zhaofei Yu, Tiejun Huang
We evaluate our algorithm on the CIFAR-10, CIFAR-100, and ImageNet datasets and achieve state-of-the-art accuracy while using fewer time steps.
no code implementations • 29 Sep 2021 • Jianhao Ding, Jiyuan Zhang, Zhaofei Yu, Tiejun Huang
Although spiking neural networks (SNNs) show strong advantages in information encoding, power consumption, and computational capability, the underdevelopment of supervised learning algorithms remains a hindrance to training SNNs.
1 code implementation • 25 May 2021 • Jianhao Ding, Zhaofei Yu, Yonghong Tian, Tiejun Huang
We show that inference time can be reduced by optimizing the upper bound of the fit curve in the revised ANN, enabling fast inference.
no code implementations • 19 Feb 2020 • Jianhao Ding, Lansheng Han
With the rapid growth of data, cluster analysis, as a branch of unsupervised learning, still lacks a unified understanding and application of its mathematical principles.