1 code implementation • 19 Feb 2024 • Mingqing Xiao, Qingyan Meng, Zongpeng Zhang, Di He, Zhouchen Lin
Neuromorphic computing with spiking neural networks is promising for energy-efficient artificial intelligence (AI) applications.
1 code implementation • 1 Feb 2023 • Mingqing Xiao, Qingyan Meng, Zongpeng Zhang, Yisen Wang, Zhouchen Lin
In this paper, we study spike-based implicit differentiation on the equilibrium state (SPIDE), which extends the recently proposed training method, implicit differentiation on the equilibrium state (IDE), to supervised learning with purely spike-based computation, demonstrating the potential for energy-efficient training of SNNs.
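The core idea behind IDE-style training can be sketched as follows: once the network has settled to an equilibrium z* = f(z*; θ), the gradient of a loss at z* follows from the implicit function theorem, with no backpropagation through the solver's iterations. The scalar map f, the loss L = z*², and the parameter names below are illustrative assumptions, not the paper's actual network.

```python
import numpy as np

# Sketch of implicit differentiation on the equilibrium state.
# f, the loss, and all constants here are illustrative assumptions.

def f(z, w, x):
    return np.tanh(w * z + x)          # scalar equilibrium map z -> f(z)

def solve(w, x, iters=200):
    z = 0.0
    for _ in range(iters):
        z = f(z, w, x)                 # fixed-point iteration to z*
    return z

def implicit_grad(w, x):
    z = solve(w, x)
    s = 1.0 - np.tanh(w * z + x) ** 2  # tanh' at the equilibrium
    df_dz = s * w                      # d f / d z  at z*
    df_dw = s * z                      # d f / d w  at z*
    dz_dw = df_dw / (1.0 - df_dz)      # implicit function theorem
    return 2.0 * z * dz_dw             # chain rule for the loss L = z*^2

w, x = 0.5, 0.3
g = implicit_grad(w, x)

# Sanity check against a finite-difference gradient of the same loss
eps = 1e-6
fd = (solve(w + eps, x) ** 2 - solve(w - eps, x) ** 2) / (2 * eps)
assert abs(g - fd) < 1e-5
```

Because only the equilibrium point is differentiated, the forward solver (here plain fixed-point iteration) is free to be replaced by purely spike-based dynamics, which is what motivates the energy-efficiency claim.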
1 code implementation • 9 Oct 2022 • Mingqing Xiao, Qingyan Meng, Zongpeng Zhang, Di He, Zhouchen Lin
OTTT connects, for the first time, the two mainstream supervised SNN training methods, BPTT with SG and spike representation-based training, and does so in a biologically plausible form.
Ranked #3 on Event data classification on CIFAR10-DVS
no code implementations • 27 May 2022 • Zenan Ling, Xingyu Xie, Qiuhao Wang, Zongpeng Zhang, Zhouchen Lin
A deep equilibrium model (DEQ) is implicitly defined through an equilibrium point of an infinite-depth weight-tied model with input injection.
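The definition above can be made concrete with a minimal sketch: the DEQ output is the fixed point z* of a single weight-tied layer f(z, x) = tanh(Wz + Ux), which is equivalent to stacking that layer to infinite depth. The choice of tanh, the random W and U, and the naive fixed-point solver are illustrative assumptions.

```python
import numpy as np

# Minimal DEQ forward-pass sketch (assumed layer and solver, not the paper's).
rng = np.random.default_rng(0)
d = 8
W = rng.normal(scale=0.3 / np.sqrt(d), size=(d, d))  # small norm -> contraction
U = rng.normal(scale=1.0 / np.sqrt(d), size=(d, d))

def f(z, x):
    # Weight-tied layer with input injection: the same W, U at every "depth"
    return np.tanh(W @ z + U @ x)

def deq_forward(x, tol=1e-10, max_iter=500):
    # Iterate the layer until it reaches its equilibrium point z* = f(z*, x)
    z = np.zeros(d)
    for _ in range(max_iter):
        z_next = f(z, x)
        if np.linalg.norm(z_next - z) < tol:
            break
        z = z_next
    return z

x = rng.normal(size=d)
z_star = deq_forward(x)
assert np.allclose(z_star, f(z_star, x), atol=1e-8)  # equilibrium reached
```

Scaling W to have small norm keeps f a contraction, so the naive iteration converges; practical DEQs use faster root-finding solvers for the same equation.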
1 code implementation • NeurIPS 2021 • Mingqing Xiao, Qingyan Meng, Zongpeng Zhang, Yisen Wang, Zhouchen Lin
In this work, we consider feedback spiking neural networks, which are more brain-like, and propose a novel training method that does not rely on the exact reverse of the forward computation.