no code implementations • 9 Mar 2024 • Qu Yang, Qianhui Liu, Nan Li, Meng Ge, Zeyang Song, Haizhou Li
Spiking Neural Networks (SNNs) are known to be biologically plausible and power-efficient.
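The power efficiency claim comes from event-driven computation: a spiking neuron only triggers downstream work when it actually fires. A minimal sketch of a leaky integrate-and-fire (LIF) neuron step, the standard SNN building block (illustrative only; not this paper's specific model):

```python
import numpy as np

def lif_step(v, i_in, v_th=1.0, tau=0.9):
    """One discrete-time step of a leaky integrate-and-fire neuron.

    The membrane potential v leaks by factor tau and integrates the
    input current; when it crosses the threshold v_th the neuron emits
    a binary spike and the potential is hard-reset to zero.
    """
    v = tau * v + i_in
    spike = (v >= v_th).astype(float)
    v = v * (1.0 - spike)  # hard reset for neurons that spiked
    return v, spike

# Sub-threshold input: the neuron stays silent, so no downstream
# synaptic computation is triggered -- the source of power savings.
v, s = lif_step(np.array([0.0]), np.array([0.3]))
```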
no code implementations • 23 Oct 2023 • Qu Yang, Malu Zhang, Jibin Wu, Kay Chen Tan, Haizhou Li
With TTFS coding, we can achieve computation savings of up to orders of magnitude over ANNs and other rate-based SNNs.
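The savings follow from the coding scheme itself: under time-to-first-spike (TTFS) coding, each neuron fires at most once, with stronger inputs firing earlier, so synaptic updates scale with the number of neurons rather than with spike counts over time. A hedged sketch of the basic idea (the `ttfs_encode` helper and its linear mapping are illustrative assumptions, not the paper's exact scheme):

```python
import numpy as np

def ttfs_encode(x, t_max=100):
    """Time-to-first-spike coding: stronger input -> earlier spike.

    x: intensities in [0, 1]; returns integer spike times in
    [0, t_max], with t_max assigned to zero input.
    """
    x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
    # Linear latency map: intensity 1.0 fires at t=0, 0.0 at t=t_max.
    return np.round((1.0 - x) * t_max).astype(int)

# One spike per neuron, versus up to t_max spikes per neuron
# under rate coding over the same window.
times = ttfs_encode([1.0, 0.5, 0.0])
```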
1 code implementation • 25 Aug 2023 • Shimin Zhang, Qu Yang, Chenxiang Ma, Jibin Wu, Haizhou Li, Kay Chen Tan
The identification of sensory cues associated with potential opportunities and dangers is frequently complicated by unrelated events that separate useful cues by long delays.
1 code implementation • 26 May 2023 • Xinyi Chen, Qu Yang, Jibin Wu, Haizhou Li, Kay Chen Tan
As an initial exploration in this direction, we propose a hybrid neural coding and learning framework, which encompasses a neural coding zoo with diverse neural coding schemes discovered in neuroscience.
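A "coding zoo" can be pictured as a set of interchangeable encoders, each turning the same stimulus into spikes differently. A minimal sketch contrasting two common schemes, rate coding and latency (temporal) coding (function names and the dict structure are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def rate_encode(x, T=20):
    """Rate coding: per-step spike probability proportional to intensity."""
    return (rng.random((T, len(x))) < np.asarray(x, dtype=float)).astype(int)

def latency_encode(x, T=20):
    """Latency coding: a single spike, earlier for stronger input."""
    t = np.round((1.0 - np.asarray(x, dtype=float)) * (T - 1)).astype(int)
    spikes = np.zeros((T, len(x)), dtype=int)
    spikes[t, np.arange(len(x))] = 1
    return spikes

# A minimal "zoo": in a hybrid framework, different layers or inputs
# could each be assigned the scheme that suits them best.
coding_zoo = {"rate": rate_encode, "latency": latency_encode}
```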
1 code implementation • 10 Oct 2022 • Qu Yang, Jibin Wu, Malu Zhang, Yansong Chua, Xinchao Wang, Haizhou Li
The LTL rule follows a teacher-student learning approach, mimicking the intermediate feature representations of a pre-trained ANN.
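In this teacher-student setup, the student SNN is trained layer by layer to reproduce the teacher ANN's activations at the matching layer. A simplified sketch of such a layer-wise mimicking objective, comparing the SNN's firing rate to the ANN's feature map (a plain MSE stand-in; the paper's LTL rule has local learning details omitted here):

```python
import numpy as np

def local_mimic_loss(ann_feat, snn_spikes):
    """Layer-wise feature-mimicking loss.

    ann_feat:  teacher ANN activations for one layer, shape (d,).
    snn_spikes: student SNN binary spike trains, shape (T, d);
                averaging over time gives the firing rate.
    Returns the mean-squared error between rate and activation.
    """
    rate = np.asarray(snn_spikes, dtype=float).mean(axis=0)
    return float(np.mean((np.asarray(ann_feat, dtype=float) - rate) ** 2))
```

Training each layer against a local target like this avoids backpropagating errors through the full spiking network.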
no code implementations • 15 Feb 2019 • Jibin Wu, Yansong Chua, Malu Zhang, Qu Yang, Guoqi Li, Haizhou Li
Deep spiking neural networks (SNNs) support asynchronous event-driven computation and massive parallelism, and demonstrate great potential to improve on the energy efficiency of their synchronous analog counterparts.