no code implementations • 30 Jul 2024 • Zhuo Chen, De Ma, Xiaofei Jin, Qinghui Xing, Ouwen Jin, Xin Du, Shuibing He, Gang Pan
In this paper, we propose an asynchronous architecture for Spiking Neural Networks (SNNs) that eliminates the need for inter-core synchronization, thus enhancing speed and energy efficiency.
1 code implementation • 19 Mar 2024 • ZiMing Wang, Ziling Wang, Huaning Li, Lang Qin, Runhao Jiang, De Ma, Huajin Tang
Spiking Neural Networks (SNNs), which operate on an event-driven paradigm through sparse spike communication, emerge as a natural fit for addressing this challenge.
1 code implementation • 25 Jan 2024 • Wei Guo, Yuqi Zhang, De Ma, Qian Zheng
Recent advances in computer vision have significantly lowered the barriers to artistic creation.
no code implementations • 29 Dec 2023 • De Ma, Xiaofei Jin, Shichun Sun, Yitao Li, Xundong Wu, Youneng Hu, Fangchao Yang, Huajin Tang, Xiaolei Zhu, Peng Lin, Gang Pan
The Darwin3 chip supports up to 2.35 million neurons, making it the largest of its kind in neuron scale.
1 code implementation • 12 Oct 2022 • Lang Feng, Qianhui Liu, Huajin Tang, De Ma, Gang Pan
Spiking neural networks (SNNs) are bio-inspired neural networks with asynchronous, discrete, and sparse characteristics, which have increasingly demonstrated advantages in low energy consumption.
no code implementations • 10 Nov 2018 • Ming Zhang, Nenggan Zheng, De Ma, Gang Pan, Zonghua Gu
A Spiking Neural Network (SNN) can be trained indirectly by first training an Artificial Neural Network (ANN) with the conventional backpropagation algorithm, then converting it into an SNN.
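As a rough illustration of this indirect route, here is a minimal rate-based conversion sketch (a simplified assumption on my part, not the paper's actual algorithm): the weights of a trained ReLU ANN are reused by integrate-and-fire neurons, and spike rates accumulated over T timesteps approximate the ReLU activations.

```python
import numpy as np

def ann_forward(x, weights):
    """Forward pass of a trained ReLU ANN (weights: list of 2-D arrays)."""
    for W in weights:
        x = np.maximum(0.0, x @ W)
    return x

def snn_forward(x, weights, T=500, v_th=1.0):
    """Approximate the same network with integrate-and-fire neurons over T steps."""
    mems = [np.zeros(W.shape[1]) for W in weights]    # membrane potentials
    counts = [np.zeros(W.shape[1]) for W in weights]  # spike counters
    for _ in range(T):
        inp = x                                       # constant-rate input encoding
        for l, W in enumerate(weights):
            mems[l] += inp @ W                        # integrate weighted input
            spikes = (mems[l] >= v_th).astype(float)  # fire when threshold is reached
            mems[l] -= spikes * v_th                  # soft reset: subtract threshold
            counts[l] += spikes
            inp = spikes                              # spikes drive the next layer
    return counts[-1] / T * v_th                      # spike rate ~ ReLU activation

rng = np.random.default_rng(0)
weights = [rng.uniform(0, 0.5, (4, 8)), rng.uniform(0, 0.5, (8, 3))]  # stand-in "trained" weights
x = rng.uniform(0, 1, 4)
print(ann_forward(x, weights))   # ANN reference output
print(snn_forward(x, weights))   # SNN rate-coded approximation
```

With non-negative weights and constant-rate input encoding, the accumulated spike rates converge toward the ReLU activations as T grows; real conversion pipelines additionally rebalance weights or thresholds layer by layer to tighten this approximation.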
no code implementations • 22 Oct 2018 • Qingquan Li, Qin Zou, De Ma, Qian Wang, Song Wang
Cultural heritage is a shared asset of all the peoples of the world.