Search Results for author: Shikuang Deng

Found 6 papers, 4 papers with code

Converting Artificial Neural Networks to Spiking Neural Networks via Parameter Calibration

1 code implementation · 6 May 2022 · Yuhang Li, Shikuang Deng, Xin Dong, Shi Gu

We demonstrate that our method can handle SNN conversion with batch normalization layers and effectively preserve high accuracy even with only 32 time steps.

Temporal Efficient Training of Spiking Neural Network via Gradient Re-weighting

1 code implementation · ICLR 2022 · Shikuang Deng, Yuhang Li, Shanghang Zhang, Shi Gu

Then we introduce the temporal efficient training (TET) approach to compensate for the loss of momentum in gradient descent with surrogate gradients (SG), so that the training process can converge to flatter minima with better generalizability.
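The core idea of TET is to supervise the network at every simulation time step instead of only on the time-averaged output. A minimal sketch of that objective, in plain Python with hypothetical helper names (the paper's full loss also includes a regularization term, omitted here):

```python
import math

def cross_entropy(logits, label):
    # Numerically stable cross-entropy for a single example.
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(z - m) for z in logits))
    return log_sum - logits[label]

def tet_loss(outputs_per_step, label):
    """TET-style objective (sketch): average the cross-entropy over every
    simulation time step. `outputs_per_step` is a list of logit vectors,
    one per time step."""
    T = len(outputs_per_step)
    return sum(cross_entropy(o, label) for o in outputs_per_step) / T

def standard_loss(outputs_per_step, label):
    # Conventional baseline: cross-entropy on the time-averaged logits.
    T = len(outputs_per_step)
    n = len(outputs_per_step[0])
    avg = [sum(o[i] for o in outputs_per_step) / T for i in range(n)]
    return cross_entropy(avg, label)
```

Since cross-entropy is convex in the logits, the per-step average is never smaller than the loss of the averaged output, which is why it penalizes time steps whose instantaneous output is wrong even when the average looks correct.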

Control Theory Illustrates the Energy Efficiency in the Dynamic Reconfiguration of Functional Connectivity

no code implementations · 7 Jan 2022 · Shikuang Deng, Jingwei Li, B. T. Thomas Yeo, Shi Gu

The brain's functional connectivity fluctuates over time rather than remaining in a stationary mode, even during the resting state.

Differentiable Spike: Rethinking Gradient-Descent for Training Spiking Neural Networks

no code implementations · NeurIPS 2021 · Yuhang Li, Yufei Guo, Shanghang Zhang, Shikuang Deng, Yongqing Hai, Shi Gu

Based on the introduced finite difference gradient, we propose a new family of Differentiable Spike (Dspike) functions that can adaptively evolve during training to find the optimal shape and smoothness for gradient estimation.
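The role of such a surrogate is to replace the spike function's derivative (zero almost everywhere) during backpropagation. One common tanh-based parameterization, shown here as an illustrative sketch rather than the paper's exact Dspike family, uses a temperature `b` that controls the shape and smoothness of the estimator:

```python
import math

def dspike(x, b):
    """Tanh-based surrogate spike function (sketch): a smooth stand-in for
    the Heaviside step on [0, 1]. Larger temperature `b` makes it sharper,
    approaching a hard step at the threshold x = 0.5."""
    return 0.5 * math.tanh(b * (x - 0.5)) / math.tanh(0.5 * b) + 0.5

def dspike_grad(x, b):
    # Analytic derivative of dspike, used in place of the true spike
    # function's derivative when estimating gradients during training.
    t = math.tanh(b * (x - 0.5))
    return 0.5 * b * (1.0 - t * t) / math.tanh(0.5 * b)
```

Letting `b` evolve during training is what allows the gradient estimator to adapt its shape and smoothness, as described above.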

Event data classification · Image Classification

A Free Lunch From ANN: Towards Efficient, Accurate Spiking Neural Networks Calibration

1 code implementation · 13 Jun 2021 · Yuhang Li, Shikuang Deng, Xin Dong, Ruihao Gong, Shi Gu

Moreover, our calibration algorithm can produce SNNs with state-of-the-art architectures, including MobileNet and RegNet, on the large-scale ImageNet dataset.

Optimal Conversion of Conventional Artificial Neural Networks to Spiking Neural Networks

1 code implementation · ICLR 2021 · Shikuang Deng, Shi Gu

As an alternative, many efforts have been devoted to converting conventional ANNs into SNNs by copying the weights from ANNs and adjusting the spiking threshold potential of neurons in SNNs.
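A common way to realize the threshold-adjustment step is to set each layer's firing threshold from the distribution of that layer's ANN activations on calibration data. The sketch below assumes this threshold-balancing variant (a high percentile is often used instead of the raw maximum for robustness to outliers); the function name and input layout are illustrative, not from the paper:

```python
def choose_thresholds(layer_activations, percentile=0.999):
    """Threshold-balancing sketch for ANN->SNN conversion: the weights are
    copied unchanged, and each layer's spiking threshold is chosen as a
    high percentile of that layer's ANN activation values recorded on a
    calibration set. `layer_activations` is a list of activation-value
    lists, one per layer; returns one threshold per layer."""
    thresholds = []
    for acts in layer_activations:
        ordered = sorted(acts)
        idx = min(int(percentile * len(ordered)), len(ordered) - 1)
        thresholds.append(ordered[idx])
    return thresholds
```

Scaling the threshold this way keeps each spiking layer's firing rate roughly proportional to the corresponding ANN activation, which is the property the copied weights rely on.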
