no code implementations • 29 Mar 2024 • Duzhen Zhang, Qingyu Wang, Tielin Zhang, Bo Xu
Diverging from the conventional direct linear weighted sum, the BPT-SAN models the local nonlinearities of dendritic trees within the inter-layer connections.
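The abstract contrasts a plain linear weighted sum with dendritic-tree nonlinearities in the inter-layer connections. As a minimal sketch of that idea (not the paper's exact BPT-SAN formulation — the branch grouping and `tanh` nonlinearity here are illustrative assumptions), presynaptic inputs can be split into branches that each apply a local nonlinearity before summation at the soma:

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_sum(x, w):
    # Conventional inter-layer connection: direct linear weighted sum.
    return x @ w

def dendritic_sum(x, w, n_branches=4):
    # Hypothetical dendritic-tree variant: presynaptic inputs are split
    # into branches, each branch passes through a local nonlinearity
    # (tanh here, as an assumption), and branch outputs sum at the soma.
    n_in, n_out = w.shape
    branches = np.array_split(np.arange(n_in), n_branches)
    soma = np.zeros((x.shape[0], n_out))
    for idx in branches:
        soma += np.tanh(x[:, idx] @ w[idx, :])  # local branch nonlinearity
    return soma

x = rng.standard_normal((2, 8))   # batch of 2 inputs, 8 presynaptic neurons
w = rng.standard_normal((8, 3))   # weights to 3 postsynaptic neurons
y_lin = linear_sum(x, w)
y_den = dendritic_sum(x, w)
print(y_lin.shape, y_den.shape)   # both (2, 3)
```

Because each branch saturates independently, the dendritic variant responds differently from the purely linear sum even with identical weights.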
no code implementations • 21 Nov 2023 • Xuanle Zhao, Yue Sun, Tielin Zhang, Bo Xu
One of the most notable methods is the Fourier Neural Operator (FNO), which is inspired by the Green's function method and approximates the operator kernel directly in the frequency domain.
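The core FNO step of approximating the kernel in the frequency domain can be sketched in one dimension: transform the input function to Fourier space, multiply a learned complex weight onto the lowest modes, discard the rest, and transform back. This is a simplified single-channel sketch, not the full FNO layer (which also includes a pointwise linear path and multiple channels):

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_conv_1d(u, weights, n_modes):
    # FNO-style spectral convolution: keep only the lowest n_modes
    # frequencies, scale them by learned complex weights, zero the rest,
    # and return to the spatial domain.
    u_hat = np.fft.rfft(u)
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = weights * u_hat[:n_modes]
    return np.fft.irfft(out_hat, n=len(u))

n, n_modes = 64, 8
grid = np.arange(n) / n
u = np.sin(2 * np.pi * grid)                       # sample input function
weights = rng.standard_normal(n_modes) + 1j * rng.standard_normal(n_modes)
v = spectral_conv_1d(u, weights, n_modes)
print(v.shape)  # (64,)
```

Truncating to a fixed number of modes is what makes the learned kernel resolution-independent: the same `weights` apply regardless of the input grid size.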
no code implementations • 2 Aug 2023 • Qingyu Wang, Duzhen Zhang, Tielin Zhang, Bo Xu
The results indicate that compared to the SOTA Spikformer with SSA, Spikformer with LT achieves higher Top-1 accuracy on neuromorphic datasets (i.e., CIFAR10-DVS and DVS128 Gesture) and comparable Top-1 accuracy on static datasets (i.e., CIFAR-10 and CIFAR-100).
no code implementations • 21 Jun 2023 • Jinye Qu, Zeyu Gao, Tielin Zhang, YanFeng Lu, Huajin Tang, Hong Qiao
We also present an SNN-based ultra-low-latency and highly accurate object detection model (SUHD) that achieves state-of-the-art performance on nontrivial datasets such as PASCAL VOC and MS COCO, with a remarkable ~750x fewer timesteps and a 30% mean average precision (mAP) improvement over Spiking-YOLO on MS COCO.
no code implementations • 10 May 2023 • Xiyun Li, Ziyi Ni, Jingqing Ruan, Linghui Meng, Jing Shi, Tielin Zhang, Bo Xu
Inspired by this two-step psychology theory, we propose a biologically plausible mixture-of-personality (MoP) improved spiking actor network (SAN): a determinantal point process is used to simulate the complex formation and integration of different personality types in the MoP, while dynamic and spiking neurons are incorporated into the SAN for efficient reinforcement learning.
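The determinantal point process mentioned above assigns each subset of items a probability proportional to the determinant of a similarity-kernel submatrix, which suppresses redundant (highly similar) selections. A minimal sketch of that diversity principle, with a toy hand-made kernel (the kernel values and item labels are illustrative only, not from the paper):

```python
import numpy as np

def dpp_score(L, subset):
    # Unnormalized DPP probability of selecting `subset`:
    # P(S) is proportional to det(L_S), the determinant of the
    # kernel L restricted to the rows/columns in S.
    idx = np.array(subset)
    return np.linalg.det(L[np.ix_(idx, idx)])

# Toy similarity kernel over 3 items: items 0 and 1 are nearly
# identical, item 2 is dissimilar to both.
L = np.array([[1.0, 0.9, 0.1],
              [0.9, 1.0, 0.1],
              [0.1, 0.1, 1.0]])

redundant = dpp_score(L, [0, 1])  # near-duplicate pair -> small score
diverse = dpp_score(L, [0, 2])    # dissimilar pair -> larger score
print(redundant, diverse)
```

This is why a DPP is a natural tool for forming a mixture of *distinct* personality types: subsets of mutually similar components are exponentially unlikely to co-occur.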
1 code implementation • 2 Feb 2023 • Minglun Han, Qingyu Wang, Tielin Zhang, Yi Wang, Duzhen Zhang, Bo Xu
The spiking neural network (SNN) with leaky integrate-and-fire (LIF) neurons is commonly used in automatic speech recognition (ASR) tasks.
Automatic Speech Recognition (ASR) +1
1 code implementation • 29 Dec 2022 • Duzhen Zhang, Tielin Zhang, Shuncheng Jia, Qingyu Wang, Bo Xu
Learning from interaction is the primary way biological agents come to know their environment and themselves.
1 code implementation • 12 Nov 2022 • Shuncheng Jia, Tielin Zhang, Ruichen Zuo, Bo Xu
Here, we propose a Motif-topology improved SNN (M-SNN) for efficient multi-sensory integration and simulation of cognitive phenomena.
1 code implementation • 11 Feb 2022 • Shuncheng Jia, Ruichen Zuo, Tielin Zhang, Hongxing Liu, Bo Xu
Network architectures and learning principles are key in forming complex functions in artificial neural networks (ANNs) and spiking neural networks (SNNs).
no code implementations • 15 Jun 2021 • Duzhen Zhang, Tielin Zhang, Shuncheng Jia, Xiang Cheng, Bo Xu
Based on a hybrid learning framework, where a spiking actor network infers actions from states and a deep critic network evaluates the actor, we propose a Population-coding and Dynamic-neurons improved Spiking Actor Network (PDSAN) for efficient state representation at two scales: input coding and neuronal coding.
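Population coding at the input scale typically means representing a scalar state value by the graded activity of many neurons with overlapping tuning curves. A minimal sketch with Gaussian receptive fields tiling the state range (the neuron count and tuning widths are illustrative assumptions, not PDSAN's exact scheme):

```python
import numpy as np

def population_encode(x, n_neurons=10, x_min=-1.0, x_max=1.0):
    # Encode a scalar state value with a population of neurons whose
    # Gaussian receptive fields tile [x_min, x_max]; each neuron's
    # activity reflects how close x lies to its preferred value.
    centers = np.linspace(x_min, x_max, n_neurons)
    sigma = (x_max - x_min) / (n_neurons - 1)
    return np.exp(-0.5 * ((x - centers) / sigma) ** 2)

act = population_encode(0.3)
print(act.round(2))          # a bump of activity centered near x = 0.3
print(int(np.argmax(act)))   # index of the most active neuron
```

Compared with feeding the raw scalar to a single neuron, the population gives the downstream spiking network a richer, smoother representation of the continuous state.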
no code implementations • 23 Oct 2020 • Yinqian Sun, Yi Zeng, Tielin Zhang
Despite advances in artificial intelligence models, neural networks still cannot match human performance, partly due to differences in how information is encoded and processed compared to the human brain.
1 code implementation • 9 Oct 2020 • Tielin Zhang, Shuncheng Jia, Xiang Cheng, Bo Xu
The performance of the proposed BRP-SNN is further verified on spatial (MNIST and CIFAR-10) and temporal (TIDigits and DvsGesture) tasks, where the SNN using BRP reaches accuracy comparable to other state-of-the-art BP-based SNNs while saving about 50% of the computational cost relative to ANNs.
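The contrast with BP-based SNNs is that a reward-propagation scheme broadcasts a global error/reward signal directly to hidden layers instead of differentiating layer by layer. The sketch below shows that broadcasting idea in the spirit of such methods; the fixed random projection `B` and the tanh-derivative local term are illustrative assumptions, not the paper's exact BRP update rule:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 4, 6, 2
x = rng.standard_normal(n_in)                  # input sample
W = rng.standard_normal((n_in, n_hidden)) * 0.1
B = rng.standard_normal((n_out, n_hidden))     # fixed random broadcast matrix

h = np.tanh(x @ W)                             # hidden activity
err = rng.standard_normal(n_out)               # top-level error/reward signal

# Broadcast the global error straight to the hidden layer through B,
# then form a local, backprop-free weight update (illustrative rule).
local_err = err @ B
dW = -0.01 * np.outer(x, local_err * (1 - h ** 2))
W += dW
print(dW.shape)  # (4, 6)
```

Because `B` is fixed and random, no error gradients need to flow backward through the network's own weights, which is what makes such schemes more biologically plausible and cheaper than full BP.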
no code implementations • 7 Oct 2020 • Xiang Cheng, Tielin Zhang, Shuncheng Jia, Bo Xu
Spiking Neural Networks (SNNs) incorporate more biologically plausible structures and learning principles, and hence play a critical role in bridging the gap between artificial and natural neural networks.