no code implementations • 15 Nov 2023 • Boxun Xu, Hejia Geng, Yuxuan Yin, Peng Li
We introduce DISTA, a Denoising Spiking Transformer with Intrinsic Plasticity and SpatioTemporal Attention, designed to maximize the spatiotemporal computational prowess of spiking neurons, particularly for vision applications.
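One illustrative reading of "spatiotemporal attention" is attention computed jointly over spatial tokens and time steps of a binary spike tensor. The minimal sketch below assumes that reading; the spike tensor, weight matrices, and flattening scheme are all hypothetical stand-ins, not the DISTA formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

T, N, D = 4, 8, 16  # time steps, spatial tokens, embedding dim (illustrative sizes)

# Hypothetical binary spike tensor standing in for spiking-neuron activations.
spikes = (rng.random((T, N, D)) < 0.2).astype(np.float32)

# Flatten time and space so a single attention pass spans both axes jointly.
tokens = spikes.reshape(T * N, D)

# Random projections as placeholders for learned query/key/value weights.
Wq = rng.normal(size=(D, D)).astype(np.float32) / np.sqrt(D)
Wk = rng.normal(size=(D, D)).astype(np.float32) / np.sqrt(D)
Wv = rng.normal(size=(D, D)).astype(np.float32) / np.sqrt(D)

Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv

# Scaled dot-product attention with a numerically stable softmax.
scores = Q @ K.T / np.sqrt(D)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

out = weights @ V  # (T*N, D): each spike token attends across space AND time
print(out.shape)
```

Because the T and N axes are flattened together, a spike at one location and time step can attend directly to spikes at other locations and other time steps, which is the basic property a spatiotemporal attention mechanism provides over purely spatial attention.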
no code implementations • 30 Sep 2023 • Hejia Geng, Boxun Xu, Peng Li
Large Language Models (LLMs) have demonstrated impressive inferential capabilities, with numerous research endeavors devoted to enhancing this capacity through prompting.
no code implementations • 20 Aug 2023 • Hejia Geng, Peng Li
Spiking neural networks (SNNs) offer promise for efficient and powerful neurally inspired computation.
no code implementations • 4 Aug 2021 • Wenrui Zhang, Hejia Geng, Peng Li
The small size of the motifs and the sparse inter-motif connectivity lead to an RSNN architecture scalable to large network sizes.
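The scalability claim can be illustrated with a toy connectivity mask: dense connections inside small motifs on the block diagonal, plus sparse random connections between motifs. The function name, motif sizes, and inter-motif probability below are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def motif_rsnn_mask(n_motifs, motif_size, p_inter=0.01):
    """Hypothetical recurrent connectivity mask: dense small motifs on
    the block diagonal, sparse random inter-motif links elsewhere."""
    n = n_motifs * motif_size
    mask = np.zeros((n, n), dtype=bool)
    for m in range(n_motifs):
        lo = m * motif_size
        mask[lo:lo + motif_size, lo:lo + motif_size] = True  # intra-motif: dense
    inter = rng.random((n, n)) < p_inter                      # inter-motif: sparse
    return mask | inter

# Overall density falls as the network grows: the dense part is confined
# to fixed-size blocks, so only the sparse inter-motif term scales with n^2.
for n_motifs in (10, 100):
    mask = motif_rsnn_mask(n_motifs, motif_size=16)
    print(n_motifs, round(float(mask.mean()), 3))
```

With 16-neuron motifs, the dense intra-motif portion contributes a fraction of connections that shrinks as 1/n_motifs, which is one concrete way small motifs plus sparse inter-motif wiring keep the total connection count manageable at large network sizes.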