Search Results for author: Huajin Tang

Found 31 papers, 6 papers with code

GRSN: Gated Recurrent Spiking Neurons for POMDPs and MARL

no code implementations24 Apr 2024 Lang Qin, ZiMing Wang, Runhao Jiang, Rui Yan, Huajin Tang

Spiking neural networks (SNNs) are widely applied in various fields due to their energy efficiency and fast inference capabilities.

Reinforcement Learning (RL)

EAS-SNN: End-to-End Adaptive Sampling and Representation for Event-based Detection with Recurrent Spiking Neural Networks

no code implementations19 Mar 2024 ZiMing Wang, Ziling Wang, Huaning Li, Lang Qin, Runhao Jiang, De Ma, Huajin Tang

Event cameras, with their high dynamic range and temporal resolution, are ideally suited for object detection, especially under scenarios with motion blur and challenging lighting conditions.

Object Detection

Enhancing Adaptive History Reserving by Spiking Convolutional Block Attention Module in Recurrent Neural Networks

no code implementations NeurIPS 2023 Qi Xu, Yuyuan Gao, Jiangrong Shen, Yaxin Li, Xuming Ran, Huajin Tang, Gang Pan

Spiking neural networks (SNNs) are an efficient class of models for processing spatio-temporal patterns in time series, such as the Address-Event Representation data collected from Dynamic Vision Sensors (DVS).

Time Series

Deep Pulse-Coupled Neural Networks

no code implementations24 Dec 2023 Zexiang Yi, Jing Lian, Yunliang Qi, Zhaofei Yu, Huajin Tang, Yide Ma, Jizhao Liu

In this work, we leverage a more biologically plausible neural model with complex dynamics, i.e., a pulse-coupled neural network (PCNN), to improve the expressiveness and recognition performance of SNNs for vision tasks.

Emergence and reconfiguration of modular structure for synaptic neural networks during continual familiarity detection

no code implementations10 Nov 2023 Shi Gu, Marcelo G Mattar, Huajin Tang, Gang Pan

While advances in artificial intelligence and neuroscience have enabled the emergence of neural networks capable of learning a wide variety of tasks, our understanding of the temporal dynamics of these networks remains limited.

Neuromorphic Auditory Perception by Neural Spiketrum

no code implementations11 Sep 2023 Huajin Tang, Pengjie Gu, Jayawan Wijekoon, MHD Anas Alsakkal, ZiMing Wang, Jiangrong Shen, Rui Yan

Neuromorphic computing holds the promise to achieve the energy efficiency and robust learning performance of biological neural systems.

Unleashing the Potential of Spiking Neural Networks for Sequential Modeling with Contextual Embedding

no code implementations29 Aug 2023 Xinyi Chen, Jibin Wu, Huajin Tang, Qinyuan Ren, Kay Chen Tan

The human brain exhibits remarkable abilities in integrating temporally distant sensory inputs for decision-making.

Decision Making

Mitigating Communication Costs in Neural Networks: The Role of Dendritic Nonlinearity

no code implementations21 Jun 2023 Xundong Wu, Pengfei Zhao, Zilin Yu, Lei Ma, Ka-Wa Yip, Huajin Tang, Gang Pan, Tiejun Huang

Our comprehension of biological neuronal networks has profoundly influenced the evolution of artificial neural networks (ANNs).

Temporal Conditioning Spiking Latent Variable Models of the Neural Response to Natural Visual Scenes

no code implementations NeurIPS 2023 Gehua Ma, Runhao Jiang, Rui Yan, Huajin Tang

This work presents the temporal conditioning spiking latent variable models (TeCoS-LVM) to simulate the neural response to natural visual stimuli.

Spiking Neural Network for Ultra-low-latency and High-accurate Object Detection

no code implementations21 Jun 2023 Jinye Qu, Zeyu Gao, Tielin Zhang, YanFeng Lu, Huajin Tang, Hong Qiao

We also present an SNN-based ultra-low-latency and highly accurate object detection model (SUHD) that achieves state-of-the-art performance on nontrivial datasets such as PASCAL VOC and MS COCO, with a remarkable ~750x fewer timesteps and a 30% mean average precision (mAP) improvement compared to Spiking-YOLO on the MS COCO dataset.

Object, Object Detection +1

ESL-SNNs: An Evolutionary Structure Learning Strategy for Spiking Neural Networks

no code implementations6 Jun 2023 Jiangrong Shen, Qi Xu, Jian K. Liu, Yueming Wang, Gang Pan, Huajin Tang

To take full advantage of their low power consumption and further improve the efficiency of these models, pruning methods have been explored to find sparse SNNs without redundant connections after training.

Exploiting Noise as a Resource for Computation and Learning in Spiking Neural Networks

1 code implementation25 May 2023 Gehua Ma, Rui Yan, Huajin Tang

Despite extensive research on spiking neural networks (SNNs), most studies are established on deterministic models, overlooking the inherent non-deterministic, noisy nature of neural computations.

Biologically inspired structure learning with reverse knowledge distillation for spiking neural networks

no code implementations19 Apr 2023 Qi Xu, Yaxin Li, Xuanye Fang, Jiangrong Shen, Jian K. Liu, Huajin Tang, Gang Pan

The proposed method explores a novel dynamic approach to structure learning from scratch in SNNs, which could build a bridge to close the gap between deep learning and bio-inspired neural dynamics.

Knowledge Distillation

Constructing Deep Spiking Neural Networks from Artificial Neural Networks with Knowledge Distillation

no code implementations CVPR 2023 Qi Xu, Yaxin Li, Jiangrong Shen, Jian K Liu, Huajin Tang, Gang Pan

Spiking neural networks (SNNs) are well known as brain-inspired models with high computing efficiency, owing to a key feature: they use spikes as information units, much like biological neural systems.

Knowledge Distillation

A Low Latency Adaptive Coding Spiking Framework for Deep Reinforcement Learning

1 code implementation21 Nov 2022 Lang Qin, Rui Yan, Huajin Tang

In recent years, spiking neural networks (SNNs) have been used in reinforcement learning (RL) due to their low power consumption and event-driven features.

Offline RL, Reinforcement Learning +1

Multi-Level Firing with Spiking DS-ResNet: Enabling Better and Deeper Directly-Trained Spiking Neural Networks

1 code implementation12 Oct 2022 Lang Feng, Qianhui Liu, Huajin Tang, De Ma, Gang Pan

Spiking neural networks (SNNs) are bio-inspired neural networks with asynchronous, discrete, and sparse characteristics, which have increasingly demonstrated their superiority in low energy consumption.

SPAIC: A Spike-based Artificial Intelligence Computing Framework

1 code implementation26 Jul 2022 Chaofei Hong, Mengwen Yuan, Mengxiao Zhang, Xiao Wang, Chegnjun Zhang, Jiaxin Wang, Gang Pan, Zhaohui Wu, Huajin Tang

In this work, we present a Python-based spiking neural network (SNN) simulation and training framework, named SPAIC, which aims to support brain-inspired model and algorithm research, integrating features from both deep learning and neuroscience.

Towards Lossless ANN-SNN Conversion under Ultra-Low Latency with Dual-Phase Optimization

1 code implementation16 May 2022 ZiMing Wang, Shuang Lian, Yuhao Zhang, Xiaoxin Cui, Rui Yan, Huajin Tang

Evaluations on challenging datasets including CIFAR-10, CIFAR-100, and ImageNet show that the proposed method achieves state-of-the-art performance in terms of accuracy, latency, and energy preservation.

Object Detection +1

Human-Level Control through Directly-Trained Deep Spiking Q-Networks

1 code implementation13 Dec 2021 Guisong Liu, Wenjie Deng, Xiurui Xie, Li Huang, Huajin Tang

Specifically, we propose a directly-trained deep spiking reinforcement learning architecture based on the Leaky Integrate-and-Fire (LIF) neurons and Deep Q-Network (DQN).
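The Leaky Integrate-and-Fire dynamics at the core of such architectures can be sketched in a few lines (a generic discrete-time LIF simulation for illustration, not the paper's implementation; `tau`, `v_th`, and the hard-reset rule are illustrative defaults):

```python
import numpy as np

def lif_simulate(input_current, tau=20.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """Simulate a single leaky integrate-and-fire (LIF) neuron.

    Discrete-time dynamics: v[t+1] = v[t] + (dt/tau) * (-v[t] + I[t]);
    a spike is emitted when v crosses v_th, after which v is hard-reset.
    """
    v = v_reset
    spikes = []
    for i_t in input_current:
        v = v + (dt / tau) * (-v + i_t)  # leaky integration toward the input
        if v >= v_th:                    # threshold crossing -> spike
            spikes.append(1)
            v = v_reset                  # hard reset after firing
        else:
            spikes.append(0)
    return np.array(spikes)

# A constant suprathreshold input yields a regular spike train:
# with these defaults the neuron fires every 14 steps (7 spikes in 100).
spike_train = lif_simulate(np.full(100, 2.0))
print(int(spike_train.sum()))
```

In a deep spiking Q-network, layers of such neurons replace the ReLU units of a standard DQN, and the spike trains (rather than continuous activations) carry the Q-value information forward.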

Atari Games, Reinforcement Learning +1

Indoor Lighting Estimation Using an Event Camera

no code implementations CVPR 2021 Zehao Chen, Qian Zheng, Peisong Niu, Huajin Tang, Gang Pan

Image-based methods for indoor lighting estimation suffer from the problem of intensity-distance ambiguity.

Lighting Estimation

Towards Efficient Processing and Learning with Spikes: New Approaches for Multi-Spike Learning

no code implementations2 May 2020 Qiang Yu, Shenglan Li, Huajin Tang, Longbiao Wang, Jianwu Dang, Kay Chen Tan

They are also believed to play an essential role in the low power consumption of biological systems, whose efficiency is attracting increasing attention in the field of neuromorphic computing.

Effective AER Object Classification Using Segmented Probability-Maximization Learning in Spiking Neural Networks

no code implementations14 Feb 2020 Qianhui Liu, Haibo Ruan, Dong Xing, Huajin Tang, Gang Pan

Address event representation (AER) cameras have recently attracted increasing attention due to their advantages of high temporal resolution and low power consumption compared with traditional frame-based cameras.

General Classification, Single Particle Analysis

Robust Environmental Sound Recognition with Sparse Key-point Encoding and Efficient Multi-spike Learning

no code implementations4 Feb 2019 Qiang Yu, Yanli Yao, Longbiao Wang, Huajin Tang, Jianwu Dang, Kay Chen Tan

Our framework is a unified system consistently integrating three major functional parts: sparse encoding, efficient learning, and robust readout.

Decision Making

Spiking Deep Residual Network

no code implementations28 Apr 2018 Yangfan Hu, Huajin Tang, Gang Pan

SNNs theoretically have at least the same computational power as traditional artificial neural networks (ANNs).

Connections Between Nuclear Norm and Frobenius Norm Based Representations

no code implementations26 Feb 2015 Xi Peng, Can-Yi Lu, Zhang Yi, Huajin Tang

Many works have shown that Frobenius-norm based representation (FNR) is competitive with sparse representation and nuclear-norm based representation (NNR) in numerous tasks such as subspace clustering.
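For context, both representations are commonly written as self-expressive models of a data matrix X; the following is a sketch of the standard noise-free formulations, and the paper's exact constraints and noise terms may differ:

```latex
% FNR: minimize the Frobenius norm of the coefficient matrix Z
\min_{Z} \ \|Z\|_F^2 \quad \text{s.t.} \quad X = XZ
% NNR: minimize the nuclear norm (sum of singular values) of Z
\min_{Z} \ \|Z\|_*   \quad \text{s.t.} \quad X = XZ
```

The coefficient matrix Z is then typically used as an affinity for spectral clustering; the choice of norm determines which structure (dense but small coefficients vs. low rank) the representation favors.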

Clustering

Fast Low-rank Representation based Spatial Pyramid Matching for Image Classification

no code implementations22 Sep 2014 Xi Peng, Rui Yan, Bo Zhao, Huajin Tang, Zhang Yi

Although these methods achieve a higher recognition rate than the traditional SPM, they take more time to encode the local descriptors extracted from the image.

General Classification, Image Classification +1

A Unified Framework for Representation-based Subspace Clustering of Out-of-sample and Large-scale Data

no code implementations25 Sep 2013 Xi Peng, Huajin Tang, Lei Zhang, Zhang Yi, Shijie Xiao

In this paper, we propose a unified framework which makes representation-based subspace clustering algorithms feasible to cluster both out-of-sample and large-scale data.

Clustering

Constructing the L2-Graph for Robust Subspace Learning and Subspace Clustering

no code implementations5 Sep 2012 Xi Peng, Zhiding Yu, Huajin Tang, Zhang Yi

Under the framework of graph-based learning, the key to robust subspace clustering and subspace learning is to obtain a good similarity graph that eliminates the effects of errors and retains only the connections between data points from the same subspace (i.e., intra-subspace data points).
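The general idea of an L2-regularized self-representation graph can be sketched as follows (an illustrative ridge-regression construction, not the paper's exact L2-Graph procedure; `lam` and the symmetrization step are assumptions):

```python
import numpy as np

def l2_representation_graph(X, lam=0.1):
    """Build a similarity graph from L2-regularized self-representation.

    Each column x_i of X (features x samples) is represented as a linear
    combination of the remaining columns; the coefficient magnitudes are
    used as edge weights between samples.
    """
    d, n = X.shape
    W = np.zeros((n, n))
    for i in range(n):
        idx = [j for j in range(n) if j != i]
        D = X[:, idx]  # dictionary: all samples except x_i
        # ridge regression: c = (D^T D + lam*I)^{-1} D^T x_i
        c = np.linalg.solve(D.T @ D + lam * np.eye(n - 1), D.T @ X[:, i])
        W[i, idx] = np.abs(c)
    return 0.5 * (W + W.T)  # symmetrize for graph-based learning

# Two rank-1 subspaces: intra-subspace edge weights should dominate.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(20, 1)) @ rng.normal(size=(1, 10)),
               rng.normal(size=(20, 1)) @ rng.normal(size=(1, 10))])
W = l2_representation_graph(X)
```

The resulting affinity matrix W is then typically fed to spectral clustering; the L2 penalty keeps each representation cheap to compute in closed form while concentrating weight on intra-subspace connections.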

Clustering, Image Clustering +1
