Search Results for author: Yujie Wu

Found 15 papers, 6 papers with code

Spatio-Temporal Backpropagation for Training High-performance Spiking Neural Networks

1 code implementation 8 Jun 2017 Yujie Wu, Lei Deng, Guoqi Li, Jun Zhu, Luping Shi

By simultaneously considering the layer-by-layer spatial domain (SD) and the timing-dependent temporal domain (TD) in the training phase, as well as an approximated derivative for the spike activity, we propose a spatio-temporal backpropagation (STBP) training framework without using any complicated technology.

Object Detection +1
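
The "approximated derivative for the spike activity" mentioned in the abstract is typically a rectangular surrogate gradient around the firing threshold, since the true spike function has zero derivative almost everywhere. A minimal sketch of that idea (not the authors' exact code; the threshold and window width are illustrative assumptions):

```python
import numpy as np

def spike(v, v_th=1.0):
    """Heaviside spike activation: fire when membrane potential crosses threshold."""
    return (v >= v_th).astype(np.float32)

def surrogate_grad(v, v_th=1.0, width=1.0):
    """Rectangular surrogate for the spike derivative: nonzero only in a
    window of `width` around the threshold, so gradients can flow in BPTT."""
    return (np.abs(v - v_th) < width / 2).astype(np.float32) / width

v = np.array([0.2, 0.9, 1.1, 2.0], dtype=np.float32)
print(spike(v))           # [0. 0. 1. 1.]
print(surrogate_grad(v))  # [0. 1. 1. 0.]
```

During the backward pass, the surrogate replaces the non-differentiable spike step, which is what lets STBP propagate errors through both the spatial (layer) and temporal (time-step) dimensions.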

Direct Training for Spiking Neural Networks: Faster, Larger, Better

no code implementations 16 Sep 2018 Yujie Wu, Lei Deng, Guoqi Li, Jun Zhu, Luping Shi

Spiking neural networks (SNNs), which enable energy-efficient implementation on emerging neuromorphic hardware, are gaining increasing attention.

DashNet: A Hybrid Artificial and Spiking Neural Network for High-speed Object Tracking

no code implementations 15 Sep 2019 Zheyu Yang, Yujie Wu, Guanrui Wang, Yukuan Yang, Guoqi Li, Lei Deng, Jun Zhu, Luping Shi

To the best of our knowledge, DashNet is the first framework that can integrate and process ANNs and SNNs in a hybrid paradigm, which provides a novel solution to achieve both effectiveness and efficiency for high-speed object tracking.

Object Tracking Open-Ended Question Answering

Comprehensive SNN Compression Using ADMM Optimization and Activity Regularization

1 code implementation 3 Nov 2019 Lei Deng, Yujie Wu, Yifan Hu, Ling Liang, Guoqi Li, Xing Hu, Yufei Ding, Peng Li, Yuan Xie

As is well known, the huge memory and compute costs of both artificial neural networks (ANNs) and spiking neural networks (SNNs) greatly hinder their efficient deployment on edge devices.

Model Compression Quantization

Exploring Adversarial Attack in Spiking Neural Networks with Spike-Compatible Gradient

no code implementations 1 Jan 2020 Ling Liang, Xing Hu, Lei Deng, Yujie Wu, Guoqi Li, Yufei Ding, Peng Li, Yuan Xie

Recently, learning algorithms inspired by backpropagation through time have been widely introduced into SNNs to improve performance, which makes it possible to attack these models accurately given their spatio-temporal gradient maps.

Adversarial Attack
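
Gradient-based attacks built on such gradient maps follow the standard fast-gradient-sign pattern; a generic sketch (not the paper's spike-compatible variant), assuming the gradient of the loss with respect to the input is already available:

```python
import numpy as np

def fgsm_perturb(x, grad, epsilon=0.03):
    """Fast-gradient-sign perturbation: step in the direction that
    increases the loss, then clip back to a valid input range [0, 1]."""
    x_adv = x + epsilon * np.sign(grad)
    return np.clip(x_adv, 0.0, 1.0)

x = np.array([0.2, 0.5, 0.99])
grad = np.array([0.7, -1.2, 0.4])   # toy input-gradient map
print(fgsm_perturb(x, grad))        # [0.23 0.47 1.  ]
```

The challenge the paper targets is that SNN inputs are binary spike events, so a continuous perturbation like the one above cannot be applied directly; a spike-compatible gradient scheme is needed.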

Comparing SNNs and RNNs on Neuromorphic Vision Datasets: Similarities and Differences

1 code implementation 2 May 2020 Weihua He, Yujie Wu, Lei Deng, Guoqi Li, Haoyu Wang, Yang Tian, Wei Ding, Wenhui Wang, Yuan Xie

Neuromorphic data, which record frameless spike events, have attracted considerable attention for their spatiotemporal information components and event-driven processing style.

Fairness Gesture Recognition

Brain-inspired global-local learning incorporated with neuromorphic computing

no code implementations 5 Jun 2020 Yujie Wu, Rong Zhao, Jun Zhu, Feng Chen, Mingkun Xu, Guoqi Li, Sen Song, Lei Deng, Guanrui Wang, Hao Zheng, Jing Pei, Youhui Zhang, Mingguo Zhao, Luping Shi

We demonstrate the advantages of this model in multiple different tasks, including few-shot learning, continual learning, and fault-tolerance learning in neuromorphic vision sensors.

Continual Learning Few-Shot Learning

Going Deeper With Directly-Trained Larger Spiking Neural Networks

2 code implementations 29 Oct 2020 Hanle Zheng, Yujie Wu, Lei Deng, Yifan Hu, Guoqi Li

To this end, we propose a threshold-dependent batch normalization (tdBN) method based on the emerging spatio-temporal backpropagation, termed "STBP-tdBN", enabling direct training of a very deep SNN and the efficient implementation of its inference on neuromorphic hardware.
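
The core of tdBN is to normalize pre-activations jointly over the batch and the time dimension, rescaling them relative to the firing threshold rather than to unit variance alone. A rough sketch of the idea (learnable gamma/beta omitted; alpha = 1 and a scalar threshold are assumptions, not the paper's exact implementation):

```python
import numpy as np

def td_batch_norm(x, v_th=1.0, alpha=1.0, eps=1e-5):
    """Threshold-dependent BN sketch. x has shape (T, B, C, H, W):
    statistics are computed per channel over time, batch, and space,
    and the result is rescaled to alpha * v_th."""
    mean = x.mean(axis=(0, 1, 3, 4), keepdims=True)
    var = x.var(axis=(0, 1, 3, 4), keepdims=True)
    return alpha * v_th * (x - mean) / np.sqrt(var + eps)

x = np.random.randn(4, 8, 16, 6, 6).astype(np.float32)  # T=4 timesteps
y = td_batch_norm(x)
print(float(y.mean()), float(y.std()))  # approximately 0 and 1
```

Folding the time dimension into the normalization statistics is what distinguishes this from applying ordinary BatchNorm independently at each timestep.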

Exploiting Spiking Dynamics with Spatial-temporal Feature Normalization in Graph Learning

no code implementations 30 Jun 2021 Mingkun Xu, Yujie Wu, Lei Deng, Faqiang Liu, Guoqi Li, Jing Pei

Biological spiking neurons with intrinsic dynamics underlie the powerful representation and learning capabilities of the brain for processing multimodal information in complex environments.

Graph Attention Graph Learning +1

H2Learn: High-Efficiency Learning Accelerator for High-Accuracy Spiking Neural Networks

no code implementations 25 Jul 2021 Ling Liang, Zheng Qu, Zhaodong Chen, Fengbin Tu, Yujie Wu, Lei Deng, Guoqi Li, Peng Li, Yuan Xie

Although spiking neural networks (SNNs) benefit from bio-plausible neural modeling, their low accuracy under common local synaptic plasticity learning rules limits their application in many practical tasks.

Vocal Bursts Intensity Prediction

Advancing Deep Residual Learning by Solving the Crux of Degradation in Spiking Neural Networks

no code implementations 9 Dec 2021 Yifan Hu, Yujie Wu, Lei Deng, Guoqi Li

In this paper, we identify the crux and then propose a novel residual block for SNNs, which is able to significantly extend the depth of directly trained SNNs, e.g., up to 482 layers on CIFAR-10 and 104 layers on ImageNet, without observing any degradation problem.

Advancing Spiking Neural Networks towards Deep Residual Learning

1 code implementation 15 Dec 2021 Yifan Hu, Lei Deng, Yujie Wu, Man Yao, Guoqi Li

Despite the rapid progress of neuromorphic computing, inadequate capacity and insufficient representation power of spiking neural networks (SNNs) severely restrict their application scope in practice.

Hierarchical Vector Quantized Transformer for Multi-class Unsupervised Anomaly Detection

1 code implementation NeurIPS 2023 Ruiying Lu, Yujie Wu, Long Tian, Dongsheng Wang, Bo Chen, Xiyang Liu, Ruimin Hu

First, instead of learning continuous representations, we preserve the typical normal patterns as discrete iconic prototypes, and confirm the importance of Vector Quantization in preventing the model from falling into the shortcut.

Quantization Unsupervised Anomaly Detection
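
The discrete-prototype idea boils down to a vector-quantization codebook lookup: each feature vector is replaced by its nearest learned prototype, so anomalous features that lie far from all prototypes cannot be reconstructed faithfully. A minimal sketch (codebook size and dimensions are illustrative, not the paper's):

```python
import numpy as np

def vector_quantize(features, codebook):
    """Map each feature vector to its nearest prototype in the codebook.
    features: (N, D), codebook: (K, D). Returns quantized features and indices."""
    # squared L2 distance from every feature to every codebook entry
    dists = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    idx = dists.argmin(axis=1)
    return codebook[idx], idx

rng = np.random.default_rng(0)
codebook = rng.normal(size=(16, 8))   # 16 iconic prototypes
feats = rng.normal(size=(4, 8))
quantized, idx = vector_quantize(feats, codebook)
print(quantized.shape, idx.shape)     # (4, 8) (4,)
```

In the anomaly-detection setting, a large residual between a feature and its nearest prototype is the signal that the input deviates from the learned normal patterns.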

Multi-source domain adaptation for regression

no code implementations 9 Dec 2023 Yujie Wu, Giovanni Parmigiani, Boyu Ren

First, we extend a flexible single-source DA algorithm for classification through outcome-coarsening to enable its application to regression problems.

Domain Adaptation Ensemble Learning +1
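
Outcome-coarsening, as the abstract describes it, discretizes the continuous outcome into bins so that a classification-based domain-adaptation method can be reused for regression. A toy sketch using quantile bins (the number of bins and the quantile scheme are assumptions for illustration):

```python
import numpy as np

def coarsen_outcome(y, n_bins=4):
    """Discretize a continuous outcome into quantile bins so a
    classification-based DA method can be applied. Returns integer
    class labels and bin midpoints for decoding back to a value."""
    edges = np.quantile(y, np.linspace(0, 1, n_bins + 1))
    labels = np.clip(np.searchsorted(edges, y, side="right") - 1, 0, n_bins - 1)
    mids = np.array([(edges[i] + edges[i + 1]) / 2 for i in range(n_bins)])
    return labels, mids

y = np.array([0.1, 0.4, 0.9, 1.5, 2.2, 3.0])
labels, mids = coarsen_outcome(y)
print(labels)        # [0 0 1 2 3 3]
print(mids[labels])  # coarse regression estimate decoded from class labels
```

The bin midpoints give one simple way to map a predicted class back to a numeric outcome; the paper's actual decoding scheme may differ.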
