no code implementations • 7 Apr 2024 • Yukun Yang, Naihao Wang, Haixin Yang, Ruirui Li
Building on CSR, this study designs a joint sample-selection strategy and constructs a comprehensive and powerful learning framework called CSR+.
no code implementations • 9 Nov 2023 • Yukun Yang
This report explores the theory that explains the high-sparsity phenomenon (Tosato et al., 2023) observed in the forward-forward algorithm (Hinton, 2022).
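For context, below is a minimal PyTorch sketch of the layer-local goodness objective at the heart of the forward-forward algorithm; the function name `ff_layer_step` and the hyperparameters `theta` and `lr` are illustrative assumptions, not the report's code.

```python
import torch
import torch.nn.functional as F

def ff_layer_step(weight, x_pos, x_neg, theta=2.0, lr=0.03):
    """One layer-local forward-forward update (after Hinton, 2022):
    goodness = sum of squared ReLU activations, pushed above a
    threshold `theta` for positive data and below it for negative data."""
    weight = weight.detach().clone().requires_grad_(True)
    g_pos = F.relu(x_pos @ weight.T).pow(2).sum(dim=1)  # goodness of positive batch
    g_neg = F.relu(x_neg @ weight.T).pow(2).sum(dim=1)  # goodness of negative batch
    # Logistic loss: softplus penalizes positives below theta and negatives above it.
    loss = F.softplus(torch.cat([theta - g_pos, g_neg - theta])).mean()
    loss.backward()
    return (weight - lr * weight.grad).detach()
```

Because each unit's squared activation competes to push the goodness past the threshold, trained layers tend to concentrate activity in a few units; this is the high-sparsity phenomenon the report seeks to explain.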
no code implementations • 1 Dec 2022 • Yukun Yang, Peng Li
Gradient-based first-order adaptive optimization methods such as the Adam optimizer are prevalent in training artificial neural networks, achieving state-of-the-art results.
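For reference, the Adam update itself is compact; below is a minimal NumPy sketch with the bias-corrected moment estimates and default hyperparameters from the original paper (Kingma & Ba, 2015). The function name and state-passing style are illustrative assumptions.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m)
    and its square (v), bias-corrected, set a per-parameter step size."""
    m = b1 * m + (1 - b1) * grad          # first-moment estimate
    v = b2 * v + (1 - b2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - b1 ** t)             # bias correction (t starts at 1)
    v_hat = v / (1 - b2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```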
no code implementations • 15 May 2022 • Yukun Yang, Peng Li
We employ the Hebbian rule operating in local compartments to update synaptic weights and achieve supervised learning in a biologically plausible manner.
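As a point of reference, the classical Hebbian rule in its simplest form is sketched below in NumPy; the paper's compartment-local formulation is richer, so this is only an illustration, with all names assumed.

```python
import numpy as np

def hebbian_update(w, pre, post, eta=0.01):
    """Plain Hebbian rule: strengthen a synapse in proportion to the
    coincidence of pre- and post-synaptic activity (dW = eta * post * pre^T)."""
    return w + eta * np.outer(post, pre)
```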
1 code implementation • 29 Mar 2022 • Jiarun Liu, Daguang Jiang, Yukun Yang, Ruirui Li
The state-of-the-art learning-with-noisy-labels methods Co-teaching and Co-teaching+ confront noisy labels by exchanging information between dual networks.
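For background, the core of Co-teaching (Han et al., 2018) is a small-loss sample exchange between the two networks, sketched below in PyTorch; names such as `coteaching_select` and `forget_rate` are illustrative, and this shows the baseline being improved upon, not this paper's own method.

```python
import torch
import torch.nn.functional as F

def coteaching_select(logits_a, logits_b, labels, forget_rate=0.2):
    """Each network ranks the batch by its own loss, keeps the small-loss
    (likely clean) samples, and hands them to its peer for the update."""
    n_keep = int((1 - forget_rate) * labels.size(0))
    loss_a = F.cross_entropy(logits_a, labels, reduction="none")
    loss_b = F.cross_entropy(logits_b, labels, reduction="none")
    idx_for_b = torch.argsort(loss_a)[:n_keep]  # A's trusted picks train B
    idx_for_a = torch.argsort(loss_b)[:n_keep]  # B's trusted picks train A
    return idx_for_a, idx_for_b
```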
no code implementations • 14 Nov 2021 • Yukun Yang, Peng Li
Our experiments show that the proposed framework achieves learning accuracy comparable to BP-based rules and may provide new insights into how learning is orchestrated in biological systems.
no code implementations • 29 Sep 2021 • Yukun Yang, Peng Li
There is a marked gap between biologically plausible approaches and practical backpropagation-based approaches to training a deep spiking neural network (DSNN) with strong performance.
no code implementations • 22 Jun 2021 • Yukun Yang, Wenrui Zhang, Peng Li
While backpropagation (BP) has been applied to spiking neural networks (SNNs) with encouraging results, a key challenge is backpropagating a continuous-valued loss through layers of spiking neurons whose all-or-none firing activities are discontinuous.
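A common workaround for that discontinuity is a surrogate gradient: the forward pass emits hard spikes, while the backward pass substitutes a smooth pseudo-derivative around the firing threshold. The PyTorch sketch below uses a rectangular surrogate and illustrates the general technique only; it is not necessarily this paper's method.

```python
import torch

class SpikeSurrogate(torch.autograd.Function):
    """Heaviside spike in the forward pass; a rectangular surrogate
    derivative in the backward pass lets a continuous-valued loss
    flow through the all-or-none firing function."""

    @staticmethod
    def forward(ctx, v_minus_threshold):
        ctx.save_for_backward(v_minus_threshold)
        return (v_minus_threshold > 0).float()   # all-or-none spike

    @staticmethod
    def backward(ctx, grad_output):
        (u,) = ctx.saved_tensors
        surrogate = (u.abs() < 0.5).float()      # box window around threshold
        return grad_output * surrogate
```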
no code implementations • 18 Nov 2020 • Yukun Yang
Spiking neural networks (SNNs) are usually more energy-efficient than artificial neural networks (ANNs), and their operation bears a strong resemblance to that of the brain.
1 code implementation • NeurIPS 2019 • Ximing Qiao, Yukun Yang, Hai Li
An original trigger used by an attacker to build the backdoored model represents only one point in the trigger space.
1 code implementation • 19 Jun 2019 • Hsin-Pai Cheng, Tunhou Zhang, Yukun Yang, Feng Yan, Shi-Yu Li, Harris Teague, Hai Li, Yiran Chen
Designing neural architectures for edge devices is subject to constraints of accuracy, inference latency, and computational cost.