no code implementations • 11 Jul 2024 • Ding Chen, Chen Liu
We present a novel approach called differentially private stochastic block coordinate descent (DP-SBCD) for training neural networks with provable guarantees of differential privacy under the hidden state assumption.
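The excerpt does not spell out the DP-SBCD update rule, but the general recipe of differentially private block coordinate descent (clip per-example gradients for one coordinate block, add calibrated Gaussian noise, update only that block) can be sketched as follows. This is a minimal illustrative sketch under assumed choices of block partition, clipping threshold, and noise scale, not the paper's exact algorithm or its privacy accounting.

```python
# Minimal sketch of differentially private stochastic block coordinate descent.
# The block partition, clipping rule, and noise scale are illustrative
# assumptions, not the exact DP-SBCD procedure from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: minimize ||X w - y||^2 / (2n)
n, d = 256, 8
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

def per_example_grads(w, idx, block):
    """Per-example gradients restricted to one coordinate block."""
    residual = X[idx] @ w - y[idx]                      # shape (batch,)
    return residual[:, None] * X[idx][:, block]         # shape (batch, |block|)

def dp_sbcd(num_steps=500, batch=32, lr=0.05, clip=1.0, sigma=1.0):
    w = np.zeros(d)
    blocks = np.array_split(np.arange(d), 2)            # two coordinate blocks
    for t in range(num_steps):
        block = blocks[t % len(blocks)]                 # cyclic block selection
        idx = rng.choice(n, size=batch, replace=False)
        g = per_example_grads(w, idx, block)
        # Clip each per-example gradient to bound its sensitivity.
        norms = np.maximum(np.linalg.norm(g, axis=1, keepdims=True) / clip, 1.0)
        g_clipped = g / norms
        # Add Gaussian noise calibrated to the clipping bound.
        noise = sigma * clip * rng.normal(size=len(block))
        w[block] -= lr * (g_clipped.sum(axis=0) + noise) / batch
    return w

w_hat = dp_sbcd()
print("parameter error:", np.linalg.norm(w_hat - w_true))
```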
1 code implementation • 30 Jun 2024 • Yanfang Chen, Ding Chen, Shichao Song, Simin Niu, Hanyu Wang, Zeyun Tang, Feiyu Xiong, Zhiyu Li
To the best of our knowledge, HealthRCN is the largest dataset of Chinese health information rumors to date.
1 code implementation • 20 May 2024 • Qingchen Yu, Zifan Zheng, Shichao Song, Zhiyu Li, Feiyu Xiong, Bo Tang, Ding Chen
The continuous advancement of large language models (LLMs) has brought increasing attention to the critical issue of developing fair and reliable methods for evaluating their performance.
no code implementations • 7 Mar 2024 • Ding Chen, Peixi Peng, Tiejun Huang, Yonghong Tian
As a general method for exploration in deep reinforcement learning (RL), NoisyNet can produce problem-specific exploration strategies.
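For context, NoisyNet drives exploration by replacing deterministic linear layers with layers whose weights are perturbed by learnable Gaussian noise. Below is a minimal sketch of such a noisy linear layer with factorised noise, in the spirit of Fortunato et al.; the initialization constants are illustrative, and this is not the specific variant analysed in the paper.

```python
# Minimal sketch of a NoisyNet-style linear layer with factorised Gaussian
# noise. Noise is resampled on every forward pass; constants are illustrative.
import math
import torch
import torch.nn as nn

class NoisyLinear(nn.Module):
    def __init__(self, in_features, out_features, sigma0=0.5):
        super().__init__()
        self.in_features, self.out_features = in_features, out_features
        # Learnable means and noise scales for weights and biases.
        self.weight_mu = nn.Parameter(torch.empty(out_features, in_features))
        self.weight_sigma = nn.Parameter(torch.empty(out_features, in_features))
        self.bias_mu = nn.Parameter(torch.empty(out_features))
        self.bias_sigma = nn.Parameter(torch.empty(out_features))
        bound = 1.0 / math.sqrt(in_features)
        nn.init.uniform_(self.weight_mu, -bound, bound)
        nn.init.uniform_(self.bias_mu, -bound, bound)
        nn.init.constant_(self.weight_sigma, sigma0 * bound)
        nn.init.constant_(self.bias_sigma, sigma0 * bound)

    @staticmethod
    def _scaled_noise(size):
        eps = torch.randn(size)
        return eps.sign() * eps.abs().sqrt()

    def forward(self, x):
        # Factorised noise: outer product of per-output and per-input noise.
        eps_in = self._scaled_noise(self.in_features)
        eps_out = self._scaled_noise(self.out_features)
        weight = self.weight_mu + self.weight_sigma * torch.outer(eps_out, eps_in)
        bias = self.bias_mu + self.bias_sigma * eps_out
        return nn.functional.linear(x, weight, bias)

q_head = NoisyLinear(64, 4)          # e.g. a Q-value head over 4 actions
print(q_head(torch.randn(2, 64)).shape)
```

Because the noise scales are learned alongside the means, the amount of exploration can adapt per parameter and per task, which is what "problem-specific exploration strategies" refers to.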
no code implementations • 9 Jan 2024 • Ding Chen, Peixi Peng, Tiejun Huang, Yonghong Tian
Recently, the surrogate gradient method has been used to train multi-layer SNNs, allowing them to achieve performance comparable to that of the corresponding deep networks on this task.
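The surrogate gradient method mentioned here replaces the non-differentiable spike threshold with a smooth derivative during backpropagation. A minimal sketch is given below; the scaled-sigmoid surrogate, time constant, and hard reset are assumptions for illustration, not the paper's exact configuration.

```python
# Minimal sketch of surrogate-gradient training for a spiking layer: the
# forward pass uses a hard threshold, the backward pass uses a smooth
# surrogate derivative (scaled sigmoid here, chosen for illustration).
import torch

class SpikeFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v_minus_th, alpha=4.0):
        ctx.save_for_backward(v_minus_th)
        ctx.alpha = alpha
        return (v_minus_th >= 0).float()          # Heaviside spike

    @staticmethod
    def backward(ctx, grad_out):
        (v_minus_th,) = ctx.saved_tensors
        sig = torch.sigmoid(ctx.alpha * v_minus_th)
        return grad_out * ctx.alpha * sig * (1 - sig), None  # surrogate derivative

def lif_forward(x_seq, w, v_th=1.0, tau=2.0):
    """Run a leaky integrate-and-fire layer over a [T, batch, in] input."""
    v = torch.zeros(x_seq.shape[1], w.shape[0])
    spikes = []
    for x in x_seq:                               # serial over time steps
        v = v + (x @ w.t() - v) / tau             # leaky charge
        s = SpikeFn.apply(v - v_th)               # fire
        v = v * (1 - s)                           # hard reset
        spikes.append(s)
    return torch.stack(spikes)

w = torch.randn(8, 16, requires_grad=True)
out = lif_forward(torch.rand(10, 4, 16), w)
out.sum().backward()                              # gradients flow via the surrogate
print(w.grad.shape)
```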
1 code implementation • 7 Jan 2024 • Ding Chen, Shichao Song, Qingchen Yu, Zhiyu Li, Wenjin Wang, Feiyu Xiong, Bo Tang
In this paper, we propose SLEICL, a method in which strong language models learn from examples, then summarize and transfer the learned skills to weak language models for inference and application.
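A high-level sketch of this two-stage flow is shown below: a strong model distills the demonstrations into a short skill summary, which a weak model then conditions on at inference time. The prompt wording and the `strong_generate`/`weak_generate` callables are hypothetical placeholders, not the paper's prompts or models.

```python
# Sketch of a SLEICL-style two-stage pipeline. The prompts and the generate
# callables are illustrative placeholders only.
def sleicl_answer(examples, question, strong_generate, weak_generate):
    demo_text = "\n".join(f"Q: {q}\nA: {a}" for q, a in examples)

    # Stage 1: the strong model summarizes the skill behind the examples.
    skill = strong_generate(
        "Study these solved examples and summarize, in a few sentences, "
        "the general skill needed to solve tasks like them:\n" + demo_text
    )

    # Stage 2: the weak model answers using the summarized skill instead of
    # the raw demonstrations.
    return weak_generate(
        f"Skill guide:\n{skill}\n\nUse the guide to answer:\nQ: {question}\nA:"
    )

# Example wiring with stub models (replace with real LLM calls):
if __name__ == "__main__":
    strong = lambda prompt: "Identify the arithmetic pattern and apply it step by step."
    weak = lambda prompt: "42"
    print(sleicl_answer([("1+1?", "2"), ("2+3?", "5")], "20+22?", strong, weak))
```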
1 code implementation • 25 Oct 2023 • Wei Fang, Yanqi Chen, Jianhao Ding, Zhaofei Yu, Timothée Masquelier, Ding Chen, Liwei Huang, Huihui Zhou, Guoqi Li, Yonghong Tian
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency by introducing neural dynamics and spike properties.
1 code implementation • NeurIPS 2023 • Wei Fang, Zhaofei Yu, Zhaokun Zhou, Ding Chen, Yanqi Chen, Zhengyu Ma, Timothée Masquelier, Yonghong Tian
Vanilla spiking neurons in Spiking Neural Networks (SNNs) use charge-fire-reset neuronal dynamics, which can only be simulated serially and struggle to learn long-term dependencies.
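The charge-fire-reset dynamics referred to here are commonly written as the discrete-time recurrence below (a standard formulation; the exact charge function $f$ and reset convention may differ in the paper):

$$H[t] = f\bigl(V[t-1],\, X[t]\bigr), \qquad S[t] = \Theta\bigl(H[t] - V_{\mathrm{th}}\bigr), \qquad V[t] = H[t]\,\bigl(1 - S[t]\bigr) + V_{\mathrm{reset}}\,S[t],$$

where $\Theta$ is the Heaviside step function. Because $V[t]$ depends on $V[t-1]$, the time steps must be evaluated one after another, which is what limits parallel simulation.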
no code implementations • 30 Jun 2022 • Lei Zhao, Ding Chen, Daoli Zhu, Xiao Li
For the case when $f$ is weakly convex and its subdifferential satisfies the global metric subregularity property, we derive the $\mathcal{O}(\varepsilon^{-4})$ iteration complexity in expectation.
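One standard way to read this rate (the paper's exact stationarity measure may differ): $f$ is $\rho$-weakly convex if $x \mapsto f(x) + \tfrac{\rho}{2}\|x\|^2$ is convex, and near-stationarity is typically measured through the gradient of the Moreau envelope

$$f_\lambda(x) = \min_{y}\Bigl\{ f(y) + \tfrac{1}{2\lambda}\|y - x\|^2 \Bigr\},$$

so an $\mathcal{O}(\varepsilon^{-4})$ complexity means roughly that $T = \mathcal{O}(\varepsilon^{-4})$ iterations suffice to return a point $\bar{x}$ with $\mathbb{E}\,\|\nabla f_\lambda(\bar{x})\| \le \varepsilon$.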
no code implementations • 21 Jan 2022 • Ding Chen, Peixi Peng, Tiejun Huang, Yonghong Tian
With the help of specialized neuromorphic hardware, spiking neural networks (SNNs) are expected to realize artificial intelligence (AI) with lower energy consumption.
no code implementations • 26 Jul 2021 • Carol Alexander, Ding Chen, Arben Imeraj
Over 90% of exchange trading in crypto options has consistently taken place on the Deribit platform.