Search Results for author: Dongcheng Zhao

Found 40 papers, 8 papers with code

PandaGuard: Systematic Evaluation of LLM Safety against Jailbreaking Attacks

1 code implementation • 20 May 2025 • Guobin Shen, Dongcheng Zhao, Linghao Feng, Xiang He, Jihang Wang, Sicheng Shen, Haibo Tong, Yiting Dong, Jindong Li, Xiang Zheng, Yi Zeng

Large language models (LLMs) have achieved remarkable capabilities but remain vulnerable to adversarial prompts known as jailbreaks, which can bypass safety alignment and elicit harmful outputs.

LLM Jailbreak Safety Alignment

STEP: A Unified Spiking Transformer Evaluation Platform for Fair and Reproducible Benchmarking

1 code implementation • 16 May 2025 • Sicheng Shen, Dongcheng Zhao, Linghao Feng, Zeyang Yue, Jindong Li, Tenglong Li, Guobin Shen, Yi Zeng

Spiking Transformers have recently emerged as promising architectures for combining the efficiency of spiking neural networks with the representational power of self-attention.

Benchmarking

Incorporating brain-inspired mechanisms for multimodal learning in artificial intelligence

1 code implementation • 15 May 2025 • Xiang He, Dongcheng Zhao, Yang Li, Qingqun Kong, Xin Yang, Yi Zeng

Inspired by this biological mechanism, we explore the relationship between multimodal output and information from individual modalities, proposing an inverse effectiveness driven multimodal fusion (IEMF) strategy.

Computational Efficiency Continual Learning +1
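
A rough sketch of how an inverse-effectiveness-style fusion rule might look: when the unimodal responses are weak, the cross-modal fusion term receives a larger gain. The confidence proxy (feature norm) and the specific weighting below are illustrative assumptions, not the paper's IEMF formulation.

    import numpy as np

    def inverse_effectiveness_fusion(audio_feat, visual_feat, eps=1e-6):
        """Fuse two modality features, boosting the fusion term when unimodal
        responses are weak (inverse effectiveness). The confidence proxy
        (feature norm) and the weighting are assumptions for illustration."""
        a_conf = np.linalg.norm(audio_feat)      # proxy for audio effectiveness
        v_conf = np.linalg.norm(visual_feat)     # proxy for visual effectiveness
        gain = 1.0 / (a_conf + v_conf + eps)     # weaker unimodal signals -> larger fusion gain
        fused = audio_feat + visual_feat + gain * (audio_feat * visual_feat)
        return fused

    audio = np.random.randn(128) * 0.1           # weak audio response
    visual = np.random.randn(128) * 0.1          # weak visual response
    print(inverse_effectiveness_fusion(audio, visual).shape)  # (128,)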

Biologically Inspired Spiking Diffusion Model with Adaptive Lateral Selection Mechanism

no code implementations • 31 Mar 2025 • Linghao Feng, Dongcheng Zhao, Sicheng Shen, Yi Zeng

We leverage this spiking inner loop alongside a lateral connection mechanism to iteratively refine the substructure selection network, enhancing model adaptability and expressivity.

Enhancing Audio-Visual Spiking Neural Networks through Semantic-Alignment and Cross-Modal Residual Learning

1 code implementation • 18 Feb 2025 • Xiang He, Dongcheng Zhao, Yiting Dong, Guobin Shen, Xin Yang, Yi Zeng

However, existing SNN models primarily focus on unimodal processing and lack efficient cross-modal information fusion, thereby limiting their effectiveness in real-world multimodal scenarios.

$SpikePack$: Enhanced Information Flow in Spiking Neural Networks with High Hardware Compatibility

no code implementations • 24 Jan 2025 • Guobin Shen, Jindong Li, Tenglong Li, Dongcheng Zhao, Yi Zeng

$SpikePack$ achieves constant $\mathcal{O}(1)$ time and space complexity, enabling efficient parallel processing on GPUs while also supporting serial inference on existing SNN hardware accelerators.

Computational Efficiency Image Classification +1
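
One way to picture the enhanced information flow is to pack a neuron's binary spike train over T time steps into a single machine word, so a readout touches one word per neuron rather than T separate values. The bit-packing sketch below is a generic illustration of this idea, not SpikePack's actual encoding.

    import numpy as np

    def pack_spike_train(spikes):
        """Pack a binary spike train (length T <= 64) into one integer word.
        Bit t holds the spike at time step t."""
        word = 0
        for t, s in enumerate(spikes):
            word |= (int(s) & 1) << t
        return word

    def spike_count(word):
        """Spike-count readout via a population count on the packed word."""
        return bin(word).count("1")

    train = np.random.randint(0, 2, size=16)   # T = 16 binary spikes
    packed = pack_spike_train(train)
    assert spike_count(packed) == train.sum()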

Harnessing Task Overload for Scalable Jailbreak Attacks on Large Language Models

no code implementations • 5 Oct 2024 • Yiting Dong, Guobin Shen, Dongcheng Zhao, Xiang He, Yi Zeng

Existing attack methods are fixed or specifically tailored for certain models and cannot flexibly adjust attack strength, which is critical for generalization when attacking models of various sizes.

Prompt Engineering

Jailbreak Antidote: Runtime Safety-Utility Balance via Sparse Representation Adjustment in Large Language Models

no code implementations • 3 Oct 2024 • Guobin Shen, Dongcheng Zhao, Yiting Dong, Xiang He, Yi Zeng

In this paper, we introduce Jailbreak Antidote, a method that enables real-time adjustment of LLM safety preferences by manipulating a sparse subset of the model's internal states during inference.

Prompt Engineering
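
A minimal sketch of what adjusting a sparse subset of internal states could look like: shift only the top-k coordinates of one hidden-state vector along a safety-related direction at inference time. The direction, sparsity level, and scale are hypothetical placeholders, not values from the method.

    import numpy as np

    def sparse_state_adjustment(hidden, safety_dir, k=64, alpha=0.5):
        """Shift the k coordinates of `hidden` where `safety_dir` has the
        largest magnitude, leaving the rest of the state untouched."""
        idx = np.argsort(np.abs(safety_dir))[-k:]   # sparse support of the adjustment
        adjusted = hidden.copy()
        adjusted[idx] += alpha * safety_dir[idx]    # nudge toward the safety direction
        return adjusted

    d_model = 4096
    hidden = np.random.randn(d_model)               # one token's hidden state
    safety_dir = np.random.randn(d_model)           # hypothetical safety direction
    out = sparse_state_adjustment(hidden, safety_dir)
    print(np.count_nonzero(out - hidden))           # ~64 coordinates changed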

StressPrompt: Does Stress Impact Large Language Models and Human Performance Similarly?

no code implementations • 14 Sep 2024 • Guobin Shen, Dongcheng Zhao, Aorigele Bao, Xiang He, Yiting Dong, Yi Zeng

Moreover, this study contributes to the broader AI research community by offering a new perspective on how LLMs handle different scenarios and their similarities to human cognition.

Emotional Intelligence Instruction Following

Brain-Inspired Stepwise Patch Merging for Vision Transformers

no code implementations • 11 Sep 2024 • Yonghao Yu, Dongcheng Zhao, Guobin Shen, Yiting Dong, Yi Zeng

The hierarchical architecture has become a mainstream design paradigm for Vision Transformers (ViTs), with Patch Merging serving as the pivotal component that transforms a columnar architecture into a hierarchical one.

Object Detection +1
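
For context, the baseline Patch Merging operation that such designs build on concatenates each 2x2 neighborhood of tokens and linearly projects it, halving spatial resolution while expanding channels. The sketch below shows that standard operation, not the proposed stepwise variant.

    import numpy as np

    def patch_merging(x, proj):
        """Standard 2x2 patch merging: (H, W, C) -> (H/2, W/2, C_out).
        `proj` is a (4*C, C_out) projection matrix."""
        merged = np.concatenate(
            [x[0::2, 0::2], x[1::2, 0::2], x[0::2, 1::2], x[1::2, 1::2]], axis=-1
        )                                    # (H/2, W/2, 4*C)
        return merged @ proj                 # linear projection to C_out channels

    x = np.random.randn(8, 8, 32)
    proj = np.random.randn(4 * 32, 64)
    print(patch_merging(x, proj).shape)      # (4, 4, 64)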

CACE-Net: Co-guidance Attention and Contrastive Enhancement for Effective Audio-Visual Event Localization

1 code implementation • 4 Aug 2024 • Xiang He, Xiangxi Liu, Yang Li, Dongcheng Zhao, Guobin Shen, Qingqun Kong, Xin Yang, Yi Zeng

Specifically, we have enhanced the model's ability to discern subtle differences between events and background and improved the accuracy of event classification.

Audio-Visual Event Localization

Directly Training Temporal Spiking Neural Network with Sparse Surrogate Gradient

no code implementations • 28 Jun 2024 • Yang Li, Feifei Zhao, Dongcheng Zhao, Yi Zeng

Brain-inspired Spiking Neural Networks (SNNs) have attracted much attention due to their event-based computing and energy-efficient features.

Time Cell Inspired Temporal Codebook in Spiking Neural Networks for Enhanced Image Generation

no code implementations • 23 May 2024 • Linghao Feng, Dongcheng Zhao, Sicheng Shen, Yiting Dong, Guobin Shen, Yi Zeng

This paper presents a novel approach leveraging Spiking Neural Networks (SNNs) to construct a Variational Quantized Autoencoder (VQ-VAE) with a temporal codebook inspired by hippocampal time cells.

Image Generation
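
A bare-bones sketch of vector quantization with a time-indexed codebook, where each time step snaps its latent to the nearest code in its own sub-codebook. The codebook layout here is an assumption for illustration, not the paper's hippocampus-inspired design.

    import numpy as np

    def temporal_vq(z, codebooks):
        """Quantize latents z of shape (T, D) with one codebook per time step.
        codebooks: (T, K, D). Returns quantized latents and code indices."""
        T, D = z.shape
        quantized, indices = np.empty_like(z), np.empty(T, dtype=int)
        for t in range(T):
            dists = np.linalg.norm(codebooks[t] - z[t], axis=1)  # distance to K codes
            indices[t] = dists.argmin()
            quantized[t] = codebooks[t, indices[t]]
        return quantized, indices

    z = np.random.randn(4, 16)                 # T=4 time steps, D=16 latent dims
    codebooks = np.random.randn(4, 128, 16)    # K=128 codes per time step
    zq, idx = temporal_vq(z, codebooks)
    print(zq.shape, idx)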

Neuro-Vision to Language: Enhancing Brain Recording-based Visual Reconstruction and Language Interaction

no code implementations • 30 Apr 2024 • Guobin Shen, Dongcheng Zhao, Xiang He, Linghao Feng, Yiting Dong, Jihang Wang, Qian Zhang, Yi Zeng

Decoding non-invasive brain recordings is pivotal for advancing our understanding of human cognition but faces challenges due to individual differences and complex neural signal representations.

Brain Decoding Image Reconstruction +1

TIM: An Efficient Temporal Interaction Module for Spiking Transformer

2 code implementations • 22 Jan 2024 • Sicheng Shen, Dongcheng Zhao, Guobin Shen, Yi Zeng

Spiking Neural Networks (SNNs), as the third generation of neural networks, have gained prominence for their biological plausibility and computational efficiency, especially in processing diverse datasets.

Computational Efficiency Image Classification

Are Conventional SNNs Really Efficient? A Perspective from Network Quantization

no code implementations • CVPR 2024 • Guobin Shen, Dongcheng Zhao, Tenglong Li, Jindong Li, Yi Zeng

This paper introduces a unified perspective illustrating that the time steps in SNNs and quantized bit-widths of activation values present analogous representations.

Fairness Quantization
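
The analogy can be made concrete with a toy check: a rate-coded spike count over T time steps can take T + 1 distinct values, matching a uniform quantizer with roughly log2(T + 1) bits. The snippet below only illustrates this correspondence and is not the paper's analysis.

    import numpy as np

    def rate_code(x, T):
        """Encode activation x in [0, 1] as a spike count over T steps."""
        return int(round(x * T))                     # 0 .. T  ->  T + 1 levels

    def uniform_quantize(x, levels):
        """Uniformly quantize x in [0, 1] to the given number of levels."""
        return round(x * (levels - 1)) / (levels - 1)

    T = 7                                            # 7 time steps -> 8 levels ~ 3 bits
    for x in np.linspace(0, 1, 5):
        spikes = rate_code(x, T)
        q = uniform_quantize(x, T + 1)
        print(f"x={x:.2f}  spike count={spikes}  3-bit quantized={q:.3f}")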

Is Conventional SNN Really Efficient? A Perspective from Network Quantization

no code implementations • 17 Nov 2023 • Guobin Shen, Dongcheng Zhao, Tenglong Li, Jindong Li, Yi Zeng

This paper introduces a unified perspective, illustrating that the time steps in SNNs and quantized bit-widths of activation values present analogous representations.

Fairness Quantization

FireFly v2: Advancing Hardware Support for High-Performance Spiking Neural Network with a Spatiotemporal FPGA Accelerator

no code implementations • 28 Sep 2023 • Jindong Li, Guobin Shen, Dongcheng Zhao, Qian Zhang, Yi Zeng

As a further step in supporting high-performance SNNs on specialized hardware, we introduce FireFly v2, an FPGA SNN accelerator that addresses the issue of non-spike operations in current SOTA SNN algorithms, an obstacle to end-to-end deployment on existing SNN hardware.

Learning the Plasticity: Plasticity-Driven Learning Framework in Spiking Neural Networks

no code implementations • 23 Aug 2023 • Guobin Shen, Dongcheng Zhao, Yiting Dong, Yang Li, Feifei Zhao, Yi Zeng

This shift in focus from weight adjustment to mastering the intricacies of synaptic change offers a more flexible and dynamic pathway for neural networks to evolve and adapt.

Improving Stability and Performance of Spiking Neural Networks through Enhancing Temporal Consistency

no code implementations • 23 May 2023 • Dongcheng Zhao, Guobin Shen, Yiting Dong, Yang Li, Yi Zeng

Notably, our algorithm has achieved state-of-the-art performance on neuromorphic datasets DVS-CIFAR10 and N-Caltech101, and can achieve superior performance in the test phase with timestep T=1.

Dive into the Power of Neuronal Heterogeneity

no code implementations • 19 May 2023 • Guobin Shen, Dongcheng Zhao, Yiting Dong, Yang Li, Yi Zeng

The biological neural network is a vast and diverse structure with high neural heterogeneity.

Continuous Control

Spiking Generative Adversarial Network with Attention Scoring Decoding

no code implementations • 17 May 2023 • Linghao Feng, Dongcheng Zhao, Yi Zeng

As it stands, such models are primarily limited to the domain of artificial neural networks.

Generative Adversarial Network

Temporal Knowledge Sharing enable Spiking Neural Network Learning from Past and Future

no code implementations • 13 Apr 2023 • Yiting Dong, Dongcheng Zhao, Yi Zeng

However, SNNs typically grapple with challenges such as extended time steps, low temporal information utilization, and the requirement for consistent time steps between training and testing.

MSAT: Biologically Inspired Multi-Stage Adaptive Threshold for Conversion of Spiking Neural Networks

no code implementations • 23 Mar 2023 • Xiang He, Yang Li, Dongcheng Zhao, Qingqun Kong, Yi Zeng

The self-adaptation to membrane potential and input allows a timely adjustment of the threshold to fire spikes faster and transmit more information.

Sentiment Analysis Sentiment Classification +2
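
A simplified LIF neuron whose threshold is lowered as the membrane potential and input grow gives the flavor of a state- and input-aware threshold that fires earlier; the coefficients below are illustrative and do not reproduce the multi-stage MSAT schedule.

    import numpy as np

    def adaptive_threshold_lif(inputs, tau=0.9, base_thresh=1.0, beta=0.2, floor=0.3):
        """LIF neuron whose firing threshold drops when the membrane potential
        and input are large, so spikes are emitted earlier (illustrative only)."""
        v, spikes = 0.0, []
        for x in inputs:
            v = tau * v + x                                     # leaky integration
            thresh = max(floor, base_thresh - beta * (v + x))   # state- and input-aware threshold
            if v >= thresh:
                spikes.append(1)
                v = 0.0                                         # reset after firing
            else:
                spikes.append(0)
        return spikes

    print(adaptive_threshold_lif(np.random.rand(20)))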

An Efficient Knowledge Transfer Strategy for Spiking Neural Networks from Static to Event Domain

1 code implementation • 23 Mar 2023 • Xiang He, Dongcheng Zhao, Yang Li, Guobin Shen, Qingqun Kong, Yi Zeng

In order to improve the generalization ability of SNNs on event-based datasets, we use static images to assist SNN training on event data.

Transfer Learning

Exploiting High Performance Spiking Neural Networks with Efficient Spiking Patterns

no code implementations • 29 Jan 2023 • Guobin Shen, Dongcheng Zhao, Yi Zeng

Inspired by spike patterns in biological neurons, this paper introduces the dynamic Burst pattern and designs the Leaky Integrate and Fire or Burst (LIFB) neuron that can make a trade-off between short-time performance and dynamic temporal performance from the perspective of network information capacity.

Vocal Bursts Intensity Prediction
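
A hedged sketch of a fire-or-burst style neuron: a single spike when the membrane potential crosses the threshold moderately, and a short burst when it crosses by a larger margin. The burst rule and constants are assumptions, not the paper's LIFB dynamics.

    import numpy as np

    def lif_or_burst(inputs, tau=0.8, thresh=1.0, burst_margin=0.5, burst_size=3):
        """Leaky integrate-and-fire neuron that emits a burst of spikes when the
        membrane potential exceeds the threshold by a large margin (illustrative)."""
        v, out = 0.0, []
        for x in inputs:
            v = tau * v + x
            if v >= thresh + burst_margin:
                out.append(burst_size)     # strong drive -> short burst of spikes
                v = 0.0
            elif v >= thresh:
                out.append(1)              # moderate drive -> single spike
                v = 0.0
            else:
                out.append(0)
        return out

    print(lif_or_burst(np.random.rand(20) * 2.0))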

FireFly: A High-Throughput Hardware Accelerator for Spiking Neural Networks with Efficient DSP and Memory Optimization

no code implementations • 5 Jan 2023 • Jindong Li, Guobin Shen, Dongcheng Zhao, Qian Zhang, Yi Zeng

To improve memory efficiency, we design a memory system to enable efficient synaptic weights and membrane voltage memory access with reasonable on-chip RAM consumption.

BrainCog: A Spiking Neural Network based Brain-inspired Cognitive Intelligence Engine for Brain-inspired AI and Brain Simulation

no code implementations • 18 Jul 2022 • Yi Zeng, Dongcheng Zhao, Feifei Zhao, Guobin Shen, Yiting Dong, Enmeng Lu, Qian Zhang, Yinqian Sun, Qian Liang, Yuxuan Zhao, Zhuoya Zhao, Hongjian Fang, Yuwei Wang, Yang Li, Xin Liu, Chengcheng Du, Qingqun Kong, Zizhe Ruan, Weida Bi

These brain-inspired AI models have been effectively validated on various supervised, unsupervised, and reinforcement learning tasks, and they can be used to endow AI models with multiple brain-inspired cognitive functions.

Decision Making

An Unsupervised STDP-based Spiking Neural Network Inspired By Biologically Plausible Learning Rules and Connections

no code implementations • 6 Jul 2022 • Yiting Dong, Dongcheng Zhao, Yang Li, Yi Zeng

By integrating the above three adaptive mechanisms and STB-STDP, our model greatly accelerates the training of unsupervised spiking neural networks and improves the performance of unsupervised SNNs on complex tasks.
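
For reference, a textbook pair-based STDP update (potentiation when the presynaptic spike precedes the postsynaptic one, depression otherwise) looks like the sketch below; the paper's STB-STDP and adaptive mechanisms build on top of this and are not reproduced here.

    import numpy as np

    def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
        """Pair-based STDP: weight change as a function of the spike-time
        difference (in ms), clipped to keep the weight in [0, 1]."""
        dt = t_post - t_pre
        if dt >= 0:
            dw = a_plus * np.exp(-dt / tau)    # pre before post -> potentiation
        else:
            dw = -a_minus * np.exp(dt / tau)   # post before pre -> depression
        return np.clip(w + dw, 0.0, 1.0)

    w = 0.5
    print(stdp_update(w, t_pre=10.0, t_post=15.0))  # potentiated
    print(stdp_update(w, t_pre=15.0, t_post=10.0))  # depressed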

DPSNN: A Differentially Private Spiking Neural Network with Temporal Enhanced Pooling

no code implementations • 24 May 2022 • Jihang Wang, Dongcheng Zhao, Guobin Shen, Qian Zhang, Yi Zeng

Privacy protection is a crucial issue in machine learning algorithms, and the current privacy protection is combined with traditional artificial neural networks based on real values.

Face Recognition Image Classification +5

EventMix: An Efficient Augmentation Strategy for Event-Based Data

no code implementations • 24 May 2022 • Guobin Shen, Dongcheng Zhao, Yi Zeng

Data augmentation can improve the quantity and quality of training data by generating more diverse representations from the original data.

Data Augmentation
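
A mixup-style baseline for event data can be pictured as sampling events from two recordings in proportion to a mixing ratio and blending the labels accordingly. This sketch is a generic event-mixing baseline, not the EventMix algorithm.

    import numpy as np

    def mix_event_streams(events_a, events_b, label_a, label_b, lam=0.6, rng=None):
        """Randomly keep a fraction `lam` of events from stream A and (1 - lam)
        from stream B; events are arrays of (t, x, y, polarity) rows."""
        rng = rng or np.random.default_rng()
        keep_a = events_a[rng.random(len(events_a)) < lam]
        keep_b = events_b[rng.random(len(events_b)) < (1.0 - lam)]
        mixed = np.concatenate([keep_a, keep_b])
        mixed = mixed[np.argsort(mixed[:, 0])]               # re-sort by timestamp
        mixed_label = lam * label_a + (1.0 - lam) * label_b  # soft label, mixup-style
        return mixed, mixed_label

    a = np.random.rand(1000, 4); b = np.random.rand(800, 4)
    ya = np.eye(10)[3]; yb = np.eye(10)[7]
    events, label = mix_event_streams(a, b, ya, yb)
    print(events.shape, label.round(2))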

N-Omniglot, a large-scale neuromorphic dataset for spatio-temporal sparse few-shot learning

1 code implementation • 25 Dec 2021 • Yang Li, Yiting Dong, Dongcheng Zhao, Yi Zeng

Few-shot learning (learning with a few samples) is one of the most important cognitive abilities of the human brain.

Few-Shot Learning

Spiking CapsNet: A Spiking Neural Network With A Biologically Plausible Routing Rule Between Capsules

no code implementations • 15 Nov 2021 • Dongcheng Zhao, Yang Li, Yi Zeng, Jihang Wang, Qian Zhang

Our Spiking CapsNet fully combines the strengths of SNNs and CapsNet, and shows strong robustness to noise and affine transformations.

Backpropagation with Biologically Plausible Spatio-Temporal Adjustment For Training Deep Spiking Neural Networks

no code implementations • 17 Oct 2021 • Guobin Shen, Dongcheng Zhao, Yi Zeng

Secondly, we propose a biologically plausible temporal adjustment making the error propagate across the spikes in the temporal dimension, which overcomes the problem of the temporal dependency within a single spike period of the traditional spiking neurons.
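
The temporal side of such training can be illustrated with unrolled LIF dynamics and a surrogate gradient: the error flows across time steps through the membrane potential, and through the non-differentiable spike via a smooth surrogate. The rectangular surrogate below is a common generic choice, not the paper's specific adjustment.

    import numpy as np

    def lif_forward(inputs, tau=0.9, thresh=1.0):
        """Unroll LIF dynamics over T steps, keeping membrane potentials and spikes."""
        v, vs, spikes = 0.0, [], []
        for x in inputs:
            v = tau * v + x
            s = float(v >= thresh)
            vs.append(v); spikes.append(s)
            v = v * (1.0 - s)                      # reset on spike
        return np.array(vs), np.array(spikes)

    def surrogate_grad(v, thresh=1.0, width=1.0):
        """Rectangular surrogate for d(spike)/d(membrane potential)."""
        return (np.abs(v - thresh) < width / 2).astype(float) / width

    inputs = np.random.rand(10)
    vs, spikes = lif_forward(inputs)
    # Error at each step flows back through the spike (via the surrogate) and
    # through the recurrence v_{t+1} = tau * v_t across the temporal dimension.
    print(spikes, surrogate_grad(vs))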

BSNN: Towards Faster and Better Conversion of Artificial Neural Networks to Spiking Neural Networks with Bistable Neurons

no code implementations • 27 May 2021 • Yang Li, Yi Zeng, Dongcheng Zhao

Also, when ResNet structure-based ANNs are converted, the information of output neurons is incomplete due to the rapid transmission through the shortcut path.

BackEISNN: A Deep Spiking Neural Network with Adaptive Self-Feedback and Balanced Excitatory-Inhibitory Neurons

no code implementations • 27 May 2021 • Dongcheng Zhao, Yi Zeng, Yang Li

With the combination of the two mechanisms, we propose a deep spiking neural network with adaptive self-feedback and balanced excitatory and inhibitory neurons (BackEISNN).
