Search Results for author: Shukai Duan

Found 7 papers, 3 papers with code

Temporal Information Reconstruction and Non-Aligned Residual in Spiking Neural Networks for Speech Classification

no code implementations • 31 Dec 2024 • Qi Zhang, Huamin Wang, Hangchi Shen, Shukai Duan, Shiping Wen, TingWen Huang

Most existing models based on spiking neural networks (SNNs) use only a single temporal resolution for speech classification, which prevents them from learning information in the input data at different temporal scales.
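The snippet above only names the problem. Purely as a loose illustration of what "different temporal scales" means for a spike train (this is not the paper's Temporal Information Reconstruction or Non-Aligned Residual modules), the sketch below pools a dummy spike sequence with several assumed window sizes:

```python
# Toy illustration of viewing a spike train at multiple temporal resolutions.
# Generic idea only; shapes and window sizes are assumptions, not the paper's design.
import torch
import torch.nn.functional as F

spikes = (torch.rand(1, 1, 64) > 0.8).float()   # (batch, channels, time) dummy spike train

# Average-pool along time with different window sizes to obtain coarser time scales.
scales = {w: F.avg_pool1d(spikes, kernel_size=w, stride=w) for w in (1, 2, 4, 8)}
for w, s in scales.items():
    print(f"window={w}: {s.shape[-1]} time steps")
```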

NeuroMoCo: A Neuromorphic Momentum Contrast Learning Method for Spiking Neural Networks

no code implementations • 10 Jun 2024 • Yuqi Ma, Huamin Wang, Hangchi Shen, Xuemei Chen, Shukai Duan, Shiping Wen

Recently, brain-inspired spiking neural networks (SNNs) have attracted considerable research attention owing to their inherent bio-interpretability, event-triggered dynamics, and powerful perception of spatiotemporal information, which make them well suited to event-based neuromorphic datasets. (A generic momentum-contrast sketch is given below.)

Contrastive Learning • Self-Supervised Learning
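The abstract snippet does not describe the training procedure. As a hedged sketch of what MoCo-style momentum contrast generally looks like (not the paper's NeuroMoCo recipe, which targets SNNs), the code below shows an exponential-moving-average key-encoder update and an InfoNCE-style loss; the encoder objects, momentum value, and temperature are placeholders:

```python
# Rough MoCo-style sketch (not the paper's exact NeuroMoCo method).
# Assumes two encoders with identical architectures; all names/values are placeholders.
import torch
import torch.nn.functional as F

momentum = 0.999          # EMA coefficient (assumed value)
temperature = 0.07        # InfoNCE temperature (assumed value)

@torch.no_grad()
def momentum_update(query_encoder, key_encoder, m=momentum):
    # The key encoder trails the query encoder as an exponential moving average.
    for q_param, k_param in zip(query_encoder.parameters(), key_encoder.parameters()):
        k_param.data.mul_(m).add_(q_param.data, alpha=1.0 - m)

def info_nce_loss(q, k, queue, t=temperature):
    # q, k: (batch, dim) embeddings of two augmented views; queue: (dim, K) negative keys.
    q, k = F.normalize(q, dim=1), F.normalize(k, dim=1)
    l_pos = (q * k).sum(dim=1, keepdim=True)            # positive logits (batch, 1)
    l_neg = q @ queue                                    # negative logits (batch, K)
    logits = torch.cat([l_pos, l_neg], dim=1) / t
    labels = torch.zeros(q.size(0), dtype=torch.long)   # the positive is at index 0
    return F.cross_entropy(logits, labels)
```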

A Structure-Aware Framework for Learning Device Placements on Computation Graphs

1 code implementation • 23 May 2024 • Shukai Duan, Heng Ping, Nikos Kanakaris, Xiongye Xiao, Panagiotis Kyriakis, Nesreen K. Ahmed, Peiyu Zhang, Guixiang Ma, Mihai Capota, Shahin Nazarian, Theodore L. Willke, Paul Bogdan

Computation graphs are directed acyclic graphs (DAGs) whose nodes correspond to mathematical operations; they are widely used as abstractions in neural network optimization. (A toy DAG-and-placement example is given below.)

Graph Partitioning • Graph Representation Learning • +1
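Since the abstract defines computation graphs as DAGs of operations, a minimal toy example may help. The sketch below builds such a DAG with networkx and assigns operations to devices with a naive round-robin policy; the node names, device list, and placement policy are made up for illustration and are not the paper's structure-aware framework:

```python
# Toy computation graph as a DAG, plus a naive device placement.
# Illustration only; not the paper's learned placement method.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("input", "matmul_1"),
    ("matmul_1", "relu_1"),
    ("relu_1", "matmul_2"),
    ("matmul_2", "softmax"),
])

devices = ["gpu:0", "gpu:1"]
# Assign operations to devices in topological order (round-robin placeholder policy).
placement = {
    op: devices[i % len(devices)]
    for i, op in enumerate(nx.topological_sort(g))
}
print(placement)
```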

Leveraging Reinforcement Learning and Large Language Models for Code Optimization

no code implementations • 9 Dec 2023 • Shukai Duan, Nikos Kanakaris, Xiongye Xiao, Heng Ping, Chenyu Zhou, Nesreen K. Ahmed, Guixiang Ma, Mihai Capota, Theodore L. Willke, Shahin Nazarian, Paul Bogdan

We compare our framework with existing state-of-the-art models and show that it is more efficient in terms of speed and computational usage, owing to the reduced number of training steps and its applicability to models with fewer parameters.

Language Modelling • Reinforcement Learning • +2

Network Pruning via Feature Shift Minimization

1 code implementation • 6 Jul 2022 • Yuanzhi Duan, Yue Zhou, Peng He, Qiang Liu, Shukai Duan, Xiaofang Hu

In this paper, we propose a novel Feature Shift Minimization (FSM) method to compress CNN models, which evaluates the feature shift by jointly using the information of both features and filters. (A generic filter-scoring sketch is given below.)

Network Pruning
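As a rough, generic illustration of filter pruning driven by both filter weights and feature statistics (not the paper's exact Feature Shift Minimization criterion), the sketch below scores the filters of a single convolution layer and keeps the top-scoring ones; the layer sizes, scoring formula, and pruning ratio are assumptions:

```python
# Generic structured-pruning sketch: rank conv filters by a score that mixes
# filter-weight and feature-map statistics, then drop the lowest-scoring ones.
# Illustration of filter pruning in general, not the paper's FSM criterion.
import torch
import torch.nn as nn

conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
x = torch.randn(8, 3, 32, 32)                 # batch of dummy inputs
features = conv(x)                            # (8, 16, 32, 32)

weight_score = conv.weight.detach().flatten(1).norm(dim=1)    # per-filter weight norm
feature_score = features.detach().abs().mean(dim=(0, 2, 3))   # per-filter activation mean
score = weight_score * feature_score                          # combined importance score

prune_ratio = 0.25                             # assumed pruning ratio
n_keep = int(conv.out_channels * (1 - prune_ratio))
keep_idx = score.argsort(descending=True)[:n_keep]
print("keeping filters:", keep_idx.tolist())
```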

Network Compression via Central Filter

1 code implementation • 10 Dec 2021 • Yuanzhi Duan, Xiaofang Hu, Yue Zhou, Qiang Liu, Shukai Duan

In this paper, by exploring the similarities between feature maps, we propose a novel filter pruning method, Central Filter (CF), based on the observation that a filter is approximately equal to a set of other filters after appropriate adjustments. (A generic feature-map-similarity sketch is given below.)

Network Pruning
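To make the "a filter is approximately equal to a set of other filters" idea concrete in a generic way (this is not the paper's exact Central Filter algorithm), the sketch below computes pairwise cosine similarity between the feature maps produced by each filter and derives a simple average-similarity score; the layer sizes and scoring rule are assumptions:

```python
# Rough sketch of ranking filters by how similar their feature maps are to the
# others (a centrality-style score). Illustrates similarity-based filter pruning
# in general, not the paper's exact Central Filter method.
import torch
import torch.nn as nn
import torch.nn.functional as F

conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
x = torch.randn(4, 3, 16, 16)
fmaps = conv(x).detach()                       # (4, 8, 16, 16)

# Flatten each filter's feature maps across the batch and normalize.
flat = F.normalize(fmaps.permute(1, 0, 2, 3).reshape(8, -1), dim=1)   # (8, N)
similarity = flat @ flat.t()                   # pairwise cosine similarity (8, 8)

# A filter whose feature maps are well approximated by the others gets a high
# average similarity; such filters are candidates for removal or adjustment.
centrality = (similarity.sum(dim=1) - 1.0) / (similarity.size(0) - 1)
print(centrality)
```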

TDACNN: Target-domain-free Domain Adaptation Convolutional Neural Network for Drift Compensation in Gas Sensors

no code implementations • 14 Oct 2021 • Yuelin Zhang, Sihao Xiang, Zehuan Wang, Xiaoyan Peng, Yutong Tian, Shukai Duan, Jia Yan

Sensor drift is a long-standing, unpredictable problem that degrades the performance of gaseous substance recognition, calling for an anti-drift domain adaptation algorithm.

Domain Adaptation
