no code implementations • 13 Feb 2025 • Chenxiang Ma, Xinyi Chen, Yanchen Li, Qu Yang, Yujie Wu, Guoqi Li, Gang Pan, Huajin Tang, Kay Chen Tan, Jibin Wu
Temporal processing is fundamental for both biological and artificial intelligence systems, as it enables the comprehension of dynamic environments and facilitates timely responses.
no code implementations • 17 Jan 2025 • Chen Zhang, Xinyi Dai, Yaxiong Wu, Qu Yang, Yasheng Wang, Ruiming Tang, Yong Liu
Multi-turn interaction in dialogue system research refers to a system's ability to maintain context across multiple dialogue turns, enabling it to generate coherent and contextually relevant responses.
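As a rough illustration of this mechanism, the sketch below keeps a growing turn history and conditions each reply on it. `generate_reply` and the message format are hypothetical stand-ins for illustration, not the system described in the paper.

```python
from typing import Dict, List

def generate_reply(history: List[Dict[str, str]]) -> str:
    # Hypothetical stand-in: a real system would feed `history`
    # to a context-aware language model here.
    last_user_turn = history[-1]["content"]
    return f"(reply conditioned on {len(history)} turns so far: {last_user_turn!r})"

def chat_loop(user_turns: List[str]) -> List[Dict[str, str]]:
    history: List[Dict[str, str]] = []
    for turn in user_turns:
        history.append({"role": "user", "content": turn})
        reply = generate_reply(history)  # context = every turn so far
        history.append({"role": "assistant", "content": reply})
    return history

for msg in chat_loop(["Book a table for two.", "Make it 7 pm instead."]):
    print(f'{msg["role"]}: {msg["content"]}')
```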
1 code implementation • 7 Oct 2024 • Xiang Hao, Chenxiang Ma, Qu Yang, Jibin Wu, Kay Chen Tan
In recent years, deep learning-based methods have significantly improved speech enhancement performance, but often at a high computational cost that is prohibitive for many edge devices, such as headsets and hearing aids.
1 code implementation • 22 Jul 2024 • Zeyu Wang, Jingyu Lin, Yifei Qian, Yi Huang, Shicen Tian, Bosong Chai, Juncan Deng, Qu Yang, Lan Du, Cunjian Chen, Kejie Huang
However, most diffusion models are limited to visible RGB image generation.
1 code implementation • 24 Jun 2024 • Qu Yang, Mang Ye, Bo Du
Experimental results demonstrate that EmoLLM significantly elevates multimodal emotional understanding performance, with an average improvement of 12.1% across multiple foundation models on EmoBench.
1 code implementation • 14 Jun 2024 • Zeyang Song, Qianhui Liu, Qu Yang, Yizhou Peng, Haizhou Li
Keyword Spotting (KWS) is essential in edge computing applications that require rapid and energy-efficient responses.
no code implementations • 9 Mar 2024 • Qu Yang, Qianhui Liu, Nan Li, Meng Ge, Zeyang Song, Haizhou Li
Spiking Neural Networks (SNNs) are known to be biologically plausible and power-efficient.
no code implementations • 23 Oct 2023 • Qu Yang, Malu Zhang, Jibin Wu, Kay Chen Tan, Haizhou Li
With TTFS coding, we can achieve up to orders-of-magnitude savings in computation over ANNs and other rate-based SNNs.
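For intuition, here is a minimal NumPy sketch of time-to-first-spike (TTFS) encoding; the function name and the [0, 1] normalization are illustrative assumptions, not the paper's exact scheme. Stronger inputs fire earlier and each neuron emits at most one spike, which is the source of the computational savings over rate coding, where spike counts grow with the firing rate and time window.

```python
import numpy as np

def ttfs_encode(x: np.ndarray, t_max: int = 100) -> np.ndarray:
    """Map normalized intensities in [0, 1] to first-spike times:
    stronger inputs fire earlier; each neuron spikes at most once;
    t_max acts as a 'no spike' sentinel for zero inputs."""
    x = np.clip(x, 0.0, 1.0)
    times = np.where(x > 0, np.floor((1.0 - x) * (t_max - 1)), t_max)
    return times.astype(int)

pixels = np.array([0.9, 0.5, 0.1, 0.0])  # normalized intensities
print(ttfs_encode(pixels))               # -> [  9  49  89 100]
```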
1 code implementation • 25 Aug 2023 • Shimin Zhang, Qu Yang, Chenxiang Ma, Jibin Wu, Haizhou Li, Kay Chen Tan
The identification of sensory cues associated with potential opportunities and dangers is frequently complicated by unrelated events that separate useful cues by long delays.
1 code implementation • 26 May 2023 • Xinyi Chen, Qu Yang, Jibin Wu, Haizhou Li, Kay Chen Tan
As an initial exploration in this direction, we propose a hybrid neural coding and learning framework, which encompasses a neural coding zoo with diverse neural coding schemes discovered in neuroscience.
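To make the coding-zoo idea concrete, here is a minimal sketch of a registry of interchangeable spike encoders. The scheme names and encoder functions are standard textbook examples chosen for illustration, not the paper's implementations; a TTFS encoder like the one sketched earlier could be registered the same way.

```python
import numpy as np

# Illustrative "coding zoo": a registry of interchangeable encoders
# that turn an analog value in [0, 1] into a spike train of length T.

def rate_code(x: float, T: int) -> np.ndarray:
    # Rate coding: Bernoulli spikes with per-step probability x.
    return (np.random.random(T) < x).astype(np.int8)

def burst_code(x: float, T: int) -> np.ndarray:
    # Burst coding: up to 5 consecutive spikes, count scaling with x.
    train = np.zeros(T, dtype=np.int8)
    train[: int(np.ceil(x * 5))] = 1
    return train

CODING_ZOO = {"rate": rate_code, "burst": burst_code}

x, T = 0.8, 20
for name, encode in CODING_ZOO.items():
    print(f"{name:>5}: {encode(x, T)}")
```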
1 code implementation • 10 Oct 2022 • Qu Yang, Jibin Wu, Malu Zhang, Yansong Chua, Xinchao Wang, Haizhou Li
The LTL rule follows the teacher-student learning approach by mimicking the intermediate feature representations of a pre-trained ANN.
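A minimal PyTorch-style sketch of the layer-wise feature-mimicking idea follows. For brevity the student here is a plain ANN and the objective a simple MSE, whereas LTL trains an SNN student with local layer-wise losses; treat this as the general teacher-student pattern rather than the paper's rule.

```python
import torch
import torch.nn as nn

# Frozen, pre-trained teacher (toy layers for illustration).
teacher = nn.Sequential(nn.Linear(16, 32), nn.ReLU(),
                        nn.Linear(32, 32), nn.ReLU()).eval()
for p in teacher.parameters():
    p.requires_grad_(False)

# Student with matching layer shapes; in LTL this would be an SNN.
student = nn.Sequential(nn.Linear(16, 32), nn.ReLU(),
                        nn.Linear(32, 32), nn.ReLU())
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

def features(net, x):
    # Collect intermediate representations after each nonlinearity.
    feats, h = [], x
    for layer in net:
        h = layer(h)
        if isinstance(layer, nn.ReLU):
            feats.append(h)
    return feats

x = torch.randn(8, 16)  # a toy batch
# Match each student feature map to the teacher's, layer by layer.
loss = sum(nn.functional.mse_loss(s, t)
           for s, t in zip(features(student, x), features(teacher, x)))
opt.zero_grad()
loss.backward()
opt.step()
print(f"layer-wise mimicking loss: {loss.item():.4f}")
```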
no code implementations • 15 Feb 2019 • Jibin Wu, Yansong Chua, Malu Zhang, Qu Yang, Guoqi Li, Haizhou Li
Deep spiking neural networks (SNNs) support asynchronous event-driven computation and massive parallelism, and demonstrate great potential to improve the energy efficiency of their synchronous analog counterparts.