1 code implementation • 6 Oct 2024 • Yige Xu, Xu Guo, Zhiwei Zeng, Chunyan Miao
Large language models (LLMs) have brought a great breakthrough to the natural language processing (NLP) community, but handling concurrent customer queries remains challenging due to their high throughput demands.
no code implementations • 11 Aug 2024 • Xiaoxiong Zhang, Zhiwei Zeng, Xin Zhou, Zhiqi Shen
During client-side local training, FedKD enables the low-dimensional student model to mimic the score distribution of triples from the high-dimensional teacher model using a KL-divergence loss.
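The distillation loss described above can be sketched in plain Python. This is a minimal, hypothetical illustration of KL-divergence distillation over candidate-triple scores; FedKD's exact formulation (temperature, direction of the KL, score normalization) may differ.

```python
import math

def softmax(scores, temperature=1.0):
    """Softened probability distribution over candidate-triple scores."""
    exps = [math.exp(s / temperature) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_scores, teacher_scores, temperature=1.0):
    """KL(teacher || student) on softened score distributions, as in
    standard knowledge distillation (illustrative sketch only)."""
    p = softmax(teacher_scores, temperature)  # teacher distribution
    q = softmax(student_scores, temperature)  # student distribution
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Toy scores for 4 candidate triples from a teacher and a student model.
teacher = [2.0, 0.5, -1.0, 0.0]
student = [1.0, 1.0, 0.0, 0.0]
loss = kd_loss(student, teacher, temperature=2.0)
```

Minimizing this loss pushes the student's ranking of candidate triples toward the teacher's, which is what lets a low-dimensional student inherit knowledge from a high-dimensional teacher.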
1 code implementation • 4 Jul 2024 • Yongjie Wang, Xiaoqi Qiu, Yu Yue, Xu Guo, Zhiwei Zeng, Yuhong Feng, Zhiqi Shen
Natural language counterfactual generation aims to minimally modify a given text such that the modified text will be classified into a different class.
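The task definition above can be illustrated with a toy greedy editor: try single-word substitutions and keep the first minimal edit that flips a classifier's label. All names here (the lexicon classifier, the substitution table) are hypothetical; the paper's actual generation method is far more sophisticated.

```python
def counterfactual(tokens, classify, substitutions):
    """Greedy sketch of minimal counterfactual editing: return the first
    one-word substitution that changes the predicted label (toy example)."""
    original = classify(tokens)
    for i, tok in enumerate(tokens):
        for sub in substitutions.get(tok, []):
            edited = tokens[:i] + [sub] + tokens[i + 1:]
            if classify(edited) != original:
                return edited
    return None  # no single-word edit flips the label

# Toy lexicon-based sentiment classifier (stand-in for a real model).
POS = {"great", "good"}
NEG = {"terrible", "bad"}

def classify(tokens):
    return "pos" if sum(t in POS for t in tokens) >= sum(t in NEG for t in tokens) else "neg"

subs = {"great": ["terrible"], "bad": ["good"]}
result = counterfactual(["the", "movie", "was", "great"], classify, subs)
# → ['the', 'movie', 'was', 'terrible']
```

The single-word edit is "minimal yet sufficient": the rest of the sentence is untouched, but the predicted class changes.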
no code implementations • 19 Jun 2024 • Xiaoxiong Zhang, Zhiwei Zeng, Xin Zhou, Dusit Niyato, Zhiqi Shen
Federated Knowledge Graph Embedding learning (FKGE) faces communication-efficiency challenges stemming from its considerable parameter sizes and extensive communication rounds.
no code implementations • 17 Jun 2024 • Xiaoxiong Zhang, Zhiwei Zeng, Xin Zhou, Dusit Niyato, Zhiqi Shen
To address this, we propose Personalized Federated knowledge graph Embedding with client-wise relation Graph (PFedEG), a novel approach that employs a client-wise relation graph to learn personalized embeddings by discerning the semantic relevance of embeddings from other clients.
1 code implementation • 9 Jun 2024 • Xiaoqi Qiu, Yongjie Wang, Xu Guo, Zhiwei Zeng, Yue Yu, Yuhong Feng, Chunyan Miao
Counterfactually Augmented Data (CAD) involves creating new data samples by applying minimal yet sufficient modifications to flip the label of existing data samples to other classes.
no code implementations • 16 Feb 2024 • Lingzi Zhang, Xin Zhou, Zhiwei Zeng, Zhiqi Shen
Recent sequential recommendation models have combined pre-trained text embeddings of items with item ID embeddings to achieve superior recommendation performance.
no code implementations • 18 Jan 2024 • He Zhao, Zhiwei Zeng, Yongwei Wang, Deheng Ye, Chunyan Miao
Heterogeneous Graph Neural Networks (HGNNs) are increasingly recognized for their performance in areas like the web and e-commerce, where resilience against adversarial attacks is crucial.
no code implementations • 23 Oct 2023 • Yige Xu, Zhiwei Zeng, Zhiqi Shen
Emotion Recognition in Conversation (ERC) has been widely studied due to its importance in developing emotion-aware empathetic machines.
no code implementations • 21 Mar 2023 • Lingzi Zhang, Xin Zhou, Zhiwei Zeng, Zhiqi Shen
We propose a novel Multimodal Pre-training for Sequential Recommendation (MP4SR) framework, which utilizes contrastive losses to capture the correlations among a user's different modality sequences, as well as between the modality sequences of users and items.
2 code implementations • 9 Feb 2023 • HongYu Zhou, Xin Zhou, Zhiwei Zeng, Lingzi Zhang, Zhiqi Shen
Recommendation systems have become popular and effective tools that help users discover items of interest by modeling user preferences and item properties based on implicit interactions (e.g., purchasing and clicking).
no code implementations • 2 Feb 2023 • Tong Zhang, Yong liu, Boyang Li, Zhiwei Zeng, Pengwei Wang, Yuan You, Chunyan Miao, Lizhen Cui
HAHT maintains a long-term memory of history conversations and utilizes this history information to understand the current conversation context and generate well-informed, context-relevant responses.
2 code implementations • 13 Jul 2022 • Xin Zhou, HongYu Zhou, Yong liu, Zhiwei Zeng, Chunyan Miao, Pengwei Wang, Yuan You, Feijun Jiang
Besides the user-item interaction graph, existing state-of-the-art methods usually use auxiliary graphs (e.g., user-user or item-item relation graphs) to augment the learned representations of users and/or items.
no code implementations • 23 Jan 2016 • Zhiwei Zeng
With a higher level of abstraction, the reusability of the quantitative model is also improved.