Search Results for author: Weimin Zhang

Found 6 papers, 2 papers with code

Model Editing for LLMs4Code: How Far are We?

1 code implementation • 11 Nov 2024 • Xiaopeng Li, Shangwen Wang, Shasha Li, Jun Ma, Jie Yu, Xiaodong Liu, Jing Wang, Bin Ji, Weimin Zhang

Despite this, a comprehensive study that thoroughly compares and analyzes how well state-of-the-art model editing techniques adapt the knowledge within LLMs4Code across various code-related tasks is notably absent.

Code Generation +6

SWEA: Updating Factual Knowledge in Large Language Models via Subject Word Embedding Altering

1 code implementation • 31 Jan 2024 • Xiaopeng Li, Shasha Li, Shezheng Song, Huijun Liu, Bin Ji, Xi Wang, Jun Ma, Jie Yu, Xiaodong Liu, Jing Wang, Weimin Zhang

In particular, local editing methods, which directly update model parameters, are more suitable for updating a small amount of knowledge.
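Taking the title's "subject word embedding altering" literally, the idea can be sketched as an edit applied at the embedding layer while every other parameter stays frozen. The toy model, token ids, and optimization loop below are illustrative assumptions, not the paper's implementation:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy frozen "language model": embedding table -> linear head over a small vocab.
vocab_size, dim = 100, 32
embed = nn.Embedding(vocab_size, dim)
head = nn.Linear(dim, vocab_size)
for p in list(embed.parameters()) + list(head.parameters()):
    p.requires_grad_(False)  # the base model stays frozen throughout

subject_id, new_target_id = 7, 42  # hypothetical token ids for the edit

# Learn an additive offset for the subject's embedding only; all other rows
# (and, in a real model, all transformer weights) are untouched.
delta = torch.zeros(dim, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-1)

for _ in range(200):
    h = embed(torch.tensor([subject_id])) + delta  # altered subject embedding
    loss = nn.functional.cross_entropy(head(h), torch.tensor([new_target_id]))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Fuse the learned offset into the subject's embedding row for inference.
with torch.no_grad():
    embed.weight[subject_id] += delta
    print(head(embed(torch.tensor([subject_id]))).argmax().item())  # should print 42
```

Because only one embedding row changes, an edit like this is cheap to apply and cheap to revert, which is consistent with the snippet's point about updating a small amount of knowledge.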

Model Editing Word Embeddings

How to Bridge the Gap between Modalities: Survey on Multimodal Large Language Model

no code implementations • 10 Nov 2023 • Shezheng Song, Xiaopeng Li, Shasha Li, Shan Zhao, Jie Yu, Jun Ma, Xiaoguang Mao, Weimin Zhang

We explore Multimodal Large Language Models (MLLMs), which integrate LLMs like GPT-4 to handle multimodal data, including text, images, audio, and more.
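One common bridging pattern that such surveys cover is projecting features from a frozen vision encoder into the LLM's token-embedding space and prepending them as soft tokens. A minimal sketch of that pattern follows; the dimensions and module names are illustrative, not taken from any specific model:

```python
import torch
import torch.nn as nn

vision_dim, llm_dim = 768, 4096  # illustrative encoder / LLM widths

class VisionToLLMBridge(nn.Module):
    """Map vision features into the LLM embedding space via a learnable projection."""

    def __init__(self):
        super().__init__()
        # A simple linear adapter; other designs use cross-attention resamplers.
        self.proj = nn.Linear(vision_dim, llm_dim)

    def forward(self, image_feats, text_embeds):
        # image_feats: (batch, num_patches, vision_dim)
        # text_embeds: (batch, num_text_tokens, llm_dim)
        visual_tokens = self.proj(image_feats)
        # The concatenated sequence is consumed by the (typically frozen) LLM.
        return torch.cat([visual_tokens, text_embeds], dim=1)

bridge = VisionToLLMBridge()
fused = bridge(torch.randn(2, 16, vision_dim), torch.randn(2, 8, llm_dim))
print(fused.shape)  # torch.Size([2, 24, 4096])
```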

Image Captioning Language Modeling +3

Correlative Preference Transfer with Hierarchical Hypergraph Network for Multi-Domain Recommendation

no code implementations • 21 Nov 2022 • Zixuan Xu, Penghui Wei, Shaoguo Liu, Weimin Zhang, Liang Wang, Bo Zheng

Conventional graph-neural-network-based methods usually deal with each domain separately or train a single shared model to serve all domains.
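In skeleton form, those two conventional setups look roughly as follows; a plain MLP stands in for the graph encoder, and all names and shapes are illustrative:

```python
import torch
import torch.nn as nn

num_domains, in_dim, hidden = 3, 64, 32

# (a) Deal with each domain separately: one independent model per domain,
# so no preference signal is transferred across domains.
separate = nn.ModuleList(
    [nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))
     for _ in range(num_domains)]
)

# (b) One shared model serving all domains: parameters are fully tied,
# so domain-specific preferences are not modeled explicitly.
shared = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

x, d = torch.randn(5, in_dim), 1  # a batch of interactions from domain 1
print(separate[d](x).shape, shared(x).shape)  # both produce (5, 1) scores
```

The paper's hierarchical hypergraph network sits between these extremes, transferring correlated preferences across domains rather than fully isolating or fully tying them.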

Graph Neural Network Marketing

UKD: Debiasing Conversion Rate Estimation via Uncertainty-regularized Knowledge Distillation

no code implementations • 20 Jan 2022 • Zixuan Xu, Penghui Wei, Weimin Zhang, Shaoguo Liu, Liang Wang, Bo Zheng

A student model is then trained on both clicked and unclicked ads via knowledge distillation, with uncertainty modeling to alleviate the inherent noise in the pseudo-labels.
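One generic way to realize that uncertainty weighting is to down-weight the distillation loss where stochastic teacher passes disagree. The sketch below uses teacher variance as the uncertainty proxy and stand-in tensors throughout; it is a plausible scheme, not necessarily the paper's exact regularizer:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Stand-in tensors: outputs are conversion probabilities for 8 unclicked ads.
student_logits = torch.randn(8)   # student predictions on unclicked ads
teacher_probs = torch.rand(5, 8)  # 5 stochastic teacher passes (e.g., MC dropout)

pseudo_label = teacher_probs.mean(dim=0)  # soft pseudo-label per ad
uncertainty = teacher_probs.var(dim=0)    # disagreement across passes

# Down-weight the distillation term where the teacher is uncertain, so
# noisy pseudo-labels contribute less to the student's gradient.
weight = 1.0 / (1.0 + uncertainty)
per_ad = F.binary_cross_entropy_with_logits(
    student_logits, pseudo_label, reduction="none")
distill_loss = (weight * per_ad).mean()
print(distill_loss.item())
```

In a full training loop this term would be combined with the supervised loss on clicked ads, which carry observed conversion labels.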

Knowledge Distillation Selection Bias
