Search Results for author: Daoyuan Chen

Found 21 papers, 11 papers with code

Dynamic Demonstration Retrieval and Cognitive Understanding for Emotional Support Conversation

no code implementations • 3 Apr 2024 • Zhe Xu, Daoyuan Chen, Jiayi Kuang, Zihao Yi, Yaliang Li, Ying Shen

Emotional Support Conversation (ESC) systems are pivotal in providing empathetic interactions, aiding users through negative emotional states by understanding and addressing their unique experiences.

Empathetic Response Generation • In-Context Learning • +2
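The "dynamic demonstration retrieval" in the title follows a pattern worth making concrete: fetch the training examples most similar to the user's situation and prepend them as in-context demonstrations. Below is a minimal sketch of that pattern, not the paper's implementation; the `embed` function is a toy stand-in for whatever sentence encoder the system actually uses.

```python
# Minimal sketch of retrieval-augmented in-context prompting (illustrative only;
# not the paper's implementation). `embed` is a stand-in for any sentence encoder.
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    """Toy hashing-based bag-of-words embedding; swap in a real encoder."""
    v = np.zeros(dim)
    for tok in text.lower().split():
        v[hash(tok) % dim] += 1.0
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def retrieve_demonstrations(query: str, pool: list[tuple[str, str]], k: int = 2):
    """Return the k (situation, supportive_response) pairs most similar to the query."""
    q = embed(query)
    return sorted(pool, key=lambda ex: -float(embed(ex[0]) @ q))[:k]

pool = [
    ("I failed my exam and feel worthless.", "That sounds hard; one exam does not define you."),
    ("My dog passed away last week.", "I'm so sorry; grieving a pet is real grief."),
    ("I'm anxious about a job interview.", "Feeling nervous is natural; preparation helps."),
]

query = "I'm stressed about failing my driving test."
demos = retrieve_demonstrations(query, pool)
prompt = "\n\n".join(f"User: {s}\nSupporter: {r}" for s, r in demos)
prompt += f"\n\nUser: {query}\nSupporter:"
print(prompt)  # prepend retrieved demonstrations before querying an LLM
```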

AgentScope: A Flexible yet Robust Multi-Agent Platform

1 code implementation • 21 Feb 2024 • Dawei Gao, Zitao Li, Weirui Kuang, Xuchen Pan, Daoyuan Chen, Zhijian Ma, Bingchen Qian, Liuyi Yao, Lin Zhu, Chen Cheng, Hongzhu Shi, Yaliang Li, Bolin Ding, Jingren Zhou

With the rapid advancement of Large Language Models (LLMs), significant progress has been made in multi-agent applications.

On the Convergence of Zeroth-Order Federated Tuning for Large Language Models

no code implementations • 8 Feb 2024 • Zhenqing Ling, Daoyuan Chen, Liuyi Yao, Yaliang Li, Ying Shen

The confluence of Federated Learning (FL) and Large Language Models (LLMs) is ushering in a new era in privacy-preserving natural language processing.

Federated Learning • Privacy Preserving
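Zeroth-order tuning matters here because clients may lack the memory to backpropagate through an LLM: a two-point perturbation turns two forward passes into a gradient estimate. A hedged sketch of one such federated round follows, with toy quadratic losses standing in for LLM forward passes; the paper analyzes the convergence of this family of methods, and its exact algorithm may differ.

```python
# Sketch of two-point zeroth-order gradient estimation inside a federated loop.
# Toy quadratic client losses stand in for LLM forward passes. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
dim, mu, lr = 10, 1e-3, 0.02

# Each client's "private data" induces a different loss landscape.
client_targets = [rng.normal(size=dim) for _ in range(4)]

def client_loss(theta, target):
    return float(np.sum((theta - target) ** 2))  # proxy for a forward pass

def zo_gradient(theta, target):
    """Two-point estimate: only loss values are needed, no backpropagation."""
    z = rng.normal(size=dim)
    g = (client_loss(theta + mu * z, target)
         - client_loss(theta - mu * z, target)) / (2 * mu)
    return g * z

theta = np.zeros(dim)
for _ in range(500):
    # Clients estimate gradients locally; the server averages them (FedAvg-style).
    grads = [zo_gradient(theta, t) for t in client_targets]
    theta -= lr * np.mean(grads, axis=0)

print("final average loss:", np.mean([client_loss(theta, t) for t in client_targets]))
```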

Enhancing Multimodal Large Language Models with Vision Detection Models: An Empirical Study

no code implementations • 31 Jan 2024 • Qirui Jiao, Daoyuan Chen, Yilun Huang, Yaliang Li, Ying Shen

Despite the impressive capabilities of Multimodal Large Language Models (MLLMs) in integrating text and image modalities, challenges remain in accurately interpreting detailed visual elements.

Hallucination • object-detection • +3
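A common recipe in this line of work, and roughly what the title suggests, is to serialize a detector's boxes and labels into text the language model can condition on. The sketch below assumes a hypothetical `detections` list that any object detector could produce; it is illustrative, not the paper's method.

```python
# Illustrative sketch of augmenting an MLLM prompt with object-detection output.
# The `detections` list stands in for any detector's output (e.g. DETR, YOLO).
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    score: float
    box: tuple[float, float, float, float]  # (x1, y1, x2, y2), normalized

def detections_to_text(dets: list[Detection], min_score: float = 0.5) -> str:
    """Serialize confident detections so the language model can 'read' them."""
    lines = [
        f"- {d.label} (confidence {d.score:.2f}) at box {tuple(round(c, 2) for c in d.box)}"
        for d in dets if d.score >= min_score
    ]
    return "Detected objects:\n" + "\n".join(lines) if lines else "Detected objects: none"

detections = [
    Detection("traffic light", 0.91, (0.42, 0.05, 0.48, 0.18)),
    Detection("bicycle", 0.78, (0.10, 0.55, 0.35, 0.95)),
    Detection("dog", 0.31, (0.70, 0.60, 0.85, 0.90)),  # filtered out: low confidence
]

prompt = (detections_to_text(detections)
          + "\n\nQuestion: What color is the traffic light and where is it?\nAnswer:")
print(prompt)  # passed alongside the image to the multimodal model
```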

FederatedScope-LLM: A Comprehensive Package for Fine-tuning Large Language Models in Federated Learning

1 code implementation • 1 Sep 2023 • Weirui Kuang, Bingchen Qian, Zitao Li, Daoyuan Chen, Dawei Gao, Xuchen Pan, Yuexiang Xie, Yaliang Li, Bolin Ding, Jingren Zhou

When several entities have similar tasks of interest but cannot share their data because of privacy concerns and regulations, federated learning (FL) is a mainstream solution for leveraging the data of different entities.

Benchmarking • Federated Learning • +1
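Federated fine-tuning of LLMs typically exchanges small parameter-efficient adapters rather than full weights, one of the regimes such packages support. Here is a rough numpy sketch of that idea with LoRA-style low-rank factors; it is not FederatedScope-LLM's API, and all names are illustrative.

```python
# Sketch of parameter-efficient federated fine-tuning: clients adapt a frozen
# shared weight with low-rank (LoRA-style) factors, and only those small factors
# are exchanged and averaged. Illustrative, not FederatedScope-LLM's actual API.
import numpy as np

rng = np.random.default_rng(1)
d, r, n_clients = 16, 2, 3

W_frozen = rng.normal(size=(d, d))          # pretrained weight, never transmitted

def init_adapter():
    return {"A": rng.normal(scale=0.01, size=(d, r)), "B": np.zeros((r, d))}

def local_step(adapter, x, y, lr=0.05):
    """One gradient step on ||(W + A@B) x - y||^2 w.r.t. the adapter only."""
    A, B = adapter["A"], adapter["B"]
    err = (W_frozen + A @ B) @ x - y        # forward pass with the adapted weight
    adapter["A"] -= lr * np.outer(err, B @ x)
    adapter["B"] -= lr * np.outer(A.T @ err, x)
    return adapter

# Each round: clients train adapters on private data; the server averages them.
global_adapter = init_adapter()
for _ in range(50):
    local_adapters = []
    for _ in range(n_clients):
        x, y = rng.normal(size=d), rng.normal(size=d)   # stand-in private batch
        a = {k: v.copy() for k, v in global_adapter.items()}
        local_adapters.append(local_step(a, x, y))
    global_adapter = {k: np.mean([a[k] for a in local_adapters], axis=0)
                      for k in ("A", "B")}

print("adapter norm:", np.linalg.norm(global_adapter["A"] @ global_adapter["B"]))
```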

Efficient Personalized Federated Learning via Sparse Model-Adaptation

2 code implementations • 4 May 2023 • Daoyuan Chen, Liuyi Yao, Dawei Gao, Bolin Ding, Yaliang Li

To overcome these challenges, we propose a novel approach named pFedGate for efficient personalized FL by adaptively and efficiently learning sparse local models.

Personalized Federated Learning
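The abstract's "adaptively and efficiently learning sparse local models" can be pictured as a learned gate that keeps only a fraction of the global model's blocks per client. The sketch below shows hard top-k gating on toy weight blocks; it is a simplification with illustrative names, not pFedGate's actual code.

```python
# Rough sketch of sparse model-adaptation: each client holds gate scores and
# keeps only the top fraction of global-model blocks, training a small masked
# model locally. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n_blocks, block_dim, keep_frac = 8, 4, 0.5   # keep 50% of blocks per client

global_blocks = rng.normal(size=(n_blocks, block_dim))

def sparse_mask(gate_scores: np.ndarray, keep: float) -> np.ndarray:
    """Binary mask keeping the highest-scoring blocks (hard top-k gating)."""
    k = max(1, int(keep * len(gate_scores)))
    mask = np.zeros_like(gate_scores)
    mask[np.argsort(gate_scores)[-k:]] = 1.0
    return mask

# Each client learns its own gate scores from local data; here we only show how
# a gate turns the shared model into a personalized sparse one.
client_gate = rng.normal(size=n_blocks)
mask = sparse_mask(client_gate, keep_frac)
personalized = global_blocks * mask[:, None]   # zeroed blocks are skipped locally

print("kept blocks:", np.nonzero(mask)[0], "-> compute scales with", mask.mean())
```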

FS-Real: Towards Real-World Cross-Device Federated Learning

no code implementations • 23 Mar 2023 • Daoyuan Chen, Dawei Gao, Yuexiang Xie, Xuchen Pan, Zitao Li, Yaliang Li, Bolin Ding, Jingren Zhou

Federated Learning (FL) aims to train high-quality models in collaboration with distributed clients without uploading their local data, and has attracted increasing attention in both academia and industry.

Federated Learning
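For context, the collaboration-without-data-sharing described above is the classic FedAvg loop: clients train locally and the server averages their updates weighted by sample count. A self-contained toy sketch with linear models and synthetic data:

```python
# Canonical FedAvg sketch: clients send model updates, never data, and the
# server aggregates weighted by local sample counts. Toy least-squares models.
import numpy as np

rng = np.random.default_rng(3)
dim = 5

def local_train(w, X, y, lr=0.1, epochs=5):
    """A few local gradient steps of least-squares on the client's private data."""
    w = w.copy()
    for _ in range(epochs):
        w -= lr * (X.T @ (X @ w - y)) / len(y)
    return w

# Heterogeneous clients: different sizes and slightly different ground truths.
clients = []
for _ in range(5):
    n = int(rng.integers(20, 100))
    X = rng.normal(size=(n, dim))
    y = X @ (np.ones(dim) + 0.1 * rng.normal(size=dim)) + 0.05 * rng.normal(size=n)
    clients.append((X, y))

w_global = np.zeros(dim)
for _ in range(20):
    updates = [local_train(w_global, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    w_global = np.average(updates, axis=0, weights=sizes)   # FedAvg aggregation

print("learned weights (should be near 1):", np.round(w_global, 2))
```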

Revisiting Personalized Federated Learning: Robustness Against Backdoor Attacks

1 code implementation • 3 Feb 2023 • Zeyu Qin, Liuyi Yao, Daoyuan Chen, Yaliang Li, Bolin Ding, Minhao Cheng

We conduct the first study of backdoor attacks in the pFL framework, testing 4 widely used backdoor attacks against 6 pFL methods on the benchmark datasets FEMNIST and CIFAR-10, for a total of 600 experiments.

Backdoor Attack • Personalized Federated Learning
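Backdoor attacks in this setting generally follow the BadNets recipe: stamp a small trigger patch onto a fraction of training images and flip their labels to an attacker-chosen class. A minimal sketch of that poisoning step, illustrative of the threat model rather than the paper's exact setup:

```python
# BadNets-style data poisoning: a small pixel trigger plus a label flip on a
# fraction of the training set. Illustrative of the threat model only.
import numpy as np

rng = np.random.default_rng(4)

def poison(images, labels, target_label=0, rate=0.1, patch=3):
    """Return copies where `rate` of the samples carry a white corner patch
    and the attacker's target label."""
    images, labels = images.copy(), labels.copy()
    idx = rng.choice(len(images), size=int(rate * len(images)), replace=False)
    images[idx, -patch:, -patch:] = 1.0   # trigger: bottom-right white square
    labels[idx] = target_label            # label flip to the target class
    return images, labels

# Toy CIFAR-like batch: 32x32 grayscale images in [0, 1) with 10 classes.
X = rng.random((100, 32, 32))
y = rng.integers(0, 10, size=100)
Xp, yp = poison(X, y)
print("poisoned samples:", int((Xp[:, -1, -1] == 1.0).sum()))
```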

pFL-Bench: A Comprehensive Benchmark for Personalized Federated Learning

1 code implementation • 8 Jun 2022 • Daoyuan Chen, Dawei Gao, Weirui Kuang, Yaliang Li, Bolin Ding

Personalized Federated Learning (pFL), which utilizes and deploys distinct local models, has gained increasing attention in recent years due to its success in handling the statistical heterogeneity of FL clients.

Fairness • Personalized Federated Learning
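One simple pFL pattern such benchmarks cover is splitting the model into a shared backbone, aggregated across clients, and a private head that never leaves the client. A toy sketch assuming that FedPer-style split; pFL-Bench itself compares many more methods:

```python
# Sketch of "distinct local models" via a shared backbone + private heads.
# Only backbone updates would be averaged at the server; heads stay local,
# so each client's predictions remain personalized. Illustrative only.
import numpy as np

rng = np.random.default_rng(5)
d_in, d_hidden, n_clients = 8, 4, 3

shared_backbone = rng.normal(size=(d_hidden, d_in))                  # aggregated
client_heads = [rng.normal(size=d_hidden) for _ in range(n_clients)]  # never shared

def forward(x, head):
    return head @ np.tanh(shared_backbone @ x)

x = rng.normal(size=d_in)
for cid, head in enumerate(client_heads):
    print(f"client {cid} prediction: {forward(x, head):+.3f}")
```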

A Benchmark for Federated Hetero-Task Learning

1 code implementation • 7 Jun 2022 • Liuyi Yao, Dawei Gao, Zhen Wang, Yuexiang Xie, Weirui Kuang, Daoyuan Chen, Haohui Wang, Chenhe Dong, Bolin Ding, Yaliang Li

To investigate the heterogeneity of federated learning in real-world scenarios, we generalize classic federated learning to federated hetero-task learning, which emphasizes inconsistency across participants in both data distribution and learning tasks.

Federated Learning • Meta-Learning • +2

FederatedScope: A Flexible Federated Learning Platform for Heterogeneity

1 code implementation • 11 Apr 2022 • Yuexiang Xie, Zhen Wang, Dawei Gao, Daoyuan Chen, Liuyi Yao, Weirui Kuang, Yaliang Li, Bolin Ding, Jingren Zhou

Although existing federated learning (FL) platforms have made remarkable progress in providing development infrastructure, they may not adequately handle the challenges posed by various types of heterogeneity, including heterogeneity in participants' local data, resources, behaviors, and learning goals.

Federated Learning • Hyperparameter Optimization

Learned Index with Dynamic $\epsilon$

no code implementations • 29 Sep 2021 • Daoyuan Chen, Wuchao Li, Yaliang Li, Bolin Ding, Kai Zeng, Defu Lian, Jingren Zhou

We theoretically analyze prediction error bounds that link $\epsilon$ with data characteristics for an illustrative learned index method.

Retrieval
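The role of $\epsilon$ is easiest to see in a minimal learned index: a model predicts a key's position, and correctness is restored by searching only the window $[\text{pred} - \epsilon, \text{pred} + \epsilon]$, where $\epsilon$ is the model's worst-case prediction error on the data. A hedged sketch with a single linear model; real learned indexes use more elaborate model hierarchies:

```python
# Minimal learned-index sketch showing the role of the error bound epsilon:
# predict a key's position, then binary-search only the +/- eps window.
import bisect
import numpy as np

keys = np.sort(np.random.default_rng(6).uniform(0, 1e6, size=10_000))
pos = np.arange(len(keys))

# Fit key -> position with least squares, then measure the worst-case error.
slope, intercept = np.polyfit(keys, pos, deg=1)
pred = slope * keys + intercept
eps = int(np.ceil(np.max(np.abs(pred - pos))))   # data-dependent error bound

def lookup(key: float) -> int:
    """Predict, then correct within the +/- eps window ('last-mile' search)."""
    p = int(slope * key + intercept)
    lo, hi = max(0, p - eps), min(len(keys), p + eps + 1)
    return lo + bisect.bisect_left(keys[lo:hi].tolist(), key)  # first index >= key

q = float(keys[1234])
assert lookup(q) == 1234
print(f"eps = {eps}: search window is {2 * eps + 1} slots instead of {len(keys)}")
```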

A Pluggable Learned Index Method via Sampling and Gap Insertion

no code implementations • 4 Jan 2021 • Yaliang Li, Daoyuan Chen, Bolin Ding, Kai Zeng, Jingren Zhou

In this paper, we propose a formal machine learning based framework to quantify the index learning objective, and study two general and pluggable techniques to enhance the learning efficiency and learning effectiveness for learned indexes.

BIG-bench Machine Learning • Retrieval
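The sampling technique can be illustrated in a few lines: fit the index on a small subset of keys to cut training cost, then compute the error bound over the full key set so lookups remain correct. A sketch under those assumptions; the paper's actual procedure, which also covers gap insertion, may differ:

```python
# Sampling for cheaper learned-index training: fit on ~1% of the keys, but
# validate the error bound against all keys so correctness is preserved.
import numpy as np

rng = np.random.default_rng(7)
keys = np.sort(rng.uniform(0, 1e6, size=100_000))
pos = np.arange(len(keys))

sample = rng.choice(len(keys), size=1_000, replace=False)     # train on a sample
slope, intercept = np.polyfit(keys[sample], pos[sample], deg=1)

# The bound must still be computed over *all* keys so lookups stay exact.
eps_full = np.max(np.abs(slope * keys + intercept - pos))
print(f"trained on {len(sample)} keys; worst-case error over all keys: {eps_full:.1f}")
```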

Relabel the Noise: Joint Extraction of Entities and Relations via Cooperative Multiagents

no code implementations • ACL 2020 • Daoyuan Chen, Yaliang Li, Kai Lei, Ying Shen

Distant-supervision-based methods for entity and relation extraction have gained increasing popularity because they require only light human annotation effort.

Relation • Relation Extraction

AdaBERT: Task-Adaptive BERT Compression with Differentiable Neural Architecture Search

1 code implementation • 13 Jan 2020 • Daoyuan Chen, Yaliang Li, Minghui Qiu, Zhen Wang, Bofang Li, Bolin Ding, Hongbo Deng, Jun Huang, Wei Lin, Jingren Zhou

Motivated by the necessity and benefits of task-oriented BERT compression, we propose a novel compression method, AdaBERT, that leverages differentiable Neural Architecture Search to automatically compress BERT into task-adaptive small models for specific tasks.

Knowledge Distillation • Neural Architecture Search
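The differentiable-NAS ingredient is a "mixed operation": a weighted sum of candidate operations whose softmax weights are learned alongside the network and discretized at the end. A tiny numpy sketch of that relaxation, with stand-in ops far simpler than AdaBERT's actual search space:

```python
# DARTS-style mixed op: softly combine candidate operations with learnable
# architecture weights, later discretized to pick one op per layer. Sketch only.
import numpy as np

rng = np.random.default_rng(8)

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

# Candidate ops a compressed layer could use (stand-ins for conv/attention/skip).
ops = [
    lambda x: x,                                            # identity / skip
    lambda x: np.tanh(x),                                   # cheap nonlinearity
    lambda x: np.convolve(x, np.ones(3) / 3, mode="same"),  # local mixing
]
alpha = rng.normal(size=len(ops))   # architecture parameters (learned by SGD)

def mixed_op(x):
    """Differentiable relaxation: weighted sum of all candidate ops."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops))

x = rng.normal(size=16)
print("relaxed output:", np.round(mixed_op(x)[:4], 3))
print("discretized choice: op", int(np.argmax(alpha)))  # the final searched layer
```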

Cooperative Denoising for Distantly Supervised Relation Extraction

no code implementations • COLING 2018 • Kai Lei, Daoyuan Chen, Yaliang Li, Nan Du, Min Yang, Wei Fan, Ying Shen

Distantly supervised relation extraction greatly reduces the human effort required to extract relational facts from unstructured text.

Denoising • Information Retrieval • +4
