Search Results for author: Feiyang Ye

Found 10 papers, 2 papers with code

Task-Aware Low-Rank Adaptation of Segment Anything Model

no code implementations · 16 Mar 2024 · Xuehao Wang, Feiyang Ye, Yu Zhang

Furthermore, we introduce a modified SAM (mSAM) for multi-task learning, where we remove SAM's prompt encoder and use a task-specific no-mask embedding and mask decoder for each task.

Image Segmentation · Multi-Task Learning · +2

A First-Order Multi-Gradient Algorithm for Multi-Objective Bi-Level Optimization

no code implementations · 17 Jan 2024 · Feiyang Ye, Baijiong Lin, Xiaofeng Cao, Yu Zhang, Ivor Tsang

In this paper, we study the Multi-Objective Bi-Level Optimization (MOBLO) problem, where the upper-level subproblem is a multi-objective optimization problem and the lower-level subproblem is a scalar optimization problem.

Multi-Task Learning
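The snippet above describes MOBLO only at a high level. As an illustration of the kind of first-order multi-gradient step such methods use, here is a minimal MGDA-style closed-form combination of two upper-level objective gradients; this standard min-norm weighting is an assumption for illustration, not necessarily the algorithm proposed in the paper.

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def combine_two_gradients(g1, g2):
    """Min-norm combination of two objective gradients:
    d = a*g1 + (1-a)*g2 with a in [0, 1] chosen in closed form,
    so that d is a common descent direction when one exists."""
    diff = [x - y for x, y in zip(g1, g2)]
    denom = dot(diff, diff)
    if denom == 0.0:  # gradients coincide; any weighting works
        return list(g1)
    a = dot([y - x for x, y in zip(g1, g2)], g2) / denom
    a = min(1.0, max(0.0, a))  # clip onto the simplex
    return [a * x + (1.0 - a) * y for x, y in zip(g1, g2)]

# For two conflicting unit gradients, the combined direction is
# their average, which still decreases both objectives.
d = combine_two_gradients([1.0, 0.0], [0.0, 1.0])
```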

A Unified Framework for Unsupervised Domain Adaptation based on Instance Weighting

no code implementations · 8 Dec 2023 · Jinjing Zhu, Feiyang Ye, Qiao Xiao, Pengxin Guo, Yu Zhang, Qiang Yang

Specifically, the proposed LIWUDA method constructs a weight network to assign weights to each instance based on its probability of belonging to common classes, and designs Weighted Optimal Transport (WOT) for domain alignment by leveraging instance weights.

Partial Domain Adaptation · Universal Domain Adaptation · +1
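To make the idea of weighted optimal transport above concrete, here is a minimal entropy-regularized (Sinkhorn) transport sketch in which the instance weights set the marginals, so low-weight instances (those unlikely to belong to common classes) contribute little mass. The function name, parameters, and the Sinkhorn choice are illustrative assumptions, not the paper's actual LIWUDA implementation.

```python
import math

def sinkhorn(cost, w_src, w_tgt, eps=0.1, iters=200):
    """Entropy-regularized OT whose marginals are the normalized
    instance weights; returns the transport plan P."""
    a = [w / sum(w_src) for w in w_src]  # source marginal from weights
    b = [w / sum(w_tgt) for w in w_tgt]  # target marginal from weights
    K = [[math.exp(-c / eps) for c in row] for row in cost]
    u = [1.0] * len(a)
    v = [1.0] * len(b)
    for _ in range(iters):  # alternate marginal-matching scalings
        u = [a[i] / sum(K[i][j] * v[j] for j in range(len(b)))
             for i in range(len(a))]
        v = [b[j] / sum(K[i][j] * u[i] for i in range(len(a)))
             for j in range(len(b))]
    return [[u[i] * K[i][j] * v[j] for j in range(len(b))]
            for i in range(len(a))]

# A down-weighted source instance (weight 0.1) carries little mass.
cost = [[0.0, 1.0], [1.0, 0.0]]
P = sinkhorn(cost, w_src=[0.9, 0.1], w_tgt=[0.5, 0.5])
```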

FedLPA: Personalized One-shot Federated Learning with Layer-Wise Posterior Aggregation

no code implementations · 30 Sep 2023 · Xiang Liu, Liangxi Liu, Feiyang Ye, Yunheng Shen, Xia Li, Linshan Jiang, Jialin Li

Efficiently aggregating trained neural networks from local clients into a global model on a server is a widely researched topic in federated learning.

Federated Learning
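For context on the aggregation problem the snippet above refers to, here is a minimal sketch of the plain data-size-weighted per-layer averaging baseline (FedAvg-style); FedLPA's layer-wise Laplace posterior aggregation is more involved, and this sketch only illustrates the baseline it improves on. All names here are hypothetical.

```python
def aggregate(client_models, client_sizes):
    """Baseline one-shot aggregation: for every layer, average the
    clients' parameters weighted by their local data sizes."""
    total = sum(client_sizes)
    return {
        name: [
            sum(m[name][k] * n for m, n in zip(client_models, client_sizes)) / total
            for k in range(len(client_models[0][name]))
        ]
        for name in client_models[0]
    }

# Two clients with a single "fc" layer; client 2 holds 3x the data.
m1 = {"fc": [1.0, 2.0]}
m2 = {"fc": [3.0, 4.0]}
g = aggregate([m1, m2], [1, 3])
```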

Dual-Balancing for Multi-Task Learning

1 code implementation · 23 Aug 2023 · Baijiong Lin, Weisen Jiang, Feiyang Ye, Yu Zhang, Pengguang Chen, Ying-Cong Chen, Shu Liu, James T. Kwok

Multi-task learning (MTL), a learning paradigm to learn multiple related tasks simultaneously, has achieved great success in various fields.

Multi-Task Learning

A Fast Attention Network for Joint Intent Detection and Slot Filling on Edge Devices

no code implementations · 16 May 2022 · Liang Huang, Senjie Liang, Feiyang Ye, Nan Gao

In this paper, we propose a Fast Attention Network (FAN) for joint intent detection and slot filling, guaranteeing both accuracy and low latency.

Intent Detection · Natural Language Understanding · +3

Deep Safe Multi-Task Learning

no code implementations · 20 Nov 2021 · Zhixiong Yue, Feiyang Ye, Yu Zhang, Christy Liang, Ivor W. Tsang

We theoretically study the safeness of both learning strategies in the DSMTL model to show that the proposed methods can achieve some versions of safe multi-task learning.

Multi-Task Learning

Multi-Objective Meta Learning

no code implementations · NeurIPS 2021 · Feiyang Ye, Baijiong Lin, Zhixiong Yue, Pengxin Guo, Qiao Xiao, Yu Zhang

Empirically, we show the effectiveness of the proposed MOML framework in several meta learning problems, including few-shot learning, neural architecture search, domain adaptation, and multi-task learning.

Domain Adaptation · Few-Shot Learning · +2
