Search Results for author: Baijiong Lin

Found 9 papers, 3 papers with code

A First-Order Multi-Gradient Algorithm for Multi-Objective Bi-Level Optimization

no code implementations · 17 Jan 2024 · Feiyang Ye, Baijiong Lin, Xiaofeng Cao, Yu Zhang, Ivor Tsang

In this paper, we study the Multi-Objective Bi-Level Optimization (MOBLO) problem, where the upper-level subproblem is a multi-objective optimization problem and the lower-level subproblem is a scalar optimization problem.

Multi-Task Learning
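The bi-level structure described in the abstract can be written in the standard form below (the symbols here, with α for upper-level variables and ω for lower-level variables, are illustrative and not necessarily the paper's own notation):

```latex
\min_{\alpha}\ F\bigl(\alpha, \omega^{*}(\alpha)\bigr)
  = \bigl(F_{1}(\alpha, \omega^{*}(\alpha)), \dots, F_{m}(\alpha, \omega^{*}(\alpha))\bigr)
\quad \text{s.t.} \quad
\omega^{*}(\alpha) = \operatorname*{arg\,min}_{\omega}\ g(\alpha, \omega)
```

The upper level minimizes a vector of m objectives jointly, while the lower level is an ordinary scalar minimization, which is what makes first-order multi-gradient methods applicable.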

BYOM: Building Your Own Multi-Task Model For Free

no code implementations · 3 Oct 2023 · Weisen Jiang, Baijiong Lin, Han Shi, Yu Zhang, Zhenguo Li, James T. Kwok

Recently, various merging methods have been proposed to build a multi-task model from task-specific finetuned models without retraining.
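Merging methods of this kind typically build on parameter-space averaging of the finetuned models. The sketch below shows only that generic baseline with hypothetical toy parameter dicts, not the paper's actual method:

```python
def merge_models(models):
    """Average the parameters of several task-specific finetuned models.

    `models` is a list of dicts mapping parameter names to lists of floats;
    all models are assumed to share one architecture (same keys and shapes).
    """
    merged = {}
    for name in models[0]:
        params = [m[name] for m in models]
        merged[name] = [sum(vals) / len(vals) for vals in zip(*params)]
    return merged

# Two hypothetical models finetuned from the same backbone on different tasks.
model_a = {"layer1.weight": [1.0, 2.0], "layer1.bias": [0.0]}
model_b = {"layer1.weight": [3.0, 4.0], "layer1.bias": [2.0]}
merged = merge_models([model_a, model_b])
# merged["layer1.weight"] == [2.0, 3.0]; merged["layer1.bias"] == [1.0]
```

Because no gradient steps are taken, the merged multi-task model is obtained "for free" in the sense of requiring no retraining.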

Efficient Transfer Learning in Diffusion Models via Adversarial Noise

no code implementations · 23 Aug 2023 · Xiyu Wang, Baijiong Lin, Daochang Liu, Chang Xu

Diffusion Probabilistic Models (DPMs) have demonstrated substantial promise in image generation tasks but heavily rely on the availability of large amounts of training data.

Denoising · Image Generation +1

Dual-Balancing for Multi-Task Learning

1 code implementation · 23 Aug 2023 · Baijiong Lin, Weisen Jiang, Feiyang Ye, Yu Zhang, Pengguang Chen, Ying-Cong Chen, Shu Liu, James T. Kwok

Multi-task learning (MTL), a learning paradigm to learn multiple related tasks simultaneously, has achieved great success in various fields.

Multi-Task Learning
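A common ingredient in loss balancing for MTL is combining task losses on a logarithmic scale, so that tasks whose raw losses differ by orders of magnitude contribute comparable gradients. This is a generic scale-balancing sketch, not necessarily the exact Dual-Balancing procedure:

```python
import math

def balanced_total_loss(task_losses):
    """Combine positive task losses on a log scale.

    Since d/dL log(L) = 1/L, each task's gradient is divided by its own
    loss magnitude, so large-loss tasks no longer dominate the update.
    Generic sketch; not the paper's exact update rule.
    """
    return sum(math.log(loss) for loss in task_losses)

# Two hypothetical tasks whose raw losses differ by three orders of magnitude.
total = balanced_total_loss([0.01, 10.0])
```

In a full method this loss-scale balancing is usually paired with a gradient-side balancing step; only the loss side is sketched here.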

LibMTL: A Python Library for Multi-Task Learning

1 code implementation · 27 Mar 2022 · Baijiong Lin, Yu Zhang

This paper presents LibMTL, an open-source Python library built on PyTorch, which provides a unified, comprehensive, reproducible, and extensible implementation framework for Multi-Task Learning (MTL).

Multi-Task Learning

Multi-Objective Meta Learning

no code implementations · NeurIPS 2021 · Feiyang Ye, Baijiong Lin, Zhixiong Yue, Pengxin Guo, Qiao Xiao, Yu Zhang

Empirically, we show the effectiveness of the proposed MOML framework in several meta learning problems, including few-shot learning, neural architecture search, domain adaptation, and multi-task learning.

Domain Adaptation · Few-Shot Learning +2

Multi-Task Adversarial Attack

no code implementations · 19 Nov 2020 · Pengxin Guo, Yuancheng Xu, Baijiong Lin, Yu Zhang

More specifically, MTA uses a generator for adversarial perturbations that consists of an encoder shared across all tasks and multiple task-specific decoders.

Adversarial Attack
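The shared-encoder/per-task-decoder layout described above can be sketched with plain functions; the encoder and decoder bodies below are hypothetical placeholders, not the paper's networks:

```python
def shared_encoder(x):
    """Hypothetical feature extractor shared by every task."""
    return [v * 2.0 for v in x]

def make_decoder(offset):
    """Build a hypothetical task-specific decoder that maps the shared
    features to a perturbation for one task."""
    def decoder(features):
        return [v + offset for v in features]
    return decoder

# One generator: a single shared encoder feeding task-specific decoders.
decoders = {"task_a": make_decoder(0.1), "task_b": make_decoder(-0.1)}

def generate_perturbations(x):
    features = shared_encoder(x)  # computed once, reused by all tasks
    return {task: dec(features) for task, dec in decoders.items()}

perturbations = generate_perturbations([1.0, 2.0])
```

The design point is that the expensive encoding is computed once and shared, while each decoder specializes its output to attack one task.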

Effective, Efficient and Robust Neural Architecture Search

no code implementations · 19 Nov 2020 · Zhixiong Yue, Baijiong Lin, Xiaonan Huang, Yu Zhang

Although NAS methods can find network architectures with state-of-the-art performance, adversarial robustness and resource constraints are often ignored in NAS.

Adversarial Robustness · Neural Architecture Search
