Search Results for author: Charles Ling

Found 14 papers, 7 papers with code

MABR: A Multilayer Adversarial Bias Removal Approach Without Prior Bias Knowledge

no code implementations • 10 Aug 2024 • Maxwell J. Yin, Boyu Wang, Charles Ling

Models trained on real-world data often mirror and exacerbate existing social biases.

Attribute

Intersectional Unfairness Discovery

1 code implementation • 31 May 2024 • Gezheng Xu, Qi Chen, Charles Ling, Boyu Wang, Changjian Shui

To further evaluate the generated unseen but possibly unfair intersectional sensitive attributes, we formulate them as prompts and use modern generative AI to produce new texts and images.
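
The excerpt above describes turning intersectional sensitive attributes into prompts for a generative model. Below is a minimal, hypothetical sketch of that step, not the paper's released pipeline: the attribute axes, the prompt template, and the generate_text placeholder are all assumptions.

    # Hedged sketch: enumerate intersectional sensitive-attribute combinations,
    # turn each one into a text prompt, and hand the prompts to a generative model.
    # The attribute values, template, and the `generate_text` callable are
    # placeholders, not the paper's actual pipeline.
    from itertools import product

    sensitive_attributes = {               # illustrative attribute axes
        "gender": ["female", "male", "non-binary"],
        "race":   ["Black", "White", "Asian"],
        "age":    ["young", "elderly"],
    }

    def build_prompts(attributes):
        """Cross all attribute values to obtain intersectional combinations."""
        keys = list(attributes)
        for values in product(*(attributes[k] for k in keys)):
            profile = ", ".join(f"{k}: {v}" for k, v in zip(keys, values))
            yield f"Write a short job-applicant profile for a person ({profile})."

    def generate_text(prompt):
        """Stand-in for a call to a modern generative model (e.g. an LLM API)."""
        raise NotImplementedError("plug in your generative-AI client here")

    if __name__ == "__main__":
        for prompt in build_prompts(sensitive_attributes):
            print(prompt)                      # inspect the prompts before generation
            # text = generate_text(prompt)     # then audit `text` for unfairness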

Attribute • Fairness

Toward Open-ended Embodied Tasks Solving

no code implementations • 10 Dec 2023 • William Wei Wang, Dongqi Han, Xufang Luo, Yifei Shen, Charles Ling, Boyu Wang, Dongsheng Li

Empowering embodied agents, such as robots, with Artificial Intelligence (AI) has become increasingly important in recent years.

Secure and Fast Asynchronous Vertical Federated Learning via Cascaded Hybrid Optimization

no code implementations • 28 Jun 2023 • Ganyu Wang, Qingsong Zhang, Li Xiang, Boyu Wang, Bin Gu, Charles Ling

Meanwhile, the upstream model (server) is updated locally with first-order optimization (FOO), which significantly improves the convergence rate, making it feasible to train large models without compromising privacy and security.
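
The excerpt sketches a split architecture in which clients hold feature partitions and the upstream (server) model is updated locally with a first-order optimizer. The toy loop below shows only that generic split-model skeleton under assumed shapes and modules; it does not reproduce the paper's cascaded hybrid (zeroth-/first-order) optimization or its security mechanisms.

    # Hedged sketch of a split-model vertical-FL step: each client encodes its own
    # feature partition, the server concatenates the embeddings and updates its
    # upstream model with plain first-order optimization (SGD). The cascaded /
    # zeroth-order parts of the paper are NOT reproduced here.
    import torch
    import torch.nn as nn

    class ClientEncoder(nn.Module):
        def __init__(self, in_dim, emb_dim=16):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(in_dim, emb_dim), nn.ReLU())

        def forward(self, x):
            return self.net(x)

    class ServerHead(nn.Module):
        def __init__(self, emb_dim, n_clients, n_classes=2):
            super().__init__()
            self.net = nn.Linear(emb_dim * n_clients, n_classes)

        def forward(self, embeddings):
            return self.net(torch.cat(embeddings, dim=1))

    # Toy data: two clients each hold a disjoint slice of the features.
    x_a, x_b = torch.randn(32, 5), torch.randn(32, 7)
    y = torch.randint(0, 2, (32,))

    client_a, client_b = ClientEncoder(5), ClientEncoder(7)
    server = ServerHead(emb_dim=16, n_clients=2)
    opt = torch.optim.SGD(server.parameters(), lr=0.1)   # first-order update, server side

    for step in range(10):
        emb = [client_a(x_a).detach(), client_b(x_b).detach()]  # clients send embeddings only
        loss = nn.functional.cross_entropy(server(emb), y)
        opt.zero_grad()
        loss.backward()        # gradients stay on the server in this simplified sketch
        opt.step()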

Privacy Preserving • Vertical Federated Learning

Class Overwhelms: Mutual Conditional Blended-Target Domain Adaptation

1 code implementation • 3 Feb 2023 • Pengcheng Xu, Boyu Wang, Charles Ling

We demonstrate that domain labels are not directly necessary for BTDA if the categorical distributions of the various domains are sufficiently aligned, even in the face of domain imbalance and label distribution shift across classes.

Blended-target Domain Adaptation • Label shift of blended-target domain adaptation • +1

When Source-Free Domain Adaptation Meets Learning with Noisy Labels

no code implementations • 31 Jan 2023 • Li Yi, Gezheng Xu, Pengcheng Xu, Jiaqi Li, Ruizhi Pu, Charles Ling, A. Ian McLeod, Boyu Wang

We also prove that this difference renders existing LLN methods, which rely on their distribution assumptions, unable to address the label noise in SFDA.

Learning with noisy labels • Source-Free Domain Adaptation

Foresee What You Will Learn: Data Augmentation for Domain Generalization in Non-stationary Environment

1 code implementation • 19 Jan 2023 • Qiuhao Zeng, Wei Wang, Fan Zhou, Charles Ling, Boyu Wang

In this paper, we formulate such problems as Evolving Domain Generalization, where a model aims to generalize well on a target domain by discovering and leveraging the evolving pattern of the environment.
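
As a rough, speculative illustration of the "foresee the evolving pattern" idea (not the paper's algorithm), one can imagine fitting a small mapper from domain t to domain t+1 over temporally ordered sources and applying it to the latest domain to synthesize augmented "future" samples; everything below (toy data, mapper architecture, pairing-by-index objective) is an assumption.

    # Hedged sketch: learn a mapper g that pushes domain-t samples toward domain-(t+1)
    # samples, then apply g to the latest observed domain to fabricate augmented data
    # that "foresees" the next step of the evolving environment. Purely illustrative.
    import torch
    import torch.nn as nn

    def make_mapper(dim):
        return nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, dim))

    # Toy evolving domains: the same Gaussian drifting by +0.5 per time step.
    dim, n = 8, 256
    domains = [torch.randn(n, dim) + 0.5 * t for t in range(4)]   # D_0 ... D_3

    mapper = make_mapper(dim)
    opt = torch.optim.Adam(mapper.parameters(), lr=1e-2)

    for epoch in range(200):
        loss = 0.0
        for t in range(len(domains) - 1):
            # Crude pairing by index; a real method would match distributions instead.
            loss = loss + nn.functional.mse_loss(mapper(domains[t]), domains[t + 1])
        opt.zero_grad()
        loss.backward()
        opt.step()

    # Synthesize pseudo "future domain" samples from the latest domain for augmentation.
    with torch.no_grad():
        foreseen = mapper(domains[-1])
    print(foreseen.mean().item())   # inspect the drift of the synthesized samples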

Data Augmentation • Evolving Domain Generalization • +1

Dynamically Instance-Guided Adaptation: A Backward-Free Approach for Test-Time Domain Adaptive Semantic Segmentation

1 code implementation • CVPR 2023 • Wei Wang, Zhun Zhong, Weijie Wang, Xi Chen, Charles Ling, Boyu Wang, Nicu Sebe

In this paper, we study the application of test-time domain adaptation to semantic segmentation (TTDA-Seg), where both efficiency and effectiveness are crucial.

Domain Adaptation • Semantic Segmentation

On Learning Fairness and Accuracy on Multiple Subgroups

1 code implementation • 19 Oct 2022 • Changjian Shui, Gezheng Xu, Qi Chen, Jiaqi Li, Charles Ling, Tal Arbel, Boyu Wang, Christian Gagné

In the upper level, the fair predictor is updated to be close to all subgroup-specific predictors.
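
A loose sketch of the two-level structure mentioned in the excerpt, with an assumed linear model and a simple parameter-closeness objective: the lower level fits one predictor per subgroup, and the upper level pulls a shared fair predictor toward all of them. The paper's actual bilevel formulation is more involved.

    # Hedged sketch of the two-level idea: lower level fits subgroup-specific linear
    # predictors; upper level updates a shared predictor to stay close (in parameter
    # space) to all of them. Not the paper's exact objective.
    import torch
    import torch.nn as nn

    dim, n = 10, 200
    subgroups = []
    for k in range(3):                                   # three toy subgroups
        x = torch.randn(n, dim)
        w_true = torch.randn(dim, 1) + k                 # each subgroup has its own signal
        y = (x @ w_true > 0).float()
        subgroups.append((x, y))

    def fit_subgroup(x, y, epochs=100):
        """Lower level: train a predictor on one subgroup only."""
        model = nn.Linear(dim, 1)
        opt = torch.optim.Adam(model.parameters(), lr=1e-2)
        for _ in range(epochs):
            loss = nn.functional.binary_cross_entropy_with_logits(model(x), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
        return model

    subgroup_models = [fit_subgroup(x, y) for x, y in subgroups]

    # Upper level: move the shared "fair" predictor toward every subgroup predictor.
    fair = nn.Linear(dim, 1)
    opt = torch.optim.Adam(fair.parameters(), lr=1e-2)
    for _ in range(200):
        closeness = sum(
            (fair.weight - m.weight.detach()).pow(2).sum()
            + (fair.bias - m.bias.detach()).pow(2).sum()
            for m in subgroup_models
        )
        opt.zero_grad()
        closeness.backward()
        opt.step()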

Fairness

Evolving Domain Generalization

no code implementations • 31 May 2022 • William Wei Wang, Gezheng Xu, Ruizhi Pu, Jiaqi Li, Fan Zhou, Changjian Shui, Charles Ling, Christian Gagné, Boyu Wang

Domain generalization aims to learn a predictive model from multiple different but related source tasks that can generalize well to a target task without the need to access any target data.

Evolving Domain Generalization • Meta-Learning

Directional Domain Generalization

no code implementations • 29 Sep 2021 • Wei Wang, Jiaqi Li, Ruizhi Pu, Gezheng Xu, Fan Zhou, Changjian Shui, Charles Ling, Boyu Wang

Domain generalization aims to learn a predictive model from multiple different but related source tasks that can generalize well to a target task without the need to access any target data.

Domain Generalization • Meta-Learning • +1

Aggregating From Multiple Target-Shifted Sources

1 code implementation • 9 May 2021 • Changjian Shui, Zijian Li, Jiaqi Li, Christian Gagné, Charles Ling, Boyu Wang

Multi-source domain adaptation aims to leverage knowledge from multiple source tasks for prediction on a related target domain.

Unsupervised Domain Adaptation

Unified Principles For Multi-Source Transfer Learning Under Label Shifts

no code implementations • 1 Jan 2021 • Changjian Shui, Zijian Li, Jiaqi Li, Christian Gagné, Charles Ling, Boyu Wang

We study the label shift problem in multi-source transfer learning and derive new generic principles to control the target generalization risk.

Transfer Learning • Unsupervised Domain Adaptation

Pelee: A Real-Time Object Detection System on Mobile Devices

2 code implementations • NeurIPS 2018 • Jun Wang, Tanner Bohn, Charles Ling

In this study, we propose an efficient architecture named PeleeNet, which is built with conventional convolution instead of depthwise separable convolution.
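
For context on that design choice, here is a small, generic comparison (not PeleeNet itself) of a conventional 3x3 convolution block against a depthwise-separable block, with arbitrary layer sizes; it just prints output shapes and parameter counts.

    # Hedged illustration of the design choice mentioned above: a conventional 3x3
    # convolution block versus a depthwise-separable block. PeleeNet's actual
    # stem/dense blocks are more elaborate than this.
    import torch
    import torch.nn as nn

    def conventional_block(c_in, c_out):
        return nn.Sequential(
            nn.Conv2d(c_in, c_out, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(c_out),
            nn.ReLU(inplace=True),
        )

    def depthwise_separable_block(c_in, c_out):
        return nn.Sequential(
            nn.Conv2d(c_in, c_in, kernel_size=3, padding=1, groups=c_in, bias=False),
            nn.BatchNorm2d(c_in),
            nn.ReLU(inplace=True),
            nn.Conv2d(c_in, c_out, kernel_size=1, bias=False),
            nn.BatchNorm2d(c_out),
            nn.ReLU(inplace=True),
        )

    x = torch.randn(1, 32, 56, 56)
    for name, block in [("conventional", conventional_block(32, 64)),
                        ("depthwise-separable", depthwise_separable_block(32, 64))]:
        params = sum(p.numel() for p in block.parameters())
        print(name, block(x).shape, f"{params} params")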

object-detection • Real-Time Object Detection
