Search Results for author: Rongguang Ye

Found 8 papers, 6 papers with code

PraFFL: A Preference-Aware Scheme in Fair Federated Learning

no code implementations • 13 Apr 2024 • Rongguang Ye, Ming Tang

Fairness in federated learning has emerged as a critical concern, aiming to develop a model that is unbiased toward any particular group (e.g., male or female) defined by sensitive features.

Fairness • Federated Learning

Data-Driven Preference Sampling for Pareto Front Learning

no code implementations • 12 Apr 2024 • Rongguang Ye, Lei Chen, Weiduo Liao, Jinyuan Zhang, Hisao Ishibuchi

In this manner, the proposed method can sample preference vectors from the region of the Pareto front with high probability.
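As a generic illustration of the idea above (and not the paper's actual sampling scheme), the sketch below draws preference vectors on the probability simplex from a Dirichlet distribution; in a data-driven variant the concentration parameters would be fitted rather than fixed. The function name and parameter values are assumptions.

```python
import numpy as np

def sample_preferences(n_samples, alpha):
    """Draw preference vectors on the probability simplex.

    alpha: Dirichlet concentration parameters, one per objective.
    Larger alpha_i puts more mass toward objective i; in a data-driven
    scheme these would be learned rather than fixed by hand.
    """
    return np.random.dirichlet(alpha, size=n_samples)

# Example: bias sampling toward the first of three objectives.
prefs = sample_preferences(1000, alpha=[5.0, 1.0, 1.0])
assert np.allclose(prefs.sum(axis=1), 1.0)  # each preference vector sums to 1
```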

Evolutionary Preference Sampling for Pareto Set Learning

2 code implementations • 12 Apr 2024 • Rongguang Ye, Longcan Chen, Jinyuan Zhang, Hisao Ishibuchi

Recently, Pareto Set Learning (PSL) has been proposed for learning the entire Pareto set using a neural network.

Evolutionary Algorithms
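For readers new to PSL, here is a minimal sketch of the general recipe the abstract refers to: a network conditioned on a preference vector is trained with a scalarized objective on a toy bi-objective problem. The architecture, the weighted-sum scalarization, and the toy problem are illustrative assumptions, not the method proposed in this paper.

```python
import torch
import torch.nn as nn

class ParetoSetModel(nn.Module):
    """Maps a preference vector on the simplex to a candidate solution."""
    def __init__(self, n_obj=2, n_var=10, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_obj, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_var), nn.Sigmoid(),  # box-constrained variables in [0, 1]
        )

    def forward(self, pref):
        return self.net(pref)

def toy_objectives(x):
    """A simple bi-objective problem (placeholder, not from the paper)."""
    f1 = (x ** 2).mean(dim=1)
    f2 = ((x - 1.0) ** 2).mean(dim=1)
    return torch.stack([f1, f2], dim=1)

model = ParetoSetModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    # Sample preference vectors uniformly on the 2-simplex.
    w = torch.rand(128, 1)
    pref = torch.cat([w, 1.0 - w], dim=1)
    f = toy_objectives(model(pref))
    loss = (pref * f).sum(dim=1).mean()  # weighted-sum scalarization
    opt.zero_grad()
    loss.backward()
    opt.step()
```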

Collaborative Pareto Set Learning in Multiple Multi-Objective Optimization Problems

1 code implementation • 1 Apr 2024 • Chikai Shang, Rongguang Ye, Jiaqi Jiang, Fangqing Gu

In this paper, we propose a Collaborative Pareto Set Learning (CoPSL) framework, which simultaneously learns the Pareto sets of multiple MOPs in a collaborative manner.
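One plausible (purely assumed) way to share computation when learning the Pareto sets of several MOPs at once is a shared trunk with one output head per problem; the structural sketch below is an illustration of that idea, not code from the CoPSL paper.

```python
import torch.nn as nn

class SharedParetoSetModel(nn.Module):
    """Shared trunk with one head per multi-objective problem (illustrative only)."""
    def __init__(self, n_obj, n_vars_per_problem, hidden=64):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(n_obj, hidden), nn.ReLU())
        self.heads = nn.ModuleList(
            [nn.Linear(hidden, n_var) for n_var in n_vars_per_problem]
        )

    def forward(self, pref):
        h = self.trunk(pref)
        # One candidate solution per problem from the same preference vector.
        return [head(h) for head in self.heads]
```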

Localization Distillation for Object Detection

1 code implementation • 12 Apr 2022 • Zhaohui Zheng, Rongguang Ye, Qibin Hou, Dongwei Ren, Ping Wang, Wangmeng Zuo, Ming-Ming Cheng

Combining these two new components, we show for the first time that logit mimicking can outperform feature imitation, and that the absence of localization distillation is a key reason why logit mimicking has underperformed for years.

Knowledge Distillation • Object +2

Localization Distillation for Dense Object Detection

2 code implementations • CVPR 2022 • Zhaohui Zheng, Rongguang Ye, Ping Wang, Dongwei Ren, Wangmeng Zuo, Qibin Hou, Ming-Ming Cheng

Previous KD methods for object detection mostly focus on imitating deep features within the imitation regions rather than mimicking classification logits, owing to the latter's inefficiency in distilling localization information and its trivial improvement.

Dense Object Detection • Knowledge Distillation +2
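To make the logit-mimicking side of this contrast concrete, below is a standard temperature-scaled KL distillation loss. Localization distillation applies the same distribution-matching idea to discretized bounding-box representations, but this snippet is a generic KD sketch rather than the paper's implementation.

```python
import torch.nn.functional as F

def logit_mimicking_loss(student_logits, teacher_logits, T=2.0):
    """Temperature-scaled KL divergence between teacher and student logits.

    In localization distillation the same matching is applied to the
    discretized per-edge bounding-box distributions instead of (or in
    addition to) the classification logits.
    """
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)
```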

Enhancing Geometric Factors in Model Learning and Inference for Object Detection and Instance Segmentation

6 code implementations • 7 May 2020 • Zhaohui Zheng, Ping Wang, Dongwei Ren, Wei Liu, Rongguang Ye, Qinghua Hu, Wangmeng Zuo

In this paper, we propose the Complete-IoU (CIoU) loss and Cluster-NMS for enhancing geometric factors in both bounding box regression and Non-Maximum Suppression (NMS), leading to notable gains in average precision (AP) and average recall (AR) without sacrificing inference efficiency.

Clustering • Instance Segmentation +6
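The geometric factors mentioned above (overlap area, center distance, aspect-ratio consistency) map directly onto the published CIoU formulation; the following is an unofficial, simplified re-implementation for axis-aligned boxes in (x1, y1, x2, y2) format, not the authors' released code.

```python
import math
import torch

def ciou_loss(pred, target, eps=1e-7):
    """Complete-IoU loss for boxes of shape (N, 4) in (x1, y1, x2, y2) format.

    L_CIoU = 1 - IoU + rho^2 / c^2 + alpha * v, where rho is the distance
    between box centers, c the diagonal of the smallest enclosing box, and
    v measures aspect-ratio inconsistency.
    """
    # Overlap term: intersection over union.
    x1 = torch.max(pred[:, 0], target[:, 0])
    y1 = torch.max(pred[:, 1], target[:, 1])
    x2 = torch.min(pred[:, 2], target[:, 2])
    y2 = torch.min(pred[:, 3], target[:, 3])
    inter = (x2 - x1).clamp(min=0) * (y2 - y1).clamp(min=0)
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    iou = inter / (area_p + area_t - inter + eps)

    # Normalized center distance (the DIoU term).
    cx_p, cy_p = (pred[:, 0] + pred[:, 2]) / 2, (pred[:, 1] + pred[:, 3]) / 2
    cx_t, cy_t = (target[:, 0] + target[:, 2]) / 2, (target[:, 1] + target[:, 3]) / 2
    rho2 = (cx_p - cx_t) ** 2 + (cy_p - cy_t) ** 2
    cw = torch.max(pred[:, 2], target[:, 2]) - torch.min(pred[:, 0], target[:, 0])
    ch = torch.max(pred[:, 3], target[:, 3]) - torch.min(pred[:, 1], target[:, 1])
    c2 = cw ** 2 + ch ** 2 + eps

    # Aspect-ratio consistency term.
    w_p, h_p = pred[:, 2] - pred[:, 0], pred[:, 3] - pred[:, 1]
    w_t, h_t = target[:, 2] - target[:, 0], target[:, 3] - target[:, 1]
    v = (4 / math.pi ** 2) * (torch.atan(w_t / (h_t + eps)) - torch.atan(w_p / (h_p + eps))) ** 2
    with torch.no_grad():
        alpha = v / (1 - iou + v + eps)  # trade-off weight, treated as a constant

    return (1 - iou + rho2 / c2 + alpha * v).mean()
```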

Distance-IoU Loss: Faster and Better Learning for Bounding Box Regression

20 code implementations • 19 Nov 2019 • Zhaohui Zheng, Ping Wang, Wei Liu, Jinze Li, Rongguang Ye, Dongwei Ren

By incorporating the DIoU and CIoU losses into state-of-the-art object detection algorithms, e.g., YOLOv3, SSD, and Faster R-CNN, we achieve notable performance gains in terms of not only the IoU metric but also the GIoU metric.

object-detection • Object Detection +1
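Since the gains above are reported under both the IoU and GIoU metrics, here is a compact computation of the two standard definitions for axis-aligned boxes; this is a generic re-implementation, not code from the paper.

```python
import torch

def iou_and_giou(pred, target, eps=1e-7):
    """Return (IoU, GIoU) for boxes of shape (N, 4) in (x1, y1, x2, y2) format."""
    # Intersection and union.
    x1 = torch.max(pred[:, 0], target[:, 0])
    y1 = torch.max(pred[:, 1], target[:, 1])
    x2 = torch.min(pred[:, 2], target[:, 2])
    y2 = torch.min(pred[:, 3], target[:, 3])
    inter = (x2 - x1).clamp(min=0) * (y2 - y1).clamp(min=0)
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    union = area_p + area_t - inter
    iou = inter / (union + eps)

    # GIoU subtracts the fraction of the enclosing box not covered by the union.
    cw = torch.max(pred[:, 2], target[:, 2]) - torch.min(pred[:, 0], target[:, 0])
    ch = torch.max(pred[:, 3], target[:, 3]) - torch.min(pred[:, 1], target[:, 1])
    enclose = cw * ch
    giou = iou - (enclose - union) / (enclose + eps)
    return iou, giou
```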
