Search Results for author: Jun Shu

Found 18 papers, 5 papers with code

Variational Label Enhancement

no code implementations · ICML 2020 · Ning Xu, Yun-Peng Liu, Jun Shu, Xin Geng

A label distribution covers a set of labels, with each value representing the degree to which the corresponding label describes the instance.

Multi-Label Learning · Variational Inference

Are Dense Labels Always Necessary for 3D Object Detection from Point Cloud?

no code implementations · 5 Mar 2024 · Chenqiang Gao, Chuandong Liu, Jun Shu, Fangcen Liu, Jiang Liu, Luyu Yang, Xinbo Gao, Deyu Meng

Current state-of-the-art (SOTA) 3D object detection methods often require a large amount of 3D bounding box annotations for training.

3D Object Detection · object-detection +1

DAC-MR: Data Augmentation Consistency Based Meta-Regularization for Meta-Learning

1 code implementation · 13 May 2023 · Jun Shu, Xiang Yuan, Deyu Meng, Zongben Xu

Moreover, a meta-data-driven meta-loss objective combined with DAC-MR achieves better meta-level generalization.

Data Augmentation · Meta-Learning

Improve Noise Tolerance of Robust Loss via Noise-Awareness

no code implementations · 18 Jan 2023 · Kehui Ding, Jun Shu, Deyu Meng, Zongben Xu

To set such instance-dependent hyperparameters for robust losses, we propose a meta-learning method that adaptively learns a hyperparameter prediction function, called the Noise-Aware-Robust-Loss-Adjuster (NARL-Adjuster).

Meta-Learning

A Hyper-weight Network for Hyperspectral Image Denoising

no code implementations · 9 Dec 2022 · Xiangyu Rui, Xiangyong Cao, Jun Shu, Qian Zhao, Deyu Meng

Extensive experiments verify that the proposed HWnet improves the generalization ability of a weighted model to more complex noise, and can also strengthen a weighted model by transferring knowledge from another weighted model.

Hyperspectral Image Denoising · Image Denoising

CMW-Net: Learning a Class-Aware Sample Weighting Mapping for Robust Deep Learning

1 code implementation · 11 Feb 2022 · Jun Shu, Xiang Yuan, Deyu Meng, Zongben Xu

Specifically, by treating each training class as a separate learning task, our method extracts an explicit weighting function that takes the sample loss and task/class features as input and outputs a sample weight, adaptively imposing different weighting schemes on different classes according to their intrinsic bias characteristics.

Image Classification · Partial Label Learning
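The class-aware weighting idea above can be sketched as a small function that maps a sample's loss and a class-level feature to a weight in (0, 1). This is an illustrative stand-in with hand-set parameters, not the trained CMW-Net from the paper; the function name and the choice of log class size as the class feature are assumptions for the sketch.

```python
import math

def class_aware_weight(sample_loss, class_size, w=(-1.0, 0.5), b=2.0):
    """Toy class-aware weighting function: maps (sample loss, a class
    feature such as log class size) to a weight in (0, 1) via a sigmoid.
    The coefficients w and bias b are hand-set for illustration; CMW-Net
    learns an analogous mapping by meta-learning."""
    z = w[0] * sample_loss + w[1] * math.log(class_size) + b
    return 1.0 / (1.0 + math.exp(-z))
```

With a negative loss coefficient, higher-loss (likely noisier) samples in the same class receive smaller weights, while the class-size term lets the scheme differ across head and tail classes.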

Select-ProtoNet: Learning to Select for Few-Shot Disease Subtype Prediction

no code implementations · 2 Sep 2020 · Ziyi Yang, Jun Shu, Yong Liang, Deyu Meng, Zongben Xu

Machine learning has made great progress in computer vision and many other fields thanks to large amounts of high-quality training samples, but it performs less well on genomic data analysis, which is notorious for small sample sizes.

feature selection · Few-Shot Image Classification +1

Meta Feature Modulator for Long-tailed Recognition

no code implementations · 8 Aug 2020 · Renzhen Wang, Kaiqin Hu, Yanwen Zhu, Jun Shu, Qian Zhao, Deyu Meng

We further design a modulator network to guide the generation of the modulation parameters, and such a meta-learner can be readily adapted to train the classification network on other long-tailed datasets.

General Classification · Meta-Learning +1

Learning to Purify Noisy Labels via Meta Soft Label Corrector

1 code implementation · 3 Aug 2020 · Yichen Wu, Jun Shu, Qi Xie, Qian Zhao, Deyu Meng

By viewing label correction as a meta-process and using a meta-learner to correct labels automatically, we can iteratively obtain rectified soft labels adapted to the current training state, without manually preset hyper-parameters.

Meta-Learning
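A common way to form a rectified soft label, which the meta-corrector described above would produce adaptively, is to blend the (possibly noisy) one-hot label with the model's predicted class distribution. The sketch below uses a fixed hand-set blending coefficient; the function name and the fixed `alpha` are assumptions for illustration, whereas the paper's meta-learner sets the correction per sample and per iteration.

```python
def soft_label_correct(noisy_onehot, pred_probs, alpha=0.5):
    """Blend a noisy one-hot label with the model's predicted
    distribution to get a rectified soft label. alpha = 1 trusts the
    given label fully; alpha = 0 trusts the model fully. Here alpha is
    a fixed constant purely for illustration."""
    return [alpha * y + (1.0 - alpha) * p
            for y, p in zip(noisy_onehot, pred_probs)]
```

Since both inputs are probability distributions, the convex combination is itself a valid distribution (non-negative entries summing to one).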

MLR-SNet: Transferable LR Schedules for Heterogeneous Tasks

no code implementations · 29 Jul 2020 · Jun Shu, Yanwen Zhu, Qian Zhao, Zongben Xu, Deyu Meng

Meanwhile, proper LR schedules always need to be searched from scratch for new tasks, and they often differ substantially across task variations such as data modalities, network architectures, and training data sizes.

text-classification · Text Classification

Meta Transition Adaptation for Robust Deep Learning with Noisy Labels

no code implementations · 10 Jun 2020 · Jun Shu, Qian Zhao, Zongben Xu, Deyu Meng

To discover the intrinsic inter-class transition probabilities underlying the data, learning with a noise transition matrix has become an important approach for robust deep learning on corrupted labels.

Learning with noisy labels
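The noise transition idea can be stated concretely: given the clean-class posterior and a row-stochastic transition matrix T, where T[i][j] is the probability that true class i is observed as label j, the noisy-label distribution follows by a matrix-vector product. This is the standard formulation, sketched with hand-set numbers rather than anything estimated by the paper's method.

```python
def noisy_label_dist(clean_posterior, T):
    """p(noisy = j | x) = sum_i p(clean = i | x) * T[i][j].
    T is assumed row-stochastic (each row sums to 1), so the result
    is again a valid probability distribution."""
    num_classes = len(T[0])
    return [sum(clean_posterior[i] * T[i][j] for i in range(len(T)))
            for j in range(num_classes)]
```

Estimating T well is the hard part; once known, it lets a model trained on noisy labels recover the clean-class posterior by correcting the loss or the predictions.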

Learning Adaptive Loss for Robust Learning with Noisy Labels

no code implementations · 16 Feb 2020 · Jun Shu, Qian Zhao, Keyu Chen, Zongben Xu, Deyu Meng

We integrate four kinds of SOTA robust loss functions into our algorithm, and comprehensive experiments substantiate the general applicability and effectiveness of the proposed method in both accuracy and generalization, compared with the conventional hyperparameter tuning strategy, even when its hyperparameters are carefully tuned.

Learning with noisy labels · Meta-Learning

Meta-Weight-Net: Learning an Explicit Mapping For Sample Weighting

3 code implementations · NeurIPS 2019 · Jun Shu, Qi Xie, Lixuan Yi, Qian Zhao, Sanping Zhou, Zongben Xu, Deyu Meng

Current deep neural networks (DNNs) can easily overfit to biased training data with corrupted labels or class imbalance.

Ranked #24 on Image Classification on Clothing1M (using extra training data)

Image Classification · Meta-Learning

Small Sample Learning in Big Data Era

no code implementations · 14 Aug 2018 · Jun Shu, Zongben Xu, Deyu Meng

This category mainly focuses on learning with insufficient samples, and is also called small-data learning in some of the literature.

Small Data Image Classification
