Search Results for author: Lingfeng Yang

Found 7 papers, 6 papers with code

Fine-Grained Visual Prompting

1 code implementation • NeurIPS 2023 • Lingfeng Yang, Yueze Wang, Xiang Li, Xinlong Wang, Jian Yang

Previous works have suggested that incorporating visual prompts, such as colorful boxes or circles, can improve the ability of models to recognize objects of interest.

Visual Prompting
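A minimal sketch of the visual-prompting idea in pure Python, assuming an image represented as a list of rows of RGB tuples; the function name and box convention are illustrative, not the paper's code:

```python
def add_box_prompt(image, box, color=(255, 0, 0)):
    """Return a copy of `image` with a 1-pixel colored rectangle outline
    drawn at `box` = (left, top, right, bottom), marking a region of
    interest so the model attends to it.

    `image` is a list of rows, each row a list of (R, G, B) tuples.
    """
    left, top, right, bottom = box
    out = [row[:] for row in image]  # copy; leave the input untouched
    for x in range(left, right + 1):
        out[top][x] = color          # top edge
        out[bottom][x] = color       # bottom edge
    for y in range(top, bottom + 1):
        out[y][left] = color         # left edge
        out[y][right] = color        # right edge
    return out

black = [[(0, 0, 0)] * 16 for _ in range(16)]
prompted = add_box_prompt(black, (2, 2, 10, 10))
```

The prompted image is then fed to the recognition model unchanged; the paper studies which prompt shapes (boxes, circles, masks) work best at a fine-grained level.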

A Survey of Historical Learning: Learning Models with Learning History

1 code implementation • 23 Mar 2023 • Xiang Li, Ge Wu, Lingfeng Yang, Wenhai Wang, RenJie Song, Jian Yang

The various types of elements deposited in the training history are a rich resource for improving the learning of deep models.

Ensemble Learning

Curriculum Temperature for Knowledge Distillation

1 code implementation • 29 Nov 2022 • Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, RenJie Song, Lei Luo, Jun Li, Jian Yang

In this paper, we propose a simple curriculum-based technique, termed Curriculum Temperature for Knowledge Distillation (CTKD), which controls the task difficulty level during the student's learning career through a dynamic and learnable temperature.

Image Classification Knowledge Distillation
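The core idea can be sketched as follows. Note that CTKD actually learns the temperature adversarially (via a gradient-reversal layer) rather than following a fixed schedule, so the cosine schedule below is only a hand-coded stand-in for the learned curriculum:

```python
import math

def softmax(logits, temperature):
    """Temperature-scaled softmax: a higher temperature yields softer,
    easier-to-match target distributions."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def curriculum_temperature(epoch, total_epochs, t_min=1.0, t_max=8.0):
    """Cosine curriculum: start with a high temperature (soft targets,
    an easy distillation task) and anneal toward a low one (sharper
    targets, a harder task)."""
    progress = epoch / total_epochs
    return t_min + 0.5 * (t_max - t_min) * (1.0 + math.cos(math.pi * progress))

teacher_logits = [2.0, 1.0, 0.1]
easy_targets = softmax(teacher_logits, curriculum_temperature(0, 100))    # high T
hard_targets = softmax(teacher_logits, curriculum_temperature(100, 100))  # low T
```

The student matches `easy_targets` early in training and progressively harder (sharper) targets later, which is the "curriculum" in CTKD.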

Uniform Masking: Enabling MAE Pre-training for Pyramid-based Vision Transformers with Locality

1 code implementation • 20 May 2022 • Xiang Li, Wenhai Wang, Lingfeng Yang, Jian Yang

Masked AutoEncoder (MAE) has recently led the trend in visual self-supervised learning with an elegant asymmetric encoder-decoder design, which significantly improves both pre-training efficiency and fine-tuning accuracy.

Object Detection
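The paper's Uniform Masking replaces MAE's fully random masking so that pyramid-based (local-window) Vision Transformers can be pre-trained. A minimal sketch of its uniform-sampling stage, assuming a patch grid with even dimensions (the secondary-masking stage described in the paper is omitted here):

```python
import random

def uniform_sampling_mask(grid_h, grid_w, seed=None):
    """Keep exactly one patch per non-overlapping 2x2 window (a fixed 25%
    keep ratio), so the visible patches stay evenly spread across the
    grid instead of clustering as with fully random masking.

    Returns a grid of 1 (kept/visible) and 0 (masked).
    Assumes grid_h and grid_w are even.
    """
    rng = random.Random(seed)
    mask = [[0] * grid_w for _ in range(grid_h)]
    for by in range(0, grid_h, 2):
        for bx in range(0, grid_w, 2):
            # pick one of the four positions in this 2x2 window to keep
            dy, dx = rng.choice([(0, 0), (0, 1), (1, 0), (1, 1)])
            mask[by + dy][bx + dx] = 1
    return mask

mask = uniform_sampling_mask(8, 8, seed=0)
```

Because every 2x2 window keeps exactly one patch, the visible tokens form a regular sparse grid that window-based attention can still process efficiently.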

RecursiveMix: Mixed Learning with History

1 code implementation • 14 Mar 2022 • Lingfeng Yang, Xiang Li, Borui Zhao, RenJie Song, Jian Yang

In semantic segmentation, RM also surpasses the baseline and CutMix by 1.9 and 1.1 mIoU points under UperNet on ADE20K, respectively.

Object Detection +1

Dynamic MLP for Fine-Grained Image Classification by Leveraging Geographical and Temporal Information

1 code implementation • CVPR 2022 • Lingfeng Yang, Xiang Li, RenJie Song, Borui Zhao, Juntian Tao, Shihao Zhou, Jiajun Liang, Jian Yang

Therefore, it is helpful to leverage additional information, e.g., the locations and dates at which photos were taken, which is easily accessible but rarely exploited.

Fine-Grained Image Classification
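One common way to expose such geo-temporal metadata to a network is a cyclic sin/cos encoding of location and date. The sketch below is illustrative rather than the paper's exact scheme (the paper goes further, fusing these features with image features through a dynamically generated MLP):

```python
import math

def encode_metadata(lat, lon, day_of_year):
    """Cyclic sin/cos encoding of shooting location and date, so that
    nearby longitudes (e.g., 179.9 vs -179.9 degrees) and nearby dates
    (day 365 vs day 1) map to nearby feature vectors."""
    return [
        math.sin(math.radians(lat)), math.cos(math.radians(lat)),
        math.sin(math.radians(lon)), math.cos(math.radians(lon)),
        math.sin(2 * math.pi * day_of_year / 365.0),
        math.cos(2 * math.pi * day_of_year / 365.0),
    ]

# e.g., a photo taken in New York around the summer solstice
feat = encode_metadata(lat=40.7, lon=-74.0, day_of_year=172)
```

The resulting six-dimensional vector can then condition the image branch, which is useful for fine-grained categories whose appearance depends on season and habitat.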

Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation

no code implementations • 1 Oct 2021 • Zheng Li, Xiang Li, Lingfeng Yang, Jian Yang, Zhigeng Pan

Knowledge distillation usually transfers the knowledge from a pre-trained cumbersome teacher network to a compact student network, which follows the classical teacher-teaching-student paradigm.

Self-Knowledge Distillation
