Search Results for author: Lan-Zhe Guo

Found 9 papers, 2 papers with code

Investigating the Limitation of CLIP Models: The Worst-Performing Categories

no code implementations • 5 Oct 2023 • Jie-Jing Shao, Jiang-Xin Shi, Xiao-Wen Yang, Lan-Zhe Guo, Yu-Feng Li

Contrastive Language-Image Pre-training (CLIP) provides a foundation model by integrating natural language into visual concepts, enabling zero-shot recognition on downstream tasks.

Prompt Engineering · Zero-Shot Learning
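Zero-shot recognition in CLIP-style models works by embedding class names as text and picking the class whose text embedding is most similar to the image embedding. A minimal NumPy sketch of just this matching step, with toy stand-in embeddings (a real CLIP model would produce them from images and prompts):

```python
import numpy as np

def zero_shot_classify(image_emb, text_embs, class_names):
    """Pick the class whose L2-normalized text embedding has the
    highest cosine similarity with the image embedding."""
    img = image_emb / np.linalg.norm(image_emb)
    txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    sims = txt @ img  # cosine similarity per class
    return class_names[int(np.argmax(sims))]

# Toy 4-dim embeddings standing in for real CLIP features.
classes = ["cat", "dog"]
text_embs = np.array([[1.0, 0.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0, 0.0]])
image_emb = np.array([0.9, 0.1, 0.0, 0.0])  # closer to "cat"
print(zero_shot_classify(image_emb, text_embs, classes))  # -> cat
```

The paper's point is that this matching can fail badly on the worst-performing categories even when average accuracy looks strong.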

LAMDA-SSL: Semi-Supervised Learning in Python

1 code implementation • 9 Aug 2022 • Lin-Han Jia, Lan-Zhe Guo, Zhi Zhou, Yu-Feng Li

The second part demonstrates the usage of LAMDA-SSL in detail through abundant examples.
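LAMDA-SSL exposes a scikit-learn-style interface; without reproducing its exact class names, the same semi-supervised workflow can be sketched with scikit-learn's built-in SelfTrainingClassifier, where unlabeled samples are marked with the label -1:

```python
# Semi-supervised learning sketch: most labels are hidden (-1) and a
# self-training wrapper folds the unlabeled points into training.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

X, y = make_classification(n_samples=200, random_state=0)
y_partial = y.copy()
rng = np.random.default_rng(0)
unlabeled = rng.choice(len(y), size=180, replace=False)
y_partial[unlabeled] = -1  # -1 marks unlabeled samples

clf = SelfTrainingClassifier(LogisticRegression(max_iter=1000))
clf.fit(X, y_partial)
print(f"labeled examples used: {(y_partial != -1).sum()}")
print(f"train accuracy: {clf.score(X, y):.2f}")
```

Only 20 of the 200 samples keep their labels here; the wrapper iteratively pseudo-labels the confident remainder, which is the pattern a toolkit like LAMDA-SSL packages up across many SSL algorithms.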

Transfer and Share: Semi-Supervised Learning from Long-Tailed Data

no code implementations • 26 May 2022 • Tong Wei, Qian-Yu Liu, Jiang-Xin Shi, Wei-Wei Tu, Lan-Zhe Guo

TRAS transforms the imbalanced pseudo-label distribution of a traditional SSL model via a delicate function to enhance the supervisory signals for minority classes.

Pseudo Label · Representation Learning
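The snippet only describes TRAS's transformation at a high level. As a purely illustrative stand-in (not the exact TRAS function), one common way to strengthen minority-class supervision is to rescale a pseudo-label distribution by inverse class frequency and renormalize:

```python
import numpy as np

def rebalance_pseudo_labels(probs, class_counts, tau=1.0):
    """Hypothetical rebalancing: upweight minority classes by inverse
    frequency (tempered by tau), then renormalize to a distribution.
    An illustration of the idea, not the TRAS transformation itself."""
    weights = (1.0 / np.asarray(class_counts, dtype=float)) ** tau
    reweighted = np.asarray(probs) * weights
    return reweighted / reweighted.sum(axis=-1, keepdims=True)

# A head-class-dominated pseudo-label over 3 classes (class 0 is the head).
probs = np.array([0.7, 0.2, 0.1])
counts = [900, 90, 10]
print(rebalance_pseudo_labels(probs, counts).round(3))
```

After rebalancing, the rarest class (class 2) dominates the distribution even though the raw model favored the head class, which is the kind of strengthened minority supervision the abstract refers to.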

Robust Deep Semi-Supervised Learning: A Brief Introduction

no code implementations • 12 Feb 2022 • Lan-Zhe Guo, Zhi Zhou, Yu-Feng Li

Semi-supervised learning (SSL) is the branch of machine learning that aims to improve learning performance by leveraging unlabeled data when labels are insufficient.

STEP: Out-of-Distribution Detection in the Presence of Limited In-Distribution Labeled Data

no code implementations • NeurIPS 2021 • Zhi Zhou, Lan-Zhe Guo, Zhanzhan Cheng, Yu-Feng Li, ShiLiang Pu

However, in many real-world applications, it is desirable to have SSL algorithms that not only classify the samples drawn from the same distribution of labeled data but also detect out-of-distribution (OOD) samples drawn from an unknown distribution.

Out-of-Distribution Detection
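To make the OOD-detection goal concrete, here is the standard maximum-softmax-probability baseline (a common reference point in this literature, not the STEP method itself): inputs on which the classifier's softmax is flat are flagged as likely out-of-distribution.

```python
import numpy as np

def msp_ood_score(logits):
    """Maximum softmax probability: a low max-probability suggests an
    out-of-distribution input. A standard baseline, not STEP itself."""
    z = logits - logits.max(axis=-1, keepdims=True)  # numerically stable softmax
    p = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return p.max(axis=-1)

confident = np.array([8.0, 0.5, 0.2])  # peaked logits: in-distribution-like
uncertain = np.array([1.0, 0.9, 1.1])  # flat logits: OOD-like
print(msp_ood_score(confident) > msp_ood_score(uncertain))  # -> True
```

Thresholding such a score gives the detect-or-classify behavior the abstract asks of SSL algorithms; STEP's contribution is doing this well when in-distribution labels are scarce.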

Weakly Supervised Learning Meets Ride-Sharing User Experience Enhancement

no code implementations • 19 Jan 2020 • Lan-Zhe Guo, Feng Kuang, Zhang-Xun Liu, Yu-Feng Li, Nan Ma, Xiao-Hu Qie

For example, in user experience enhancement from Didi, one of the largest online ride-sharing platforms, the ride comment data contains severe label noise (due to the subjective factors of passengers) and severe label distribution bias (due to the sampling bias).

Weakly-supervised Learning

Reliable Weakly Supervised Learning: Maximize Gain and Maintain Safeness

no code implementations • 22 Apr 2019 • Lan-Zhe Guo, Yu-Feng Li, Ming Li, Jin-Feng Yi, Bo-Wen Zhou, Zhi-Hua Zhou

We guide the optimization of label quality with a small amount of validation data, ensuring safe performance while maximizing the performance gain.

Weakly-supervised Learning
