Search Results for author: Yingjun Du

Found 11 papers, 5 papers with code

Training-Free Semantic Segmentation via LLM-Supervision

no code implementations • 31 Mar 2024 • Wenfang Sun, Yingjun Du, Gaowen Liu, Ramana Kompella, Cees G. M. Snoek

Additionally, we propose an assembly that merges the segmentation maps from the various subclass descriptors to ensure a more comprehensive representation of the different aspects in the test images.

Language Modelling • Large Language Model • +4
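The snippet above says the segmentation maps from different subclass descriptors are merged, but not how. A minimal sketch, assuming a pixel-wise maximum over per-subclass score maps (the function and the "dog" subclasses are hypothetical, not from the paper):

```python
import numpy as np

def merge_subclass_maps(subclass_scores):
    """Merge per-subclass segmentation score maps (each H x W) into one
    parent-class map via a pixel-wise maximum: a pixel scores highly for
    the class if it matches any of its subclass descriptors."""
    stacked = np.stack(list(subclass_scores.values()))  # (S, H, W)
    return stacked.max(axis=0)

# Hypothetical per-subclass score maps for the class "dog".
maps = {
    "puppy":  np.random.rand(4, 4),
    "poodle": np.random.rand(4, 4),
    "husky":  np.random.rand(4, 4),
}
merged = merge_subclass_maps(maps)
```

A pixel-wise mean would also be a reasonable merge; the maximum keeps a region that matches only one subclass descriptor strongly represented.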

ProtoDiff: Learning to Learn Prototypical Networks by Task-Guided Diffusion

1 code implementation • NeurIPS 2023 • Yingjun Du, Zehao Xiao, Shengcai Liao, Cees Snoek

Furthermore, we introduce a task-guided diffusion process within the prototype space, enabling the meta-learning of a generative process that transitions from a vanilla prototype to an overfitted prototype.

Few-Shot Learning
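For context on the "vanilla prototype" the abstract starts from: in standard prototypical networks a prototype is the mean support embedding of a class, and queries go to the nearest prototype. A minimal sketch of that baseline (ProtoDiff's task-guided diffusion refinement is not reproduced here):

```python
import numpy as np

def compute_prototypes(embeddings, labels, num_classes):
    """Vanilla prototypes: the mean support embedding per class, as in
    standard prototypical networks. ProtoDiff's diffusion process (not
    shown) would then transition these toward task-specific prototypes."""
    return np.stack([embeddings[labels == c].mean(axis=0)
                     for c in range(num_classes)])

def classify(queries, protos):
    """Nearest-prototype classification under Euclidean distance."""
    dists = np.linalg.norm(queries[:, None, :] - protos[None, :, :], axis=-1)
    return dists.argmin(axis=1)

# Tiny 2-way support set with two examples per class.
support = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
support_labels = np.array([0, 0, 1, 1])
protos = compute_prototypes(support, support_labels, 2)
preds = classify(np.array([[0.0, 0.5], [5.0, 5.5]]), protos)
```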

Multi-Label Meta Weighting for Long-Tailed Dynamic Scene Graph Generation

1 code implementation • 16 Jun 2023 • Shuo Chen, Yingjun Du, Pascal Mettes, Cees G. M. Snoek

This paper investigates the problem of scene graph generation in videos with the aim of capturing semantic relations between subjects and objects in the form of $\langle$subject, predicate, object$\rangle$ triplets.

Graph Generation • Meta-Learning • +1
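The $\langle$subject, predicate, object$\rangle$ triplet format the abstract mentions maps directly onto a small record type. A sketch with hypothetical annotations (not from the paper's dataset):

```python
from typing import NamedTuple

class Triplet(NamedTuple):
    """One scene-graph relation in <subject, predicate, object> form."""
    subject: str
    predicate: str
    object: str

# Hypothetical annotations for a single video frame.
frame = [
    Triplet("person", "holding", "cup"),
    Triplet("cup", "on", "table"),
]
```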

EMO: Episodic Memory Optimization for Few-Shot Meta-Learning

no code implementations • 8 Jun 2023 • Yingjun Du, Jiayi Shen, XianTong Zhen, Cees G. M. Snoek

By learning to retain and recall the learning process of past training tasks, EMO nudges parameter updates in the right direction, even when the gradients provided by a limited number of examples are uninformative.

Few-Shot Learning
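The idea of recalling past learning to steady noisy updates can be sketched with a toy optimizer that buffers gradients from earlier steps and mixes each new gradient with their mean. This is an illustrative sketch, not the paper's exact update rule; the class name and the `mix` parameter are assumptions:

```python
import numpy as np

class EpisodicMemoryOptimizer:
    """Illustrative sketch (not EMO's exact rule): buffer past gradients
    and blend each new, possibly noisy gradient with their mean, nudging
    updates toward directions remembered from earlier tasks."""

    def __init__(self, lr=0.1, capacity=10, mix=0.5):
        self.lr, self.capacity, self.mix = lr, capacity, mix
        self.memory = []

    def step(self, params, grad):
        if self.memory:
            recalled = np.mean(self.memory, axis=0)
            grad = self.mix * grad + (1 - self.mix) * recalled
        self.memory = (self.memory + [grad])[-self.capacity:]
        return params - self.lr * grad

opt = EpisodicMemoryOptimizer()
w = np.zeros(2)
w = opt.step(w, np.array([1.0, 0.0]))   # no memory yet: plain SGD step
w = opt.step(w, np.array([0.0, 1.0]))   # blended with the recalled gradient
```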

MetaModulation: Learning Variational Feature Hierarchies for Few-Shot Learning with Fewer Tasks

1 code implementation • 17 May 2023 • Wenfang Sun, Yingjun Du, XianTong Zhen, Fan Wang, Ling Wang, Cees G. M. Snoek

To account for the uncertainty caused by the limited training tasks, we propose a variational MetaModulation where the modulation parameters are treated as latent variables.

Few-Shot Learning
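Treating modulation parameters as latent variables typically means sampling them with the reparameterization trick, so the uncertainty from having few training tasks is carried into the modulation itself. A minimal sketch of that standard trick (the function name and shapes are assumptions, not the paper's code):

```python
import numpy as np

def sample_modulation(mu, log_var, rng):
    """Reparameterization trick: treat modulation parameters as Gaussian
    latent variables and draw z = mu + sigma * eps with eps ~ N(0, I),
    keeping the sampling differentiable with respect to mu and log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

rng = np.random.default_rng(0)
mu = np.array([0.5, -0.2, 1.0])
z = sample_modulation(mu, np.full(3, -20.0), rng)  # near-zero variance
```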

Hierarchical Variational Memory for Few-shot Learning Across Domains

1 code implementation • ICLR 2022 • Yingjun Du, XianTong Zhen, Ling Shao, Cees G. M. Snoek

To explore and exploit the importance of different semantic levels, we further propose to learn the weights associated with the prototype at each level in a data-driven way, which enables the model to adaptively choose the most generalizable features.

Few-Shot Learning • Variational Inference
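Learning data-driven weights over per-level prototypes amounts to a softmax-weighted combination. A minimal sketch, where the logits stand in for the scores the model would actually learn:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def weighted_prototype(level_protos, level_logits):
    """Combine per-level prototypes (L x D) with weights that sum to one;
    `level_logits` is a stand-in for the data-driven scores the model
    would learn for each semantic level."""
    w = softmax(np.asarray(level_logits, dtype=float))
    return (w[:, None] * level_protos).sum(axis=0)

# Two semantic levels; equal logits reduce to the plain average.
protos = np.array([[0.0, 0.0], [2.0, 2.0]])
combined = weighted_prototype(protos, [0.0, 0.0])
```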

Meta-Learning with Variational Semantic Memory for Word Sense Disambiguation

no code implementations • ACL 2021 • Yingjun Du, Nithin Holla, XianTong Zhen, Cees G. M. Snoek, Ekaterina Shutova

A critical challenge faced by supervised word sense disambiguation (WSD) is the lack of large annotated datasets with sufficient coverage of words in their diversity of senses.

Meta-Learning • Variational Inference • +1

MetaKernel: Learning Variational Random Features with Limited Labels

no code implementations • 8 May 2021 • Yingjun Du, Haoliang Sun, XianTong Zhen, Jun Xu, Yilong Yin, Ling Shao, Cees G. M. Snoek

Specifically, we propose learning variational random features in a data-driven manner to obtain task-specific kernels by leveraging the shared knowledge provided by related tasks in a meta-learning setting.

Few-Shot Image Classification • Few-Shot Learning • +1
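The random features the abstract refers to are in the spirit of random Fourier features (Rahimi & Recht), whose inner products approximate an RBF kernel. A sketch of that classical construction only; MetaKernel *learns* the feature distribution per task rather than sampling it once as done here:

```python
import numpy as np

def random_fourier_features(X, W, b):
    """phi(x) = sqrt(2/D) * cos(W x + b); inner products of these
    features approximate the RBF kernel k(x, y) = exp(-||x - y||^2 / 2)
    when the rows of W are drawn from N(0, I)."""
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

rng = np.random.default_rng(0)
D, d = 2000, 3                        # number of features, input dim
W = rng.standard_normal((D, d))       # frequencies ~ N(0, I)
b = rng.uniform(0.0, 2.0 * np.pi, D)  # phase offsets
x, y = rng.standard_normal(d), rng.standard_normal(d)
phi_x = random_fourier_features(x[None], W, b)
phi_y = random_fourier_features(y[None], W, b)
approx = (phi_x @ phi_y.T).item()
exact = np.exp(-0.5 * np.linalg.norm(x - y) ** 2)
```

The approximation error shrinks as O(1/sqrt(D)), which is why D is taken fairly large here.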

Learning to Learn Variational Semantic Memory

1 code implementation • NeurIPS 2020 • XianTong Zhen, Yingjun Du, Huan Xiong, Qiang Qiu, Cees G. M. Snoek, Ling Shao

The variational semantic memory accrues and stores semantic information for the probabilistic inference of class prototypes in a hierarchical Bayesian framework.

Few-Shot Learning • General Knowledge • +1
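Probabilistic inference of a class prototype from a memory prior plus support evidence can be illustrated with the standard conjugate Gaussian update: precisions add, and the posterior mean is the precision-weighted combination. A toy one-dimensional sketch of that textbook update, not the paper's hierarchical model:

```python
import numpy as np

def posterior_prototype(mem_mean, mem_var, sup_mean, sup_var):
    """Conjugate Gaussian update: fuse a prototype prior recalled from
    memory with the support-set estimate. Precisions (1/var) add, and
    the posterior mean is the precision-weighted mean."""
    prec = 1.0 / mem_var + 1.0 / sup_var
    mean = (mem_mean / mem_var + sup_mean / sup_var) / prec
    return mean, 1.0 / prec

# Equal confidence in memory and support: posterior mean is the average,
# and the posterior variance is halved.
mean, var = posterior_prototype(np.array([0.0]), 1.0, np.array([2.0]), 1.0)
```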
