no code implementations • 31 Mar 2024 • Wenfang Sun, Yingjun Du, Gaowen Liu, Ramana Kompella, Cees G. M. Snoek
Additionally, we propose an assembly that merges the segmentation maps from the various subclass descriptors to ensure a more comprehensive representation of the different aspects of the test images.
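One simple way to merge per-descriptor segmentation maps is to average their class scores and take a per-pixel argmax. This is only an illustrative sketch (the function name, array shapes, and averaging rule are assumptions, not the paper's actual assembly mechanism):

```python
import numpy as np

def merge_subclass_maps(score_maps):
    """Merge per-subclass-descriptor score maps (assumed shape: a list
    of [H, W, C] arrays of class scores) into one segmentation by
    averaging the scores and taking the argmax per pixel."""
    stacked = np.stack(score_maps, axis=0)   # [N, H, W, C]
    mean_scores = stacked.mean(axis=0)       # [H, W, C]
    return mean_scores.argmax(axis=-1)       # [H, W] class indices

# Toy usage: two descriptors, a 2x2 image, 3 classes.
maps = [np.random.rand(2, 2, 3) for _ in range(2)]
seg = merge_subclass_maps(maps)
```

Averaging scores before the argmax lets descriptors that are confident about different image regions each contribute to the final map.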
1 code implementation • NeurIPS 2023 • Yingjun Du, Zehao Xiao, Shengcai Liao, Cees G. M. Snoek
Furthermore, we introduce a task-guided diffusion process within the prototype space, enabling the meta-learning of a generative process that transitions from a vanilla prototype to an overfitted prototype.
1 code implementation • 16 Jun 2023 • Shuo Chen, Yingjun Du, Pascal Mettes, Cees G. M. Snoek
This paper investigates the problem of scene graph generation in videos with the aim of capturing semantic relations between subjects and objects in the form of $\langle$subject, predicate, object$\rangle$ triplets.
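The $\langle$subject, predicate, object$\rangle$ triplet is a small, concrete structure. A minimal sketch of how a scene graph can be held as a set of such triplets (field and variable names here are illustrative, not the paper's API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Triplet:
    """One <subject, predicate, object> relation in a scene graph."""
    subject: str
    predicate: str
    object: str

# A toy scene graph for one video frame.
graph = [
    Triplet("person", "rides", "bicycle"),
    Triplet("person", "holds", "phone"),
]
subjects = {t.subject for t in graph}
```

In the video setting, the task is to predict such triplets per frame (or per clip) and keep them temporally consistent as subjects and objects persist across frames.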
no code implementations • 8 Jun 2023 • Yingjun Du, Jiayi Shen, XianTong Zhen, Cees G. M. Snoek
By learning to retain and recall the learning process of past training tasks, EMO nudges parameter updates in the right direction, even when the gradients provided by a limited number of examples are uninformative.
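The idea of retaining past learning signals to stabilize noisy few-shot gradients can be illustrated with a momentum-style stand-in: blend the current gradient with an exponential memory of previous ones. This is a hedged sketch, not EMO's actual learned external-memory mechanism:

```python
def emo_style_step(theta, grad, memory, lr=0.1, beta=0.9):
    """Momentum-style stand-in for memory-guided updates: mix the
    current (possibly uninformative) gradient with an exponential
    memory of past gradients, so the update is nudged toward the
    remembered direction."""
    memory = beta * memory + (1 - beta) * grad
    theta = theta - lr * memory
    return theta, memory

# Toy usage: one scalar parameter, one noisy gradient.
theta, memory = 1.0, 0.0
theta, memory = emo_style_step(theta, grad=2.0, memory=memory)
```

The difference in spirit is that EMO learns *what* to retain and recall across training tasks, whereas plain momentum uses a fixed decay rule.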
1 code implementation • 17 May 2023 • Wenfang Sun, Yingjun Du, XianTong Zhen, Fan Wang, Ling Wang, Cees G. M. Snoek
To account for the uncertainty caused by the limited training tasks, we propose a variational MetaModulation where the modulation parameters are treated as latent variables.
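Treating modulation parameters as latent variables typically means sampling them from an inferred distribution rather than using a point estimate. A minimal reparameterization sketch, assuming a Gaussian posterior and FiLM-style scale-and-shift modulation (both are illustrative choices, not the paper's exact formulation):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_modulation(mu, log_var):
    """Reparameterized sample of latent modulation parameters:
    z = mu + sigma * eps, with eps ~ N(0, I)."""
    eps = rng.normal(size=mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def modulate(features, scale, shift):
    """FiLM-style feature modulation (an illustrative choice)."""
    return scale * features + shift

# Toy usage: sample scale/shift for a 4-dim feature vector.
scale = sample_modulation(np.ones(4), np.zeros(4))
shift = sample_modulation(np.zeros(4), np.zeros(4))
out = modulate(np.ones(4), scale, shift)
```

Sampling instead of fixing the modulation lets the model express uncertainty that comes from seeing only a limited number of training tasks.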
no code implementations • CVPR 2023 • Yingjun Du, Jiayi Shen, XianTong Zhen, Cees G. M. Snoek
Modern image classifiers perform well on populated classes, while degrading considerably on tail classes with only a few instances.
1 code implementation • ICLR 2022 • Yingjun Du, XianTong Zhen, Ling Shao, Cees G. M. Snoek
To explore and exploit the importance of different semantic levels, we further propose to learn the weights associated with the prototype at each level in a data-driven way, which enables the model to adaptively choose the most generalizable features.
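Learning data-driven weights over prototypes at different semantic levels can be sketched as a softmax-weighted combination, where the logits would be learned end to end. The shapes and the softmax parameterization here are assumptions for illustration:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def combine_prototypes(level_protos, logits):
    """Combine prototypes from several semantic levels with
    data-driven weights (a softmax over learnable logits).
    `level_protos`: [L, D] array, one prototype per level."""
    w = softmax(logits)                             # [L], sums to 1
    return (w[:, None] * level_protos).sum(axis=0)  # [D]

# Toy usage: three levels, 2-dim prototypes; level 0 dominates.
protos = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
combined = combine_prototypes(protos, np.array([2.0, 0.0, 0.0]))
```

Because the weights are produced from data rather than fixed, the model can lean on whichever semantic level generalizes best for a given task.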
no code implementations • ACL 2021 • Yingjun Du, Nithin Holla, XianTong Zhen, Cees G. M. Snoek, Ekaterina Shutova
A critical challenge faced by supervised word sense disambiguation (WSD) is the lack of large annotated datasets with sufficient coverage of words in their diversity of senses.
no code implementations • 8 May 2021 • Yingjun Du, Haoliang Sun, XianTong Zhen, Jun Xu, Yilong Yin, Ling Shao, Cees G. M. Snoek
Specifically, we propose learning variational random features in a data-driven manner to obtain task-specific kernels by leveraging the shared knowledge provided by related tasks in a meta-learning setting.
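The random-features backbone of this idea is the classic random Fourier features construction, where an inner product of feature maps approximates a shift-invariant kernel. In the paper the spectral distribution is inferred per task; in this sketch the frequencies are simply drawn from a Gaussian, which recovers an RBF kernel:

```python
import numpy as np

def random_fourier_features(x, omega, b):
    """Map inputs so that z(x) . z(y) approximates a shift-invariant
    kernel k(x, y), following Rahimi & Recht's construction."""
    proj = x @ omega.T + b                        # [N, D_feat]
    return np.sqrt(2.0 / omega.shape[0]) * np.cos(proj)

rng = np.random.default_rng(0)
d, n_feat = 3, 2000
omega = rng.normal(size=(n_feat, d))              # spectral samples
b = rng.uniform(0, 2 * np.pi, size=n_feat)

x = rng.normal(size=(5, d))
z = random_fourier_features(x, omega, b)
approx_k = z @ z.T                                # ~ RBF kernel matrix
```

Making `omega` task-specific (inferred variationally from related tasks) is what turns this generic approximation into the adaptive, task-specific kernels described above.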
no code implementations • ICLR 2021 • Yingjun Du, XianTong Zhen, Ling Shao, Cees G. M. Snoek
Batch normalization plays a crucial role when training deep neural networks.
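For reference, standard batch normalization normalizes each feature over the batch axis and then applies a learnable scale and shift:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Standard batch normalization: per-feature zero mean and unit
    variance over the batch axis, then learnable scale (gamma) and
    shift (beta)."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Toy usage: batch of 3 samples, 2 features.
x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
```

The reliance on batch statistics is exactly what becomes problematic in few-shot and meta-learning settings, where each task offers only a handful of examples to estimate `mu` and `var` from.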
1 code implementation • NeurIPS 2020 • XianTong Zhen, Yingjun Du, Huan Xiong, Qiang Qiu, Cees G. M. Snoek, Ling Shao
The variational semantic memory accrues and stores semantic information for the probabilistic inference of class prototypes in a hierarchical Bayesian framework.
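The simplest special case of prototype inference is the deterministic one: a class prototype as the mean embedding of its support examples. The paper replaces this point estimate with probabilistic inference backed by the semantic memory, but the mean is a useful baseline to sketch:

```python
import numpy as np

def class_prototypes(embeddings, labels, n_classes):
    """Point-estimate prototypes: the mean embedding per class
    (ProtoNet-style). The variational approach instead infers a
    distribution over each prototype."""
    return np.stack([embeddings[labels == c].mean(axis=0)
                     for c in range(n_classes)])

# Toy usage: four 2-dim embeddings, two classes.
emb = np.array([[0.0, 0.0], [2.0, 2.0], [4.0, 4.0], [6.0, 6.0]])
lab = np.array([0, 0, 1, 1])
protos = class_prototypes(emb, lab, 2)
```

Treating the prototype as a latent variable with a prior informed by accrued semantic memory lets related classes share statistical strength, which a per-task mean cannot do.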