no code implementations • 18 Mar 2024 • Yue Fan, Xiaojian Ma, Rujie Wu, Yuntao Du, Jiaqi Li, Zhi Gao, Qing Li
We explore how reconciling several foundation models (large language models and vision-language models) with a novel unified memory mechanism could tackle the challenging video understanding problem, especially capturing the long-term temporal relations in lengthy videos.
1 code implementation • 9 Feb 2024 • Yuntao Du, Ninghui Li
Data synthesis has been advocated as an important approach for utilizing data while protecting data privacy.
1 code implementation • 3 Feb 2024 • Yi Xin, Siqi Luo, Haodi Zhou, Junlong Du, Xiaohong Liu, Yue Fan, Qing Li, Yuntao Du
Large-scale pre-trained vision models (PVMs) have shown great potential for adaptability across various downstream vision tasks.
no code implementations • 18 Dec 2023 • Zhi Gao, Yuntao Du, Xintong Zhang, Xiaojian Ma, Wenjuan Han, Song-Chun Zhu, Qing Li
However, these methods often overlook the potential for continual learning, typically by freezing the utilized tools, thus limiting their adaptation to environments requiring new knowledge.
no code implementations • 31 Jul 2023 • Mingcai Chen, Yuntao Du, Wei Tang, Baoming Zhang, Hao Cheng, Shuwei Qian, Chongjun Wang
We introduce LaplaceConfidence, a method that obtains label confidence (i.e., clean probabilities) by utilizing Laplacian energy.
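The excerpt does not spell out the computation, but one common way to turn Laplacian smoothness into per-sample label confidence is to propagate the noisy labels over a feature-similarity graph. The sketch below is only an illustrative reading of that idea, not the paper's exact formulation; the RBF kernel, the closed-form propagation, and all names are assumptions.

```python
import numpy as np

def laplacian_label_confidence(features, noisy_labels, n_classes, sigma=1.0, alpha=10.0):
    """Toy estimate of per-sample label confidence from graph smoothness.

    features:     (n, d) array of sample embeddings
    noisy_labels: (n,) integer labels, possibly corrupted
    Returns an (n,) array in [0, 1]; higher = more likely clean.
    """
    n = features.shape[0]
    # RBF affinity matrix and symmetric normalized Laplacian L = I - D^{-1/2} W D^{-1/2}.
    sq_dists = np.square(features[:, None, :] - features[None, :, :]).sum(-1)
    W = np.exp(-sq_dists / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(1) + 1e-12)
    L = np.eye(n) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]

    # Smooth the noisy one-hot labels by minimizing ||F - Y||^2 + alpha * tr(F^T L F);
    # the closed form is F = (I + alpha L)^{-1} Y (graph-regularized label propagation).
    Y = np.eye(n_classes)[noisy_labels]
    F = np.linalg.solve(np.eye(n) + alpha * L, Y)
    F = F / (F.sum(1, keepdims=True) + 1e-12)

    # Confidence = smoothed probability of the observed (possibly noisy) label.
    return F[np.arange(n), noisy_labels]
```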
1 code implementation • 28 Apr 2023 • Xinjun Zhu, Yuntao Du, Yuren Mao, Lu Chen, Yujia Hu, Yunjun Gao
Knowledge graphs (KGs), which contain rich side information, have become essential for boosting recommendation performance and improving explainability.
1 code implementation • 27 Apr 2023 • Yuntao Du, Jianxun Lian, Jing Yao, Xiting Wang, Mingqi Wu, Lu Chen, Yunjun Gao, Xing Xie
In recent decades, there have been significant advancements in latent embedding-based CF methods for improved accuracy, such as matrix factorization, neural collaborative filtering, and LightGCN.
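For readers unfamiliar with latent embedding-based CF, a bare-bones matrix factorization trained with SGD looks roughly like the sketch below; the hyperparameters and tiny rating list are arbitrary illustrations, not values from the paper.

```python
import numpy as np

def factorize(ratings, n_factors=16, lr=0.01, reg=0.05, epochs=30, seed=0):
    """ratings: list of (user_id, item_id, rating) triples with contiguous ids."""
    rng = np.random.default_rng(seed)
    n_users = max(u for u, _, _ in ratings) + 1
    n_items = max(i for _, i, _ in ratings) + 1
    P = rng.normal(scale=0.1, size=(n_users, n_factors))  # user embeddings
    Q = rng.normal(scale=0.1, size=(n_items, n_factors))  # item embeddings
    for _ in range(epochs):
        rng.shuffle(ratings)
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]
            # Gradient step on the regularized squared error for this observation.
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * P[u] - reg * Q[i])
    return P, Q

P, Q = factorize([(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0)])
print(P[0] @ Q[2])  # predicted score of user 0 for item 2
```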
1 code implementation • 6 Oct 2022 • Le Zhao, Mingcai Chen, Yuntao Du, Haiyang Yang, Chongjun Wang
We design an attention module to capture long-term dependency by mining periodic information in traffic data.
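The paper's exact module is not reproduced in the excerpt; one plausible reading is scaled dot-product attention where queries come from the current traffic window and keys/values come from the same time-of-day slots on previous days. The periodic lag (288 five-minute slots per day), dimensions, and names below are assumptions.

```python
import torch
import torch.nn as nn

class PeriodicAttention(nn.Module):
    """Attend from the current traffic window to the same slots on previous days."""

    def __init__(self, d_model=64, period=288):  # 288 = 5-minute slots per day (assumption)
        super().__init__()
        self.period = period
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)

    def forward(self, history, current):
        # history: (batch, n_days * period, d_model) past readings
        # current: (batch, window, d_model) most recent window
        b, t, d = history.shape
        # Keep only the slots that sit a whole number of periods behind "now",
        # i.e. the same time of day on earlier days.
        periodic = history[:, t % self.period :: self.period, :]   # (batch, n_days, d_model)
        q = self.q_proj(current)                                    # (batch, window, d)
        k = self.k_proj(periodic)                                   # (batch, n_days, d)
        v = self.v_proj(periodic)
        attn = torch.softmax(q @ k.transpose(1, 2) / d ** 0.5, dim=-1)
        return attn @ v                                             # (batch, window, d)

layer = PeriodicAttention()
out = layer(torch.randn(2, 3 * 288, 64), torch.randn(2, 12, 64))
print(out.shape)  # torch.Size([2, 12, 64])
```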
2 code implementations • 14 Apr 2022 • Yunjun Gao, Yuntao Du, Yujia Hu, Lu Chen, Xinjun Zhu, Ziquan Fang, Baihua Zheng
In addition, our method can automatically switch its learning phase from memorization to self-guided learning at the memorization point, and select clean and informative memorized data via a novel adaptive denoising scheduler to improve robustness.
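The excerpt does not detail the scheduler, so the sketch below is a generic stand-in: small-loss selection with a keep rate that ramps up after a warm-up phase, which plays the role of the memorization point in this reading. The linear schedule and fixed noise rate are assumptions, not the paper's mechanism.

```python
import numpy as np

def select_clean(per_sample_loss, epoch, warmup_epochs=10, noise_rate=0.2):
    """Pick likely-clean samples by the small-loss criterion with a ramping keep rate.

    per_sample_loss: (n,) training losses at the current epoch.
    Returns a boolean mask of samples to keep.
    """
    ramp = min(1.0, max(0.0, (epoch - warmup_epochs) / warmup_epochs))
    keep_ratio = 1.0 - noise_rate * ramp           # keep everything before the switch
    k = int(np.ceil(keep_ratio * len(per_sample_loss)))
    keep_idx = np.argsort(per_sample_loss)[:k]     # smallest-loss samples
    mask = np.zeros(len(per_sample_loss), dtype=bool)
    mask[keep_idx] = True
    return mask
```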
1 code implementation • 11 Apr 2022 • Yuntao Du, Xinjun Zhu, Lu Chen, Baihua Zheng, Yunjun Gao
Furthermore, we propose a dual item embedding design to represent and propagate collaborative signals and knowledge associations separately, and leverage gated aggregation to distill discriminative information that better captures user behavior patterns (a minimal gating sketch follows this entry).
Ranked #1 on Recommendation Systems on Alibaba-iFashion
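The exact fusion layer is not given in the excerpt; a generic gated aggregation of a collaborative-signal embedding and a knowledge-association embedding could look like the following. Dimensions and module names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class GatedAggregation(nn.Module):
    """Fuse a collaborative-signal embedding and a knowledge-association embedding
    with a learned per-dimension gate (illustrative; not the paper's exact layer)."""

    def __init__(self, dim=64):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, e_cf, e_kg):
        # g in (0, 1) decides, per dimension, how much to trust each signal.
        g = torch.sigmoid(self.gate(torch.cat([e_cf, e_kg], dim=-1)))
        return g * e_cf + (1.0 - g) * e_kg

agg = GatedAggregation(dim=64)
fused = agg(torch.randn(32, 64), torch.randn(32, 64))
print(fused.shape)  # torch.Size([32, 64])
```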
1 code implementation • 8 Feb 2022 • Yuntao Du, Xinjun Zhu, Lu Chen, Ziquan Fang, Yunjun Gao
Inspired by the success of meta-learning on scarce training samples, we propose a novel meta-learning based framework called MetaKG, which encompasses a collaborative-aware meta learner and a knowledge-aware meta learner, to capture meta users' preference and entities' knowledge for cold-start recommendations.
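MetaKG's full architecture is not shown in the excerpt; the following first-order MAML-style loop only illustrates the cold-start recipe it builds on: adapt a preference model on a user's few support interactions, then update the shared initialization from the query loss. All module and variable names are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from copy import deepcopy

def meta_train_step(model, meta_opt, tasks, inner_lr=0.01):
    """One first-order MAML-style step over a batch of cold-start 'tasks' (users).

    tasks: list of (support_x, support_y, query_x, query_y) tensors per user.
    """
    meta_opt.zero_grad()
    for sx, sy, qx, qy in tasks:
        learner = deepcopy(model)          # task-specific copy of the shared initialization
        # Inner loop: adapt to the user's few support interactions.
        F.binary_cross_entropy_with_logits(learner(sx), sy).backward()
        with torch.no_grad():
            for p in learner.parameters():
                p -= inner_lr * p.grad
        learner.zero_grad()
        # Outer loss on the query set; first-order approximation: copy the adapted
        # model's gradients back onto the shared initialization.
        F.binary_cross_entropy_with_logits(learner(qx), qy).backward()
        for p, lp in zip(model.parameters(), learner.parameters()):
            p.grad = lp.grad.clone() if p.grad is None else p.grad + lp.grad
    meta_opt.step()

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
task = (torch.randn(5, 16), torch.rand(5, 1), torch.randn(20, 16), torch.rand(20, 1))
meta_train_step(model, meta_opt, [task, task])
```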
1 code implementation • 17 Dec 2021 • Ziquan Fang, Yuntao Du, Xinjun Zhu, Lu Chen, Yunjun Gao, Christian S. Jensen
Trajectory similarity computation has drawn massive attention, as it is core functionality in a wide range of applications such as ride-sharing, traffic analysis, and social recommendation.
no code implementations • 6 Dec 2021 • Mingcai Chen, Hao Cheng, Yuntao Du, Ming Xu, Wenyu Jiang, Chongjun Wang
We show that our method successfully alleviates the damage of both label noise and confirmation bias.
Ranked #2 on Image Classification on mini WebVision 1.0
1 code implementation • 13 Oct 2021 • Minjun Zhao, Lu Chen, Keyu Yang, Yuntao Du, Yunjun Gao
It uses a Gaussian mixture-based metric called separation degree to rank materialized models.
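The precise definition of "separation degree" is not included in the excerpt; a plausible stand-in is to fit a two-component Gaussian mixture to a model's per-sample scores and measure how far apart the components are relative to their spread. The formula below is an assumption, with sklearn used for brevity.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def separation_degree(scores, seed=0):
    """Fit a 2-component GMM to per-sample scores and return a separation score:
    the gap between component means normalized by their combined standard deviation."""
    gm = GaussianMixture(n_components=2, random_state=seed).fit(scores.reshape(-1, 1))
    mu = gm.means_.ravel()
    sd = np.sqrt(gm.covariances_).ravel()
    return abs(mu[0] - mu[1]) / (sd[0] + sd[1] + 1e-12)

# Rank two hypothetical materialized models by how well their scores separate.
well_separated = np.concatenate([np.random.normal(0, 1, 500), np.random.normal(6, 1, 500)])
overlapping    = np.concatenate([np.random.normal(0, 1, 500), np.random.normal(1, 1, 500)])
print(separation_degree(well_separated) > separation_degree(overlapping))  # True
```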
no code implementations • 9 Sep 2021 • Yuntao Du, Haiyang Yang, Mingcai Chen, Juan Jiang, Hongtao Luo, Chongjun Wang
The proposed method first generates and augments a pseudo-source domain, and then performs distribution alignment with four novel losses built on a pseudo-labeling strategy.
2 code implementations • 10 Aug 2021 • Yuntao Du, Jindong Wang, Wenjie Feng, Sinno Pan, Tao Qin, Renjun Xu, Chongjun Wang
This paper proposes Adaptive RNNs (AdaRNN) to tackle the TCS problem by building an adaptive model that generalizes well on the unseen test data.
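AdaRNN's full pipeline (temporal distribution characterization plus matching) is not reproduced here; the sketch only shows the matching half of the idea as a simple RBF-kernel MMD penalty between the hidden states of two time segments. The segmentation strategy, kernel width, and weighting are assumptions.

```python
import torch

def rbf_mmd(x, y, sigma=1.0):
    """Squared MMD between two sets of hidden states using an RBF kernel."""
    def kernel(a, b):
        d = torch.cdist(a, b) ** 2
        return torch.exp(-d / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()

# Hidden states of an RNN on two temporal segments of the training series.
h_segment_a = torch.randn(128, 64)
h_segment_b = torch.randn(128, 64) + 0.5
penalty = rbf_mmd(h_segment_a, h_segment_b)
# The total objective would be: prediction_loss + lambda * penalty (lambda is a tuning knob).
print(penalty.item())
```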
1 code implementation • 10 Jul 2021 • Mingcai Chen, Yuntao Du, Yi Zhang, Shuwei Qian, Chongjun Wang
Co-training, an extension of self-training, is one of the standard frameworks for semi-supervised learning.
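Independent of this paper's specific variant, a minimal co-training loop trains two classifiers on different feature views and moves confidently pseudo-labeled unlabeled examples into the shared labeled pool. The thresholds and the sklearn models below are placeholders, not the paper's choices.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def co_train(Xl_a, Xl_b, y, Xu_a, Xu_b, rounds=5, threshold=0.9):
    """Xl_*: labeled data under views A/B; Xu_*: unlabeled data under both views."""
    for _ in range(rounds):
        clf_a = LogisticRegression(max_iter=1000).fit(Xl_a, y)
        clf_b = LogisticRegression(max_iter=1000).fit(Xl_b, y)
        if len(Xu_a) == 0:
            break
        # Each view scores the unlabeled pool; keep examples either view is confident about.
        prob_a, prob_b = clf_a.predict_proba(Xu_a), clf_b.predict_proba(Xu_b)
        confident = (prob_a.max(1) > threshold) | (prob_b.max(1) > threshold)
        if not confident.any():
            break
        # The more confident view supplies the pseudo-label for each selected example.
        pick = np.where(prob_a.max(1) >= prob_b.max(1), prob_a.argmax(1), prob_b.argmax(1))
        Xl_a = np.vstack([Xl_a, Xu_a[confident]])
        Xl_b = np.vstack([Xl_b, Xu_b[confident]])
        y = np.concatenate([y, pick[confident]])
        Xu_a, Xu_b = Xu_a[~confident], Xu_b[~confident]
    return clf_a, clf_b
```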
1 code implementation • 29 Jun 2021 • Yuntao Du, Yinghao Chen, Fengli Cui, Xiaowen Zhang, Chongjun Wang
Unsupervised domain adaptation aims to transfer knowledge from a labeled source domain to an unlabeled target domain.
no code implementations • 26 Mar 2020 • Yuntao Du, Ruiting Zhang, Xiaowen Zhang, Yirong Yao, Hengyang Lu, Chongjun Wang
In this paper, a novel method called learning TransFerable and Discriminative Features for unsupervised domain adaptation (TFDF) is proposed to optimize these two objectives simultaneously.
1 code implementation • 1 Jan 2020 • Yuntao Du, Zhiwen Tan, Qian Chen, Xiaowen Zhang, Yirong Yao, Chongjun Wang
Recent experiments have shown that when the discriminator is provided with domain information from both domains and label information from the source domain, it is able to preserve the complex multimodal structure and high-level semantic information of both domains (a minimal conditional-discriminator sketch follows this entry).
Ranked #5 on Domain Adaptation on ImageCLEF-DA
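A discriminator conditioned on both domain features and (source) label information, in the spirit the excerpt describes, can be sketched as follows. The concrete conditioning used here, an outer product of features and class predictions as in CDAN-style methods, is an assumption rather than this paper's exact design.

```python
import torch
import torch.nn as nn

class ConditionalDomainDiscriminator(nn.Module):
    """Predicts the domain (source vs. target) from features conditioned on class predictions."""

    def __init__(self, feat_dim=256, n_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim * n_classes, 256), nn.ReLU(), nn.Linear(256, 1)
        )

    def forward(self, features, class_probs):
        # Outer product fuses feature and label information into one multimodal input.
        joint = torch.bmm(class_probs.unsqueeze(2), features.unsqueeze(1))  # (B, C, F)
        return self.net(joint.flatten(1))                                   # domain logit

disc = ConditionalDomainDiscriminator()
logit = disc(torch.randn(8, 256), torch.softmax(torch.randn(8, 10), dim=1))
print(logit.shape)  # torch.Size([8, 1])
```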
1 code implementation • 31 Dec 2019 • Yuntao Du, Zhiwen Tan, Qian Chen, Yi Zhang, Chongjun Wang
In this paper, we propose a novel online transfer learning method that seeks a new feature representation in which the marginal and conditional distribution discrepancies can be reduced simultaneously in an online fashion.
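The paper's exact online updates are not shown in the excerpt; the toy function below only illustrates what a combined marginal-plus-conditional discrepancy could look like, using feature means per class (with pseudo-labels on the target side). It is a simple mean-embedding surrogate, not the paper's estimator.

```python
import numpy as np

def distribution_discrepancy(Xs, ys, Xt, yt_pseudo, n_classes):
    """Marginal + conditional mean discrepancy between source and target features.

    Marginal term: distance between overall feature means.
    Conditional term: average distance between per-class means (target uses pseudo-labels).
    """
    marginal = np.linalg.norm(Xs.mean(0) - Xt.mean(0))
    cond_terms = []
    for c in range(n_classes):
        s_c, t_c = Xs[ys == c], Xt[yt_pseudo == c]
        if len(s_c) and len(t_c):
            cond_terms.append(np.linalg.norm(s_c.mean(0) - t_c.mean(0)))
    conditional = float(np.mean(cond_terms)) if cond_terms else 0.0
    return marginal + conditional
```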