1 code implementation • 27 Aug 2024 • Baijiong Lin, Weisen Jiang, Pengguang Chen, Shu Liu, Ying-Cong Chen
Multi-task dense scene understanding, which trains a model for multiple dense prediction tasks, has a wide range of application scenarios.
1 code implementation • 2 Jul 2024 • Baijiong Lin, Weisen Jiang, Pengguang Chen, Yu Zhang, Shu Liu, Ying-Cong Chen
Multi-task dense scene understanding, which learns a model for multiple dense prediction tasks, has a wide range of application scenarios.
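Both entries above describe the same setting: one network serving several per-pixel prediction tasks. Below is a minimal sketch of that setup, assuming a shared encoder with one lightweight head per task; module names and sizes are illustrative, not the architecture these papers propose.

```python
import torch
import torch.nn as nn

class MultiTaskDenseNet(nn.Module):
    def __init__(self, tasks=("segmentation", "depth", "normals"), width=64):
        super().__init__()
        # Shared backbone: all tasks reuse these features.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
        )
        # One dense prediction head per task (1 output channel each here).
        self.heads = nn.ModuleDict({t: nn.Conv2d(width, 1, 1) for t in tasks})

    def forward(self, x):
        feats = self.encoder(x)
        # Each head produces a per-pixel prediction map.
        return {t: head(feats) for t, head in self.heads.items()}

preds = MultiTaskDenseNet()(torch.randn(2, 3, 64, 64))
print({t: p.shape for t, p in preds.items()})
```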
no code implementations • 20 Jun 2024 • Zhongshen Zeng, Yinhong Liu, Yingjia Wan, Jingyao Li, Pengguang Chen, Jianbo Dai, Yuxuan Yao, Rongwu Xu, Zehan Qi, Wanru Zhao, Linling Shen, Jianqiao Lu, Haochen Tan, Yukang Chen, Hao Zhang, Zhan Shi, Bailin Wang, Zhijiang Guo, Jiaya Jia
Large language models (LLMs) have shown increasing capability in problem-solving and decision-making, largely based on step-by-step chain-of-thought reasoning.
no code implementations • 6 Jun 2024 • Jingyao Li, Pengguang Chen, Sitong Wu, Chuanyang Zheng, Hong Xu, Jiaya Jia
To address these limitations, the RoboCoder framework integrates Large Language Models (LLMs) with a dynamic learning system that uses real-time environmental feedback to continuously update and refine action codes.
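A hedged sketch of the refine-with-environment-feedback loop described above; `generate_action_code` and `execute_in_env` are hypothetical stand-ins, not RoboCoder's actual interfaces.

```python
def generate_action_code(task: str, previous_code: str, feedback: str) -> str:
    # Placeholder for an LLM call; wire to a real model client in practice.
    return f"# action code for {task!r}, revised after: {feedback!r}"

def execute_in_env(code: str) -> tuple[bool, str]:
    # Placeholder for running the code on a robot/simulator and returning
    # (success, error message) as real-time environmental feedback.
    return True, ""

def refine_action_code(task: str, max_rounds: int = 3) -> str:
    code, feedback = "", ""
    for _ in range(max_rounds):
        # Ask the LLM for (revised) action code, conditioning on feedback.
        code = generate_action_code(task, code, feedback)
        ok, feedback = execute_in_env(code)
        if ok:
            break  # the environment accepted the action code
    return code

print(refine_action_code("pick up the red block"))
```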
no code implementations • 22 Feb 2024 • Jingyao Li, Pengguang Chen, Xuan Ju, Hong Xu, Jiaya Jia
Our research aims to bridge the domain gap between natural and artificial scenarios with efficient tuning strategies.
1 code implementation • 5 Jan 2024 • Jingyao Li, Pengguang Chen, Shaozuo Yu, Shu Liu, Jiaya Jia
The crux of effective out-of-distribution (OOD) detection lies in acquiring a robust in-distribution (ID) representation, distinct from OOD samples.
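For context, a common post-hoc OOD baseline scores samples from classifier logits; the energy score below is one such baseline, not the method this paper proposes.

```python
import torch

def energy_ood_score(logits: torch.Tensor) -> torch.Tensor:
    # Lower energy means more ID-like; negate so higher score = more OOD.
    return -torch.logsumexp(logits, dim=-1)

logits = torch.randn(4, 1000)         # e.g., ImageNet classifier outputs
scores = energy_ood_score(logits)
is_ood = scores > scores.median()     # threshold chosen on validation data
```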
2 code implementations • 28 Dec 2023 • Zhongshen Zeng, Pengguang Chen, Shu Liu, Haiyun Jiang, Jiaya Jia
In this work, we introduce a novel evaluation paradigm for Large Language Models (LLMs) that compels them to transition from a traditional question-answering role, akin to a student, to a solution-scoring role, akin to a teacher.
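A minimal sketch of the grading-role idea: the model is asked to score a provided solution rather than answer the question itself. `ask_llm` and the prompt wording are hypothetical stand-ins, not the paper's evaluation harness.

```python
GRADER_PROMPT = """You are a teacher. Grade the student's solution.
Question: {question}
Student solution: {solution}
Reply with a score from 0 to 10 and locate the first incorrect step, if any."""

def ask_llm(prompt: str) -> str:
    # Placeholder; wire this to a real chat-completion client in practice.
    return "Score: 7. First error at step 3."

def grade_solution(question: str, solution: str) -> str:
    return ask_llm(GRADER_PROMPT.format(question=question, solution=solution))

print(grade_solution("What is 12 * 13?", "12 * 13 = 12 * 10 + 12 * 3 = 156"))
```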
1 code implementation • 26 Dec 2023 • Jingyao Li, Pengguang Chen, Shaozuo Yu, Shu Liu, Jiaya Jia
Experimental results demonstrate that, when labeling 80% of the samples, the performance of the current SOTA method declines by 0.74%, whereas our proposed BAL achieves performance comparable to training on the full dataset.
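For context, the standard active-learning skeleton selects the most uncertain unlabeled samples for annotation; the sketch below shows that generic baseline, not BAL's actual selection criterion.

```python
import torch

def select_for_labeling(model, unlabeled: torch.Tensor, budget: int):
    with torch.no_grad():
        probs = torch.softmax(model(unlabeled), dim=-1)
    # Least-confident first: low max-probability = high uncertainty.
    uncertainty = 1.0 - probs.max(dim=-1).values
    return torch.topk(uncertainty, k=budget).indices  # indices to label next

model = torch.nn.Linear(16, 10)       # stand-in classifier
pool = torch.randn(100, 16)           # unlabeled pool
print(select_for_labeling(model, pool, budget=8))
```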
1 code implementation • 26 Dec 2023 • Jingyao Li, Pengguang Chen, Bin Xia, Hong Xu, Jiaya Jia
Large Language Models (LLMs) have showcased impressive capabilities in handling straightforward programming tasks.
Ranked #3 on Code Generation on APPS
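Code-generation benchmarks such as APPS are typically scored by running generated programs against held-out tests. A minimal sketch of that check follows; the sample task, the `solve` entry point, and the tests are illustrative only, and `exec` should only ever run sandboxed code.

```python
def passes_tests(candidate_src: str, tests: list[tuple[tuple, object]]) -> bool:
    namespace: dict = {}
    exec(candidate_src, namespace)     # run generated code (sandbox this!)
    solve = namespace["solve"]
    return all(solve(*args) == expected for args, expected in tests)

generated = "def solve(a, b):\n    return a + b\n"
print(passes_tests(generated, [((1, 2), 3), ((0, 0), 0)]))  # True
```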
2 code implementations • 26 Oct 2023 • Shuai Yang, Zhifei Chen, Pengguang Chen, Xi Fang, Yixun Liang, Shu Liu, Yingcong Chen
Defect inspection is paramount within the closed-loop manufacturing system.
1 code implementation • 23 Aug 2023 • Baijiong Lin, Weisen Jiang, Feiyang Ye, Yu Zhang, Pengguang Chen, Ying-Cong Chen, Shu Liu, James T. Kwok
Multi-task learning (MTL), a learning paradigm to learn multiple related tasks simultaneously, has achieved great success in various fields.
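In MTL, the per-task losses must be aggregated into one training objective. The sketch below uses fixed weights for illustration only; the paper studies more careful balancing, which is not reproduced here.

```python
import torch

def mtl_loss(task_losses: dict[str, torch.Tensor],
             weights: dict[str, float] | None = None) -> torch.Tensor:
    # Uniform weights by default; balancing methods adapt these per step.
    weights = weights or {t: 1.0 for t in task_losses}
    return sum(weights[t] * loss for t, loss in task_losses.items())

losses = {"seg": torch.tensor(0.8), "depth": torch.tensor(1.6)}
print(mtl_loss(losses))  # tensor(2.4000)
```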
1 code implementation • 15 Apr 2023 • Jingyao Li, Pengguang Chen, Shengju Qian, Shu Liu, Jiaya Jia
Contrastive Language-Image Pre-training (CLIP) has recently shown great promise in pixel-level zero-shot learning tasks.
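For context, image-level zero-shot classification with CLIP works by comparing image and text embeddings; the paper extends CLIP to pixel-level tasks, which this sketch does not show. The image path is illustrative.

```python
# Requires: pip install git+https://github.com/openai/CLIP.git
import torch
import clip
from PIL import Image

model, preprocess = clip.load("ViT-B/32")
image = preprocess(Image.open("example.jpg")).unsqueeze(0)
text = clip.tokenize(["a photo of a cat", "a photo of a dog"])

with torch.no_grad():
    image_feat = model.encode_image(image)
    text_feat = model.encode_text(text)
    # Cosine similarity between the image and each text prompt.
    image_feat /= image_feat.norm(dim=-1, keepdim=True)
    text_feat /= text_feat.norm(dim=-1, keepdim=True)
    probs = (100.0 * image_feat @ text_feat.T).softmax(dim=-1)
print(probs)  # probability over the two prompts
```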
2 code implementations • CVPR 2023 • Jingyao Li, Pengguang Chen, Shaozuo Yu, Zexin He, Shu Liu, Jiaya Jia
The core of out-of-distribution (OOD) detection is to learn the in-distribution (ID) representation, which is distinguishable from OOD samples.
Ranked #12 on Out-of-Distribution Detection on ImageNet-1k vs Places (AUROC metric)
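This paper learns its ID representation with masked image modeling; below is only a toy MIM reconstruction objective to illustrate the idea, not the paper's full pipeline, and the one-layer model is a stand-in.

```python
import torch
import torch.nn.functional as F

def mim_loss(model, images: torch.Tensor, mask_ratio: float = 0.6):
    mask = (torch.rand_like(images[:, :1]) < mask_ratio).float()
    corrupted = images * (1 - mask)   # zero out masked regions
    recon = model(corrupted)          # model reconstructs the full image
    # Only the masked region contributes to the loss, as in MIM.
    return F.mse_loss(recon * mask, images * mask)

model = torch.nn.Conv2d(3, 3, 3, padding=1)  # stand-in encoder-decoder
print(mim_loss(model, torch.randn(2, 3, 32, 32)))
```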
no code implementations • 2 Mar 2022 • Yixin Chen, Zhuotao Tian, Pengguang Chen, Shu Liu, Jiaya Jia
We revisit the one- and two-stage detector distillation tasks and present a simple and efficient semantic-aware framework to fill the gap between them.
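A generic feature-distillation loss for detectors, shown only for context; the paper's semantic-aware treatment of foreground and background regions is not reproduced here.

```python
import torch
import torch.nn.functional as F

def feature_distill_loss(student_feat, teacher_feat, adapter):
    # Project student features into the teacher's channel dimension,
    # then match the two feature maps with an L2 loss.
    return F.mse_loss(adapter(student_feat), teacher_feat)

student = torch.randn(2, 128, 32, 32)
teacher = torch.randn(2, 256, 32, 32)
adapter = torch.nn.Conv2d(128, 256, kernel_size=1)
print(feature_distill_loss(student, teacher, adapter))
```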
1 code implementation • 15 Oct 2021 • Yinpeng Dong, Qi-An Fu, Xiao Yang, Wenzhao Xiang, Tianyu Pang, Hang Su, Jun Zhu, Jiayu Tang, Yuefeng Chen, Xiaofeng Mao, Yuan He, Hui Xue, Chao Li, Ye Liu, Qilong Zhang, Lianli Gao, Yunrui Yu, Xitong Gao, Zhe Zhao, Daquan Lin, Jiadong Lin, Chuanbiao Song, ZiHao Wang, Zhennan Wu, Yang Guo, Jiequan Cui, Xiaogang Xu, Pengguang Chen
Due to the vulnerability of deep neural networks (DNNs) to adversarial examples, a large number of defense techniques have been proposed to alleviate this problem in recent years.
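For context on what such defenses must withstand, FGSM below is the simplest adversarial-example attack; the benchmark itself evaluates far stronger attacks, and the linear model here is only a stand-in.

```python
import torch
import torch.nn.functional as F

def fgsm(model, x, y, eps=8 / 255):
    x = x.clone().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    # Perturb each pixel in the direction that increases the loss.
    return (x + eps * x.grad.sign()).clamp(0, 1).detach()

model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
x_adv = fgsm(model, torch.rand(4, 3, 32, 32), torch.tensor([0, 1, 2, 3]))
```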
1 code implementation • ICCV 2021 • Yixin Chen, Pengguang Chen, Shu Liu, LiWei Wang, Jiaya Jia
Effectively structuring deep knowledge plays a pivotal role in transfer from teacher to student, especially in semantic vision tasks.
no code implementations • 30 Aug 2021 • Pengguang Chen, Yixin Chen, Shu Liu, MingChang Yang, Jiaya Jia
We analyze the reason behind this phenomenon and propose a novel irregular patch embedding module and an adaptive patch fusion module to improve performance.
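For reference, standard ViT patch embedding slices the image into a regular grid, the baseline this entry improves upon; the proposed irregular and adaptive modules are not reproduced here.

```python
import torch
import torch.nn as nn

class PatchEmbed(nn.Module):
    def __init__(self, patch=16, in_ch=3, dim=768):
        super().__init__()
        # A strided conv slices the image into fixed square patches.
        self.proj = nn.Conv2d(in_ch, dim, kernel_size=patch, stride=patch)

    def forward(self, x):                               # (B, C, H, W)
        return self.proj(x).flatten(2).transpose(1, 2)  # (B, N_patches, dim)

tokens = PatchEmbed()(torch.randn(1, 3, 224, 224))
print(tokens.shape)  # torch.Size([1, 196, 768])
```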
7 code implementations • CVPR 2021 • Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia
Knowledge distillation transfers knowledge from a teacher network to a student network, with the goal of greatly improving the student's performance.
Ranked #13 on Knowledge Distillation on CIFAR-100
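For context, the classic Hinton-style distillation loss is sketched below; this paper's review mechanism over multi-level features goes beyond this logit-matching baseline.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft targets: match temperature-softened teacher distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)                        # rescale gradients by T^2
    hard = F.cross_entropy(student_logits, labels)  # ground-truth term
    return alpha * soft + (1 - alpha) * hard

s, t = torch.randn(8, 100), torch.randn(8, 100)
print(kd_loss(s, t, torch.randint(0, 100, (8,))))
```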
1 code implementation • CVPR 2021 • Pengguang Chen, Shu Liu, Jiaya Jia
It is even comparable to contrastive learning methods when only half of the training batches are used.
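For comparison context, the contrastive baselines mentioned above typically optimize an InfoNCE objective like the one sketched below; it is not the method this paper proposes.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.T / temperature        # pairwise similarities
    targets = torch.arange(z1.size(0))      # positives on the diagonal
    return F.cross_entropy(logits, targets)

print(info_nce(torch.randn(16, 128), torch.randn(16, 128)))
```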
7 code implementations • 13 Jan 2020 • Pengguang Chen, Shu Liu, Hengshuang Zhao, Xingquan Wang, Jiaya Jia
Then we show the limitations of existing information-dropping algorithms and propose our structured method, which is simple yet very effective.
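A simplified grid-style structured mask in the spirit of this paper; the official GridMask implementation also adds rotation and ratio scheduling, which this sketch omits.

```python
import torch

def grid_mask(images: torch.Tensor, unit: int = 16, drop: int = 8):
    _, _, h, w = images.shape
    ys = (torch.arange(h) % unit) < drop
    xs = (torch.arange(w) % unit) < drop
    dropped = ys.view(-1, 1) & xs.view(1, -1)  # one square per grid cell
    mask = (~dropped).float()                  # 1 = keep, 0 = drop
    # Structured dropping: erase a regular grid of squares from the input.
    return images * mask

out = grid_mask(torch.randn(2, 3, 64, 64))
```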