Search Results for author: Fu-Yun Wang

Found 8 papers, 7 papers with code

Rethinking the Spatial Inconsistency in Classifier-Free Diffusion Guidance

2 code implementations • 8 Apr 2024 • Dazhong Shen, Guanglu Song, Zeyue Xue, Fu-Yun Wang, Yu Liu

Classifier-Free Guidance (CFG) has been widely used in text-to-image diffusion models, where the CFG scale is introduced to control the strength of text guidance on the whole image space.

Denoising • Semantic Segmentation
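
For reference, a minimal sketch of how a CFG scale is typically applied at each denoising step. This is the standard global formulation the abstract refers to, not the paper's spatially adaptive variant; the function and argument names are illustrative only.

```python
import torch

def classifier_free_guidance(eps_cond: torch.Tensor,
                             eps_uncond: torch.Tensor,
                             cfg_scale: float) -> torch.Tensor:
    """Standard CFG update: push the conditional noise prediction away from
    the unconditional one by a single scale applied to the whole image."""
    return eps_uncond + cfg_scale * (eps_cond - eps_uncond)
```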

Be-Your-Outpainter: Mastering Video Outpainting through Input-Specific Adaptation

1 code implementation • 20 Mar 2024 • Fu-Yun Wang, Xiaoshi Wu, Zhaoyang Huang, Xiaoyu Shi, Dazhong Shen, Guanglu Song, Yu Liu, Hongsheng Li

We introduce MOTIA (Mastering Video Outpainting Through Input-Specific Adaptation), a diffusion-based pipeline that leverages both the intrinsic data-specific patterns of the source video and the image/video generative prior for effective outpainting.

Gen-L-Video: Multi-Text to Long Video Generation via Temporal Co-Denoising

1 code implementation • 29 May 2023 • Fu-Yun Wang, Wenshuo Chen, Guanglu Song, Han-Jia Ye, Yu Liu, Hongsheng Li

To address this challenge, we introduce a novel paradigm dubbed Gen-L-Video, which extends off-the-shelf short-video diffusion models to generate and edit videos comprising hundreds of frames with diverse semantic segments, without additional training and while preserving content consistency.

Denoising • Image Generation +2
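
As a rough illustration of the temporal co-denoising idea (a hedged sketch of the general scheme, not the released implementation): a long latent video is split into overlapping short clips, each clip is denoised by an off-the-shelf short-video model, and overlapping frame predictions are averaged back together. The `denoise_clip` callable stands in for such a model; all names are illustrative.

```python
import torch

def co_denoise_step(latents: torch.Tensor, denoise_clip,
                    clip_len: int = 16, stride: int = 8) -> torch.Tensor:
    """One co-denoising step over a long latent video of shape (T, C, H, W)."""
    T = latents.shape[0]
    merged = torch.zeros_like(latents)
    counts = torch.zeros(T, 1, 1, 1, device=latents.device)
    for start in range(0, max(T - clip_len, 0) + 1, stride):
        clip = latents[start:start + clip_len]
        denoised = denoise_clip(clip)            # short-video model on one window
        merged[start:start + clip_len] += denoised
        counts[start:start + clip_len] += 1
    # Average the predictions where windows overlap.
    return merged / counts.clamp(min=1)
```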

Forward Compatible Few-Shot Class-Incremental Learning

1 code implementation • CVPR 2022 • Da-Wei Zhou, Fu-Yun Wang, Han-Jia Ye, Liang Ma, ShiLiang Pu, De-Chuan Zhan

Forward compatibility requires future new classes to be easily incorporated into the current model based on the current stage data, and we seek to realize it by reserving embedding space for future new classes.

Few-Shot Class-Incremental Learning • Incremental Learning
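
One way to read "reserving embedding space": the classifier is built with extra prototype slots that no current class occupies, so embeddings of future classes can later fill them without crowding the old ones. The snippet below is a hedged sketch of that general idea under these assumptions, not the paper's exact training objective; class and parameter names are illustrative.

```python
import torch
import torch.nn as nn

class ForwardCompatibleHead(nn.Module):
    """Cosine classifier with prototypes for current classes plus
    placeholder slots reserved for classes that have not arrived yet."""

    def __init__(self, feat_dim: int, num_base_classes: int, num_reserved: int):
        super().__init__()
        self.prototypes = nn.Parameter(
            torch.randn(num_base_classes + num_reserved, feat_dim))

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between normalized features and all prototypes,
        # including the reserved ones that keep embedding space free.
        f = nn.functional.normalize(features, dim=-1)
        p = nn.functional.normalize(self.prototypes, dim=-1)
        return f @ p.t()
```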

PyCIL: A Python Toolbox for Class-Incremental Learning

1 code implementation • 23 Dec 2021 • Da-Wei Zhou, Fu-Yun Wang, Han-Jia Ye, De-Chuan Zhan

Traditional machine learning systems are deployed under the closed-world setting, which requires the entire training data to be available before the offline training process begins.

BIG-bench Machine Learning • Class Incremental Learning +1
