Search Results for author: Yaxing Wang

Found 23 papers, 16 papers with code

Attracting and Dispersing: A Simple Approach for Source-free Domain Adaptation

1 code implementation • 9 May 2022 • Shiqi Yang, Yaxing Wang, Kai Wang, Shangling Jui, Joost Van de Weijer

Treating SFDA as an unsupervised clustering problem and following the intuition that local neighbors in feature space should have more similar predictions than other features, we propose to optimize an objective of prediction consistency.

Domain Adaptation
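
As a rough illustration of the prediction-consistency idea in this entry (a sketch, not the authors' released code), one could attract each target sample's prediction towards those of its nearest feature-space neighbours and disperse it from all other features; the memory bank of target features and predictions below is an assumed implementation detail.

```python
# Hedged sketch of a neighborhood prediction-consistency objective
# (attract k nearest feature-space neighbours, disperse from the rest),
# assuming PyTorch; the memory-bank interface is an illustrative assumption.
import torch
import torch.nn.functional as F

def consistency_loss(feats, logits, bank_feats, bank_probs, k=3, lam=1.0):
    """feats/logits: current target batch; bank_*: memory bank of target features/predictions."""
    probs = F.softmax(logits, dim=1)                                      # (B, C)
    sims = F.normalize(feats, dim=1) @ F.normalize(bank_feats, dim=1).T   # (B, N)
    nn_probs = bank_probs[sims.topk(k, dim=1).indices]                    # (B, k, C)
    attract = -(probs.unsqueeze(1) * nn_probs).sum(-1).mean()             # pull neighbours' predictions together
    disperse = (probs @ bank_probs.T).mean()                              # push away from other features
    return attract + lam * disperse
```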

A Novel Framework for Image-to-image Translation and Image Compression

no code implementations • 25 Nov 2021 • Fei Yang, Yaxing Wang, Luis Herranz, Yongmei Cheng, Mikhail Mozerov

Thus, we further propose a unified framework that allows both translation and autoencoding capabilities in a single codec.

Image Compression • Image Restoration • +4

Exploiting the Intrinsic Neighborhood Structure for Source-free Domain Adaptation

1 code implementation NeurIPS 2021 Shiqi Yang, Yaxing Wang, Joost Van de Weijer, Luis Herranz, Shangling Jui

In this paper, we address the challenging source-free domain adaptation (SFDA) problem, where the source pretrained model is adapted to the target domain in the absence of source data.

Domain Adaptation

Distilling GANs with Style-Mixed Triplets for X2I Translation with Limited Data

no code implementations ICLR 2022 Yaxing Wang, Joost Van de Weijer, Lu Yu, Shangling Jui

Therefore, we investigate knowledge distillation to transfer knowledge from a high-quality unconditioned generative model (e.g., StyleGAN) to conditioned synthetic image generation modules in a variety of systems.

Image Generation • Knowledge Distillation • +2
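
The paper's style-mixed triplet scheme is not reproduced here; the sketch below only illustrates plain response-level distillation from a frozen teacher generator to a student generator, assuming PyTorch, with `teacher` and `student` as hypothetical models.

```python
# Generic generator-distillation sketch (NOT the paper's style-mixed triplet
# method); the frozen teacher provides targets the student tries to reproduce.
import torch
import torch.nn.functional as F

def distill_step(student, teacher, optimizer, batch_size=8, z_dim=512):
    teacher.eval()
    z = torch.randn(batch_size, z_dim)
    with torch.no_grad():
        target = teacher(z)                # images from the high-quality teacher
    loss = F.l1_loss(student(z), target)   # pixel-level response distillation
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```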

Generalized Source-free Domain Adaptation

1 code implementation ICCV 2021 Shiqi Yang, Yaxing Wang, Joost Van de Weijer, Luis Herranz, Shangling Jui

In this paper, we propose a new domain adaptation paradigm called Generalized Source-free Domain Adaptation (G-SFDA), where the learned model needs to perform well on both the target and source domains, with only access to current unlabeled target data during adaptation.

Domain Adaptation

MineGAN++: Mining Generative Models for Efficient Knowledge Transfer to Limited Data Domains

1 code implementation • 28 Apr 2021 • Yaxing Wang, Abel Gonzalez-Garcia, Chenshen Wu, Luis Herranz, Fahad Shahbaz Khan, Shangling Jui, Joost Van de Weijer

Therefore, we propose a novel knowledge transfer method for generative models based on mining the knowledge that is most beneficial to a specific target domain, either from a single or multiple pretrained GANs.

Transfer Learning

DeepI2I: Enabling Deep Hierarchical Image-to-Image Translation by Transferring from GANs

1 code implementation NeurIPS 2020 Yaxing Wang, Lu Yu, Joost Van de Weijer

To enable the training of deep I2I models on small datasets, we propose a novel transfer learning method that transfers knowledge from pre-trained GANs.

Image-to-Image Translation • Transfer Learning • +1
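
As a hedged sketch of that transfer idea (initializing a deep I2I model from a pretrained GAN), not the authors' exact procedure; module and checkpoint names below are hypothetical.

```python
# Hypothetical warm-start of an I2I model from a pretrained GAN, assuming
# PyTorch and architecturally compatible modules; names/paths are illustrative.
import torch

def init_i2i_from_gan(i2i_model, gan_ckpt_path="pretrained_gan.pt"):
    ckpt = torch.load(gan_ckpt_path, map_location="cpu")
    # The decoder reuses the pretrained generator's weights, the encoder the
    # discriminator's; strict=False tolerates layers that do not match exactly.
    i2i_model.decoder.load_state_dict(ckpt["generator"], strict=False)
    i2i_model.encoder.load_state_dict(ckpt["discriminator"], strict=False)
    return i2i_model
```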

Casting a BAIT for Offline and Online Source-free Domain Adaptation

1 code implementation • 23 Oct 2020 • Shiqi Yang, Yaxing Wang, Joost Van de Weijer, Luis Herranz, Shangling Jui

When adapting to the target domain, the additional classifier, initialized from the source classifier, is expected to find misclassified features.

Unsupervised Domain Adaptation
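
The exact BAIT procedure is not reproduced here; as a loose illustration of the two-classifier idea (closer to a generic classifier-discrepancy scheme), one could clone the source classifier and flag target samples on which the two heads disagree. All names below are assumptions.

```python
# Generic two-classifier disagreement sketch (not the paper's exact method),
# assuming PyTorch; `backbone`, `src_head`, `bait_head` are illustrative names.
import copy
import torch
import torch.nn.functional as F

def disagreement_mask(backbone, src_head, bait_head, x, thresh=0.5):
    """Flag target samples on which the anchor and bait classifiers disagree."""
    feats = backbone(x)
    p_anchor = F.softmax(src_head(feats), dim=1)
    p_bait = F.softmax(bait_head(feats), dim=1)
    gap = (p_anchor - p_bait).abs().sum(dim=1)   # per-sample prediction gap
    return gap > thresh                          # likely misclassified features

# The bait classifier starts as a copy of the source classifier:
# bait_head = copy.deepcopy(src_head)
```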

GANwriting: Content-Conditioned Generation of Styled Handwritten Word Images

3 code implementations ECCV 2020 Lei Kang, Pau Riba, Yaxing Wang, Marçal Rusiñol, Alicia Fornés, Mauricio Villegas

We propose a novel method that is able to produce credible handwritten word images by conditioning the generative process with both calligraphic style features and textual content.

Handwritten Word Generation
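
As a very rough sketch of that conditioning (not the released GANwriting code), a generator could consume a style code extracted from a few handwriting samples together with an embedding of the target text; all modules below are hypothetical.

```python
# Hypothetical style+content conditional generation, assuming PyTorch;
# StyleEncoder, TextEmbedder and Generator are illustrative modules.
import torch

def generate_word(style_encoder, text_embedder, generator, style_imgs, char_ids):
    style = style_encoder(style_imgs)      # calligraphic style from a few word images
    content = text_embedder(char_ids)      # embedding of the textual content (character ids)
    z = torch.randn(style.size(0), 128)    # noise for diversity
    return generator(torch.cat([style, content, z], dim=1))
```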

MineGAN: effective knowledge transfer from GANs to target domains with few images

2 code implementations CVPR 2020 Yaxing Wang, Abel Gonzalez-Garcia, David Berga, Luis Herranz, Fahad Shahbaz Khan, Joost Van de Weijer

We propose a novel knowledge transfer method for generative models based on mining the knowledge that is most beneficial to a specific target domain, either from a single or multiple pretrained GANs.

Transfer Learning
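
A minimal sketch of the mining idea (an illustration, not the released MineGAN code): a small miner network steers the latent input of a frozen pretrained generator towards the target domain, and only the miner (plus a discriminator on target data, omitted here) is trained.

```python
# Hypothetical mining sketch, assuming PyTorch; `g_frozen` is a pretrained
# generator whose weights stay fixed while the miner is trained adversarially.
import torch
import torch.nn as nn

class Miner(nn.Module):
    """Maps input noise to the region of the frozen GAN's latent space
    that is most useful for the target domain."""
    def __init__(self, z_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim, z_dim), nn.ReLU(),
            nn.Linear(z_dim, z_dim),
        )

    def forward(self, z):
        return self.net(z)

def sample_target_like(miner, g_frozen, batch=16, z_dim=128):
    # The generator's parameters are frozen, but gradients still reach the miner.
    for p in g_frozen.parameters():
        p.requires_grad_(False)
    return g_frozen(miner(torch.randn(batch, z_dim)))
```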

Controlling biases and diversity in diverse image-to-image translation

no code implementations • 23 Jul 2019 • Yaxing Wang, Abel Gonzalez-Garcia, Joost Van de Weijer, Luis Herranz

The task of unpaired image-to-image translation is highly challenging due to the lack of explicit cross-domain pairs of instances.

Image-to-Image Translation • Translation

Memory Replay GANs: Learning to Generate New Categories without Forgetting

1 code implementation NeurIPS 2018 Chenshen Wu, Luis Herranz, Xialei Liu, Yaxing Wang, Joost Van de Weijer, Bogdan Raducanu

In particular, we investigate generative adversarial networks (GANs) in the task of learning new categories in a sequential fashion.

Memory Replay GANs: learning to generate images from new categories without forgetting

1 code implementation • 6 Sep 2018 • Chenshen Wu, Luis Herranz, Xialei Liu, Yaxing Wang, Joost Van de Weijer, Bogdan Raducanu

In particular, we investigate generative adversarial networks (GANs) in the task of learning new categories in a sequential fashion.
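
As a hedged sketch of the replay idea behind the two entries above (not the authors' implementation), a frozen copy of the earlier generator can synthesize "memories" of previously learned categories, which are mixed with real data for the new category; names below are assumptions.

```python
# Hypothetical memory-replay batch construction, assuming PyTorch and a
# class-conditional generator; `old_generator` is a frozen pre-task copy.
import torch

def build_replay_batch(old_generator, past_labels, real_x, real_y, z_dim=128):
    """Mix real data for the new category with replayed samples of old ones."""
    n = real_x.size(0)
    y_old = past_labels[torch.randint(len(past_labels), (n,))]
    with torch.no_grad():
        replay_x = old_generator(torch.randn(n, z_dim), y_old)
    return torch.cat([real_x, replay_x]), torch.cat([real_y, y_old])
```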

Transferring GANs: generating images from limited data

1 code implementation ECCV 2018 Yaxing Wang, Chenshen Wu, Luis Herranz, Joost Van de Weijer, Abel Gonzalez-Garcia, Bogdan Raducanu

Transferring the knowledge of pretrained networks to new domains by means of finetuning is a widely used practice for applications based on discriminative models.

Domain Adaptation • Image Generation
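
A minimal sketch of the finetuning recipe implied here (not the paper's code): load a generator and discriminator pretrained on a large source dataset and continue ordinary adversarial training on the small target dataset. The loss choice and names are assumptions.

```python
# Hypothetical GAN-finetuning step, assuming PyTorch; the non-saturating
# loss and checkpoint names are illustrative choices.
import torch
import torch.nn.functional as F

def finetune_step(G, D, target_batch, opt_g, opt_d, z_dim=128):
    fake = G(torch.randn(target_batch.size(0), z_dim))

    # Discriminator update on the small target dataset
    d_loss = F.softplus(-D(target_batch)).mean() + F.softplus(D(fake.detach())).mean()
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator update
    g_loss = F.softplus(-D(fake)).mean()
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Both networks start from source-pretrained weights, e.g.
# G.load_state_dict(torch.load("source_G.pt")); D.load_state_dict(torch.load("source_D.pt"))
```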

Mix and match networks: encoder-decoder alignment for zero-pair image translation

1 code implementation CVPR 2018 Yaxing Wang, Joost Van de Weijer, Luis Herranz

We address the problem of image translation between domains or modalities for which no direct paired data is available (i.e., zero-pair translation).

Colorization • Semantic Segmentation • +2
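
As a loose sketch of the zero-pair idea (not the authors' implementation): encoders and decoders trained on different seen pairs share an aligned latent space, so an unseen pairing is handled by recombining them. The depth-to-segmentation example and module names are assumptions.

```python
# Hypothetical zero-pair translation, assuming PyTorch modules whose latent
# spaces were aligned while training on the *seen* domain pairs.
import torch

def zero_pair_translate(enc_depth, dec_segmentation, depth_img):
    """Depth -> semantic segmentation without paired depth/segmentation data."""
    with torch.no_grad():
        latent = enc_depth(depth_img)       # encoder trained on a seen pair (e.g. depth<->RGB)
        return dec_segmentation(latent)     # decoder trained on another seen pair (e.g. RGB<->segmentation)
```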

Ensembles of Generative Adversarial Networks

no code implementations • 3 Dec 2016 • Yaxing Wang, Lichao Zhang, Joost Van de Weijer

The first one is based on the fact that, in the minimax game played to optimize the GAN objective, the generator network keeps changing even after the network can be considered optimal.
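
As a minimal illustration of that observation (not the paper's code), a self-ensemble can be built from generator snapshots saved at different training iterations and sampled from uniformly; the factory function and checkpoint paths are assumptions.

```python
# Hypothetical generator-snapshot ensemble, assuming PyTorch; `make_generator`
# and the checkpoint list are illustrative.
import random
import torch

def sample_from_snapshot_ensemble(make_generator, ckpt_paths, n=16, z_dim=128):
    """Draw samples from a randomly chosen generator snapshot per call."""
    G = make_generator()
    G.load_state_dict(torch.load(random.choice(ckpt_paths), map_location="cpu"))
    G.eval()
    with torch.no_grad():
        return G(torch.randn(n, z_dim))
```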
