no code implementations • 12 Apr 2022 • Yuan Tian, Klaus-Rudolf Kladny, Qin Wang, Zhiwu Huang, Olga Fink
In this paper, we propose to exploit the fact that the agents seek to improve their expected cumulative reward, and introduce a novel Time Dynamical Opponent Model (TDOM) to encode the knowledge that the opponent policies tend to improve over time.
1 code implementation • 25 Mar 2022 • Qin Wang, Olga Fink, Luc van Gool, Dengxin Dai
However, real-world machine perception systems are running in non-stationary and continually changing environments where the target domain distribution can change over time.
no code implementations • 13 Mar 2022 • Feiyu Wang, Qin Wang, Wen Li, Dong Xu, Luc van Gool
Benefiting from this new perspective, we first propose a new deep semi-supervised learning framework, Semi-supervised Learning by Empirical Distribution Alignment (SLEDA), in which existing techniques from the domain adaptation community can be readily used to address semi-supervised learning by reducing the empirical distribution distance between labeled and unlabeled data.
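To illustrate what aligning the empirical distributions of labeled and unlabeled data can look like in practice, here is a minimal sketch that penalizes a Gaussian-kernel maximum mean discrepancy (MMD) between the two feature batches; the function names and the choice of MMD are illustrative assumptions, not the paper's actual SLEDA implementation.

```python
import torch

def gaussian_mmd(x, y, sigma=1.0):
    """Squared MMD between two feature batches with a Gaussian kernel.

    x, y: (n, d) and (m, d) feature tensors, e.g. penultimate-layer outputs
    of the classifier on a labeled batch and an unlabeled batch.
    """
    def kernel(a, b):
        # Pairwise squared Euclidean distances, then a Gaussian kernel.
        d2 = torch.cdist(a, b) ** 2
        return torch.exp(-d2 / (2 * sigma ** 2))

    k_xx = kernel(x, x).mean()
    k_yy = kernel(y, y).mean()
    k_xy = kernel(x, y).mean()
    return k_xx + k_yy - 2 * k_xy

# Hypothetical training step: supervised loss on labeled data plus an
# alignment penalty that shrinks the labeled/unlabeled feature gap.
# loss = ce_loss(logits_l, targets_l) + lam * gaussian_mmd(feat_l, feat_u)
```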
1 code implementation • 28 Aug 2021 • Lukas Hoyer, Dengxin Dai, Qin Wang, Yuhua Chen, Luc van Gool
Training deep networks for semantic segmentation requires large amounts of labeled training data, which presents a major challenge in practice, as labeling segmentation masks is a highly labor-intensive process.
no code implementations • 5 Aug 2021 • Qin Wang, Hui Che, Weizhen Ding, Li Xiang, Guanbin Li, Zhen Li, Shuguang Cui
Thus, we propose a novel teacher-student framework for accurate colorectal polyp classification (CPC) that directly uses white-light (WL) colonoscopy images during the examination.
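As an illustration of the teacher-student idea, the sketch below shows a generic knowledge-distillation objective in which a teacher's softened predictions guide a student that sees only WL images; the loss form, temperature, and weighting are assumptions for illustration, not the paper's exact design.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    """Generic teacher-student loss: hard-label cross-entropy on the student
    plus a KL term toward the teacher's temperature-softened predictions."""
    ce = F.cross_entropy(student_logits, targets)
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * ce + (1 - alpha) * kl
```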
no code implementations • 5 Aug 2021 • Qin Wang, Jun Wei, Boyuan Wang, Zhen Li, Sheng Wang, Shuguang Cui
Protein secondary structure prediction (PSSP) is essential for protein function analysis.
1 code implementation • CVPR 2021 • Jun Wei, Qin Wang, Zhen Li, Sheng Wang, S. Kevin Zhou, Shuguang Cui
In practice, our SPOL model first generates the CAMs through a novel element-wise multiplication of shallow and deep feature maps, which filters out background noise and robustly produces sharper boundaries.
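A minimal sketch of how such a fused CAM could be computed, assuming the shallow and deep feature maps already share a channel dimension (any projection layers and the paper's exact fusion details are omitted):

```python
import torch
import torch.nn.functional as F

def fused_cam(shallow_feat, deep_feat, classifier_weight):
    """Sketch of a CAM built from the element-wise product of shallow and
    deep feature maps.

    shallow_feat: (B, C, Hs, Ws) early-layer features with sharp boundaries
    deep_feat:    (B, C, Hd, Wd) late-layer features with semantic content
    classifier_weight: (num_classes, C) weights of the final linear layer
    """
    # Upsample the deep features to the shallow resolution and fuse.
    deep_up = F.interpolate(deep_feat, size=shallow_feat.shape[-2:],
                            mode="bilinear", align_corners=False)
    fused = shallow_feat * deep_up          # element-wise multiplication
    # Standard CAM step: weight the fused channels by the class weights.
    cams = torch.einsum("bchw,kc->bkhw", fused, classifier_weight)
    return torch.relu(cams)
```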
1 code implementation • 5 Jul 2021 • Qin Wang, Cees Taal, Olga Fink
In this paper, we aim to overcome this limitation by integrating expert knowledge with domain adaptation in a synthetic-to-real framework for unsupervised fault diagnosis.
1 code implementation • ICCV 2021 • Qin Wang, Dengxin Dai, Lukas Hoyer, Luc van Gool, Olga Fink
However, such supervision is not always available.
Ranked #8 on Domain Adaptation on SYNTHIA-to-Cityscapes (using extra training data)
no code implementations • 24 Feb 2021 • Kun Liu, Tongjun Liu, Wei Fang, Jian Li, Qin Wang
Quantum correlation is a fundamental property which distinguishes quantum systems from classical ones, and it is also a fragile resource under projective measurement.
Quantum Physics
no code implementations • 1 Feb 2021 • Qin Wang, Rujia Li
We apply this consensus algorithm to construct a high-performance blockchain system, called Sphinx.
Distributed, Parallel, and Cluster Computing
1 code implementation • ECCV 2020 • Yuan Tian, Qin Wang, Zhiwu Huang, Wen Li, Dengxin Dai, Minghao Yang, Jun Wang, Olga Fink
In this paper, we introduce a new reinforcement learning (RL) based neural architecture search (NAS) methodology for effective and efficient generative adversarial network (GAN) architecture search.
Ranked #9 on Image Generation on STL-10
no code implementations • 5 May 2020 • Olga Fink, Qin Wang, Markus Svensén, Pierre Dersin, Wan-Jui Lee, Melanie Ducoffe
Deep learning applications have been thriving over the last decade in many different domains, including computer vision and natural language understanding.
3 code implementations • 7 Jan 2020 • Qin Wang, Gabriel Michau, Olga Fink
We demonstrate in this paper that the performance of domain adversarial methods can be vulnerable to an incomplete target label space during training.
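For context, a typical domain adversarial setup places a gradient reversal layer between the feature extractor and a domain discriminator; the sketch below shows this standard DANN-style component (not the paper's own code), with a comment on why aligning full marginals can fail when the target label space is incomplete.

```python
import torch

class GradReverse(torch.autograd.Function):
    """Gradient reversal layer used in domain-adversarial training.
    Forward is the identity; the backward pass flips the gradient sign, so
    the feature extractor is pushed toward domain-invariant features while
    a discriminator tries to tell source from target."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

def grad_reverse(x, lam=1.0):
    return GradReverse.apply(x, lam)

# domain_logits = discriminator(grad_reverse(features, lam))
# If the target label space is only a subset of the source classes, forcing
# the full marginal distributions to match can drag source-only classes onto
# target data, which is the vulnerability studied in the paper.
```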
1 code implementation • 17 Dec 2019 • Israel Goytom, Qin Wang, Tianxiang Yu, Kunjie Dai, Kris Sankaran, Xinfei Zhou, Dongdong Lin
Microscopy images are powerful tools that are widely used across research areas such as biology, chemistry, physics, and materials science, and are acquired with a variety of microscopes (scanning electron microscope (SEM), atomic force microscope (AFM), optical microscope, etc.).
no code implementations • 17 Nov 2019 • Kunjin Chen, Yu Zhang, Qin Wang, Jun Hu, Hang Fan, Jinliang He
Non-intrusive load monitoring addresses the challenging task of decomposing the aggregate signal of a household's electricity consumption into appliance-level data without installing dedicated meters.
1 code implementation • ICCV 2019 • Qin Wang, Wen Li, Luc van Gool
We reveal that an essential sampling bias exists in semi-supervised learning due to the limited number of labeled samples, which often leads to a considerable empirical distribution mismatch between labeled data and unlabeled data.
no code implementations • 15 May 2019 • Qin Wang, Gabriel Michau, Olga Fink
Thanks to the digitization of industrial assets in fleets, the ambitious goal of transferring fault diagnosis models from one machine to another has raised great interest.
no code implementations • 6 Jun 2018 • Kunjin Chen, Qin Wang, Ziyu He, Kunlong Chen, Jun Hu, Jinliang He
A convolutional sequence-to-sequence non-intrusive load monitoring model is proposed in this paper.
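A minimal sketch of a convolutional sequence-to-sequence NILM mapping, with illustrative layer sizes that are not taken from the paper: a window of aggregate mains readings goes in, and a same-length estimate of one appliance's consumption comes out.

```python
import torch
import torch.nn as nn

class ConvSeq2Seq(nn.Module):
    """Minimal convolutional seq2seq NILM sketch: maps a window of aggregate
    power readings to the estimated power of one target appliance over the
    same window."""

    def __init__(self, hidden=64, kernel=9):
        super().__init__()
        pad = kernel // 2
        self.net = nn.Sequential(
            nn.Conv1d(1, hidden, kernel, padding=pad), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel, padding=pad), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel, padding=pad), nn.ReLU(),
            nn.Conv1d(hidden, 1, 1),   # per-time-step appliance estimate
        )

    def forward(self, aggregate):
        # aggregate: (batch, window_length) mains readings
        return self.net(aggregate.unsqueeze(1)).squeeze(1)

# Usage: y_hat = ConvSeq2Seq()(torch.randn(8, 512))  # (8, 512) appliance trace
```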
1 code implementation • 30 May 2018 • Kunjin Chen, Kunlong Chen, Qin Wang, Ziyu He, Jun Hu, Jinliang He
We present in this paper a model for forecasting short-term power loads based on deep residual networks.
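As a rough illustration of a residual forecaster, the sketch below stacks fully-connected residual blocks to map recent load (and optional exogenous) features to a next-day hourly profile; the block structure and dimensions are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """One fully-connected residual block with an identity shortcut."""

    def __init__(self, dim):
        super().__init__()
        self.fc1 = nn.Linear(dim, dim)
        self.fc2 = nn.Linear(dim, dim)

    def forward(self, x):
        h = torch.relu(self.fc1(x))
        return torch.relu(x + self.fc2(h))

class ResidualForecaster(nn.Module):
    """Maps a vector of recent loads plus optional calendar/temperature
    features to a next-day hourly load profile."""

    def __init__(self, in_dim, hidden=128, depth=4, horizon=24):
        super().__init__()
        self.inp = nn.Linear(in_dim, hidden)
        self.blocks = nn.Sequential(*[ResidualBlock(hidden) for _ in range(depth)])
        self.out = nn.Linear(hidden, horizon)

    def forward(self, x):
        return self.out(self.blocks(torch.relu(self.inp(x))))
```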
no code implementations • 13 Feb 2018 • Qin Wang, Sandro Schoenborn, Yvonne-Anne Pignolet, Theo Widmer, Carsten Franke
Currently, engineers at substation service providers match customer data with the corresponding internally used signal names manually.
no code implementations • ECCV 2018 • Wonmin Byeon, Qin Wang, Rupesh Kumar Srivastava, Petros Koumoutsakos
Video prediction models based on convolutional networks, recurrent networks, and their combinations often result in blurry predictions.