Search Results for author: Kang Zhang

Found 20 papers, 6 papers with code

Semi-Supervised Video Inpainting with Cycle Consistency Constraints

no code implementations • 14 Aug 2022 • Zhiliang Wu, Hanyu Xuan, Changchang Sun, Kang Zhang, Yan Yan

Specifically, in this work, we propose an end-to-end trainable framework consisting of a completion network and a mask prediction network, which are designed, respectively, to fill in the corrupted contents of the current frame using the known mask and to decide which regions of the next frame should be filled.
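
A minimal sketch of the alternating loop this describes, with hypothetical `CompletionNet` and `MaskPredictionNet` stand-ins (module names, sizes, and architectures here are assumptions for illustration, not the paper's networks):

```python
import torch
import torch.nn as nn

class CompletionNet(nn.Module):
    """Hypothetical stand-in: fills the masked regions of a frame."""
    def __init__(self):
        super().__init__()
        # frame (3 ch) + mask (1 ch) -> completed frame (3 ch)
        self.net = nn.Sequential(nn.Conv2d(4, 16, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(16, 3, 3, padding=1))
    def forward(self, frame, mask):
        return self.net(torch.cat([frame, mask], dim=1))

class MaskPredictionNet(nn.Module):
    """Hypothetical stand-in: predicts which regions of the next frame to fill."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Conv2d(4, 16, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid())
    def forward(self, completed_frame, mask):
        return self.net(torch.cat([completed_frame, mask], dim=1))

completion_net, mask_net = CompletionNet(), MaskPredictionNet()
frames = torch.rand(5, 3, 64, 64)                         # a toy clip of 5 frames
mask = torch.zeros(1, 1, 64, 64); mask[..., 16:32, 16:32] = 1  # known mask for frame 0

outputs = []
for t in range(frames.shape[0]):
    frame = frames[t:t + 1]
    completed = completion_net(frame, mask)   # complete the corrupted contents
    outputs.append(completed)
    mask = mask_net(completed, mask)          # decide the regions to fill in the next frame
```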

Video Inpainting

On the Pros and Cons of Momentum Encoder in Self-Supervised Visual Representation Learning

no code implementations • 11 Aug 2022 • Trung Pham, Chaoning Zhang, Axi Niu, Kang Zhang, Chang D. Yoo

Exponential Moving Average (EMA or momentum) is widely used in modern self-supervised learning (SSL) approaches, such as MoCo, for enhancing performance.
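
As a sketch of the momentum update the abstract refers to: in MoCo-style SSL the key encoder is an exponential moving average of the query encoder (a generic illustration, not this paper's code; the value m=0.999 is only the commonly used default):

```python
import copy
import torch
import torch.nn as nn

encoder_q = nn.Sequential(nn.Linear(32, 32), nn.ReLU(), nn.Linear(32, 16))
encoder_k = copy.deepcopy(encoder_q)          # momentum (key) encoder starts as a copy
for p in encoder_k.parameters():
    p.requires_grad = False                   # updated only by EMA, never by gradients

@torch.no_grad()
def momentum_update(q, k, m=0.999):
    """EMA: k <- m * k + (1 - m) * q, applied after every optimizer step."""
    for pq, pk in zip(q.parameters(), k.parameters()):
        pk.mul_(m).add_(pq, alpha=1 - m)

momentum_update(encoder_q, encoder_k)
```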

Representation Learning Self-Supervised Learning

A Survey on Masked Autoencoder for Self-supervised Learning in Vision and Beyond

no code implementations • 30 Jul 2022 • Chaoning Zhang, Chenshuang Zhang, Junha Song, John Seon Keun Yi, Kang Zhang, In So Kweon

Masked autoencoders are scalable vision learners, as the title of MAE \cite{he2022masked} states, which suggests that self-supervised learning (SSL) in vision might follow a trajectory similar to the one in NLP.
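
A minimal sketch of the random patch masking at the heart of MAE-style pre-training; the 75% mask ratio and 14x14 patch grid below are the commonly reported defaults, and the code is illustrative rather than taken from any surveyed implementation:

```python
import torch

def random_masking(patches, mask_ratio=0.75):
    """Keep a random subset of patch tokens; the encoder sees only the visible ones."""
    B, N, D = patches.shape
    num_keep = int(N * (1 - mask_ratio))
    noise = torch.rand(B, N)                          # random score per patch
    ids_shuffle = noise.argsort(dim=1)                # low score = keep
    ids_keep = ids_shuffle[:, :num_keep]
    visible = torch.gather(patches, 1, ids_keep.unsqueeze(-1).expand(-1, -1, D))
    mask = torch.ones(B, N)
    mask.scatter_(1, ids_keep, 0)                     # 0 = visible, 1 = masked (to reconstruct)
    return visible, mask

tokens = torch.randn(2, 196, 768)                     # e.g. 14x14 patches from a 224x224 image
visible, mask = random_masking(tokens)                # decoder later reconstructs the masked 75%
```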

Contrastive Learning Denoising +1

Decoupled Adversarial Contrastive Learning for Self-supervised Adversarial Robustness

2 code implementations • 22 Jul 2022 • Chaoning Zhang, Kang Zhang, Chenshuang Zhang, Axi Niu, Jiu Feng, Chang D. Yoo, In So Kweon

Adversarial training (AT) for robust representation learning and self-supervised learning (SSL) for unsupervised representation learning are two active research fields.
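
As a rough illustration of how the two fields combine, one common recipe perturbs an augmented view to maximize the contrastive (InfoNCE) loss and then trains on the perturbed view; the sketch below follows that generic recipe, not the paper's decoupled formulation, and the toy encoder and hyperparameters are assumptions:

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, tau=0.2):
    """Standard InfoNCE between two batches of L2-normalized embeddings."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.shape[0])                # positives sit on the diagonal
    return F.cross_entropy(logits, labels)

def adversarial_view(encoder, x1, x2, eps=8 / 255, alpha=2 / 255, steps=3):
    """PGD on one view so that it maximizes the contrastive loss w.r.t. the other view."""
    x_adv = x1.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = info_nce(encoder(x_adv), encoder(x2).detach())
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = x1 + (x_adv - x1).clamp(-eps, eps)    # project back into the eps-ball
        x_adv = x_adv.clamp(0, 1)
    return x_adv.detach()

encoder = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 128))
x1, x2 = torch.rand(8, 3, 32, 32), torch.rand(8, 3, 32, 32)   # two augmented views
x_adv = adversarial_view(encoder, x1, x2)             # then train the encoder on (x_adv, x2)
```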

Adversarial Robustness Contrastive Learning +3

Understanding and Improving Group Normalization

1 code implementation • 5 Jul 2022 • Agus Gunawan, Xu Yin, Kang Zhang

Various normalization layers have been proposed to help the training of neural networks.
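
For reference, a NumPy sketch of what a standard group-normalization layer computes (the baseline the paper studies, not its proposed improvement; the group count and shapes are arbitrary):

```python
import numpy as np

def group_norm(x, num_groups=4, eps=1e-5, gamma=1.0, beta=0.0):
    """x: (N, C, H, W). Normalize over each group of C // num_groups channels."""
    n, c, h, w = x.shape
    x = x.reshape(n, num_groups, c // num_groups, h, w)
    mean = x.mean(axis=(2, 3, 4), keepdims=True)
    var = x.var(axis=(2, 3, 4), keepdims=True)
    x = (x - mean) / np.sqrt(var + eps)
    return gamma * x.reshape(n, c, h, w) + beta

x = np.random.randn(2, 8, 16, 16).astype(np.float32)
y = group_norm(x)                      # per-sample, per-group zero mean / unit variance
```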

Image Classification

Dual Temperature Helps Contrastive Learning Without Many Negative Samples: Towards Understanding and Simplifying MoCo

2 code implementations • CVPR 2022 • Chaoning Zhang, Kang Zhang, Trung X. Pham, Axi Niu, Zhinan Qiao, Chang D. Yoo, In So Kweon

Contrastive learning (CL) is widely known to require many negative samples, 65536 in MoCo for instance, for which the performance of a dictionary-free framework is often inferior because the negative sample size (NSS) is limited by its mini-batch size (MBS).
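
To ground the numbers above: MoCo decouples the negative sample size from the mini-batch size by keeping a large queue of keys (e.g. 65536). A minimal sketch of the resulting loss, using a single temperature rather than the paper's dual-temperature variant (the queue contents and dimensions below are dummy values):

```python
import torch
import torch.nn.functional as F

queue = F.normalize(torch.randn(65536, 128), dim=1)    # dictionary of negative keys (MoCo-style)

def moco_loss(q, k_pos, queue, tau=0.2):
    """q, k_pos: (B, 128) normalized embeddings; the positive logit sits at index 0."""
    l_pos = (q * k_pos).sum(dim=1, keepdim=True)        # (B, 1)
    l_neg = q @ queue.t()                                # (B, 65536) negatives from the queue
    logits = torch.cat([l_pos, l_neg], dim=1) / tau
    labels = torch.zeros(q.shape[0], dtype=torch.long)  # positive is class 0
    return F.cross_entropy(logits, labels)

q = F.normalize(torch.randn(32, 128), dim=1)
k = F.normalize(torch.randn(32, 128), dim=1)
loss = moco_loss(q, k, queue)
```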

Contrastive Learning

Investigating Top-$k$ White-Box and Transferable Black-box Attack

no code implementations • 30 Mar 2022 • Chaoning Zhang, Philipp Benz, Adil Karjauv, Jae Won Cho, Kang Zhang, In So Kweon

It is widely reported that stronger I-FGSM transfers worse than simple FGSM, leading to a popular belief that transferability is at odds with the white-box attack strength.
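
For reference, FGSM takes one signed-gradient step while I-FGSM iterates smaller steps and projects back into the eps-ball; a minimal sketch assuming a toy classifier and cross-entropy loss (the model, step sizes, and iteration count are placeholders):

```python
import torch
import torch.nn.functional as F

def fgsm(model, x, y, eps=8 / 255):
    """Single-step FGSM: one signed-gradient step of size eps."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    grad = torch.autograd.grad(loss, x)[0]
    return (x + eps * grad.sign()).clamp(0, 1).detach()

def i_fgsm(model, x, y, eps=8 / 255, alpha=2 / 255, steps=10):
    """Iterative FGSM: smaller steps, projected back into the l_inf eps-ball."""
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = x + (x_adv - x).clamp(-eps, eps)
        x_adv = x_adv.clamp(0, 1)
    return x_adv.detach()

model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
x, y = torch.rand(4, 3, 32, 32), torch.randint(0, 10, (4,))
x_adv = i_fgsm(model, x, y)
```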

Fast Adversarial Training with Noise Augmentation: A Unified Perspective on RandStart and GradAlign

no code implementations • 11 Feb 2022 • Axi Niu, Kang Zhang, Chaoning Zhang, Chenshuang Zhang, In So Kweon, Chang D. Yoo, Yanning Zhang

The former (RandStart) works only for a relatively small perturbation of 8/255 under the l_\infty constraint, while GradAlign improves on it by extending the perturbation size to 16/255 (also under the l_\infty constraint), but at the cost of being 3 to 4 times slower.
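
As a sketch of the RandStart idea referenced above: single-step FGSM adversarial training whose perturbation starts from uniform noise inside the eps-ball (generic FGSM-RS, not the method proposed in this paper; the toy model and step size are assumptions):

```python
import torch
import torch.nn.functional as F

def fgsm_random_start(model, x, y, eps=8 / 255, alpha=10 / 255):
    """FGSM with a random start: noise-augmented initialization, then one signed step."""
    delta = torch.empty_like(x).uniform_(-eps, eps)    # random point inside the eps-ball
    delta.requires_grad_(True)
    loss = F.cross_entropy(model((x + delta).clamp(0, 1)), y)
    grad = torch.autograd.grad(loss, delta)[0]
    delta = (delta + alpha * grad.sign()).clamp(-eps, eps).detach()
    return (x + delta).clamp(0, 1)

model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
x, y = torch.rand(4, 3, 32, 32), torch.randint(0, 10, (4,))
x_adv = fgsm_random_start(model, x, y)                 # the model is then trained on x_adv
```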

Data Augmentation

Investigating Top-k White-Box and Transferable Black-Box Attack

no code implementations • CVPR 2022 • Chaoning Zhang, Philipp Benz, Adil Karjauv, Jae Won Cho, Kang Zhang, In So Kweon

It is widely reported that stronger I-FGSM transfers worse than simple FGSM, leading to a popular belief that transferability is at odds with the white-box attack strength.

Early Stop And Adversarial Training Yield Better Surrogate Model: Very Non-Robust Features Harm Adversarial Transferability

no code implementations • 29 Sep 2021 • Chaoning Zhang, Gyusang Cho, Philipp Benz, Kang Zhang, Chenshuang Zhang, Chan-Hyun Youn, In So Kweon

The transferability of adversarial examples (AE), known as adversarial transferability, has attracted significant attention because it can be exploited for Transferable Black-box Attacks (TBA).

Towards Personalized and Semantic Retrieval: An End-to-End Solution for E-commerce Search via Embedding Learning

no code implementations • 3 Jun 2020 • Han Zhang, Songlin Wang, Kang Zhang, Zhiling Tang, Yunjiang Jiang, Yun Xiao, Weipeng Yan, Wen-Yun Yang

Two critical challenges remain in today's e-commerce search: how to retrieve items that are semantically relevant but do not exactly match the query terms, and how to retrieve items that are more personalized to different users issuing the same search query.
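
A minimal sketch of the embedding-retrieval idea: queries and items are encoded into a shared vector space and retrieval is nearest-neighbor search over item embeddings (the dimensions and brute-force search below are assumptions, not the paper's production system):

```python
import numpy as np

rng = np.random.default_rng(0)
item_embeddings = rng.normal(size=(10000, 64)).astype(np.float32)   # precomputed item-tower outputs
item_embeddings /= np.linalg.norm(item_embeddings, axis=1, keepdims=True)

def retrieve(query_embedding, k=10):
    """Return indices of the top-k items by cosine similarity to the query embedding."""
    q = query_embedding / np.linalg.norm(query_embedding)
    scores = item_embeddings @ q                                     # cosine similarity
    return np.argsort(-scores)[:k]

query = rng.normal(size=64).astype(np.float32)        # output of the query/user tower
top_items = retrieve(query)
```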

Retrieval Semantic Retrieval

A Computational Model of Afterimages based on Simultaneous and Successive Contrasts

no code implementations • 13 Sep 2017 • Jinhui Yu, Kailin Wu, Kang Zhang, Xianjun Sam Zheng

The colors of negative afterimages differ from the old stimulating colors in the original image when the color in the new area is either neutral or chromatic.

Bayesian regression and Bitcoin

15 code implementations • 6 Oct 2014 • Devavrat Shah, Kang Zhang

In this paper, we discuss the method of Bayesian regression and its efficacy for predicting price variation of Bitcoin, a recently popularized virtual, cryptographic currency.
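
For context, a minimal sketch of Bayesian linear regression with a Gaussian prior, i.e. the textbook closed-form posterior over the weights (synthetic data; this is not the paper's Bitcoin pipeline or its specific estimator):

```python
import numpy as np

def bayesian_linear_regression(X, y, alpha=1.0, sigma2=0.1):
    """Posterior over w for y = Xw + noise, with prior w ~ N(0, alpha^-1 I)."""
    d = X.shape[1]
    precision = alpha * np.eye(d) + X.T @ X / sigma2       # posterior precision
    cov = np.linalg.inv(precision)
    mean = cov @ X.T @ y / sigma2                           # posterior mean
    return mean, cov

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([0.5, -1.0, 2.0])
y = X @ w_true + 0.1 * rng.normal(size=200)
w_mean, w_cov = bayesian_linear_regression(X, y)
y_pred = X @ w_mean                                         # predictive mean
```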

Bayesian Inference General Classification +1

Cross-Scale Cost Aggregation for Stereo Matching

1 code implementation • CVPR 2014 • Kang Zhang, Yuqiang Fang, Dongbo Min, Lifeng Sun, Shiqiang Yang, Shuicheng Yan, Qi Tian

We first reformulate cost aggregation from a unified optimization perspective and show that different cost aggregation methods essentially differ in their choices of similarity kernels.
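
To illustrate what cost aggregation means here: given a per-pixel matching cost volume, each disparity slice is smoothed with a similarity (filter) kernel before a winner-takes-all decision. The box-filter sketch below is one such kernel, shown only to fix ideas, not the paper's cross-scale method:

```python
import numpy as np

def box_aggregate(cost_volume, radius=3):
    """cost_volume: (D, H, W) matching costs. Average each disparity slice over a window."""
    d, h, w = cost_volume.shape
    out = np.zeros_like(cost_volume)
    k = 2 * radius + 1
    padded = np.pad(cost_volume, ((0, 0), (radius, radius), (radius, radius)), mode="edge")
    for dy in range(k):
        for dx in range(k):
            out += padded[:, dy:dy + h, dx:dx + w]
    return out / (k * k)

cost = np.random.rand(16, 48, 64)          # 16 disparity hypotheses on a 48x64 image
aggregated = box_aggregate(cost)
disparity = aggregated.argmin(axis=0)      # winner-takes-all after aggregation
```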

Stereo Matching Stereo Matching Hand

Binary Stereo Matching

1 code implementation • 10 Feb 2014 • Kang Zhang, Jiyang Li, Yijing Li, Weidong Hu, Lifeng Sun, Shiqiang Yang

In this paper, we propose a novel binary-based cost computation and aggregation approach for the stereo matching problem.
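
One common instance of binary cost computation is a census-style binary descriptor matched with Hamming distance; the sketch below follows that generic recipe (window size, disparity range, and the toy images are assumptions, not the paper's exact descriptor):

```python
import numpy as np

def census_transform(img, radius=2):
    """Binary descriptor: compare each pixel to its neighbors within the window."""
    h, w = img.shape
    padded = np.pad(img, radius, mode="edge")
    bits = []
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = padded[radius + dy:radius + dy + h, radius + dx:radius + dx + w]
            bits.append((shifted < img).astype(np.uint8))
    return np.stack(bits, axis=-1)                         # (H, W, num_bits)

def hamming_cost(left, right, max_disp=16):
    """Cost volume: Hamming distance between left and disparity-shifted right descriptors."""
    cl, cr = census_transform(left), census_transform(right)
    h, w, num_bits = cl.shape
    cost = np.full((max_disp, h, w), num_bits, dtype=np.float32)
    for d in range(max_disp):
        cost[d, :, d:] = (cl[:, d:] != cr[:, :w - d]).sum(axis=-1)
    return cost

left = np.random.rand(48, 64)
right = np.roll(left, 4, axis=1)                           # toy right image shifted by 4 px
disparity = hamming_cost(left, right).argmin(axis=0)       # winner-takes-all disparity map
```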

Stereo Matching Stereo Matching Hand
