Search Results for author: Xinjiang Wang

Found 17 papers, 11 papers with code

What Are Expected Queries in End-to-End Object Detection?

1 code implementation · 2 Jun 2022 · Shilong Zhang, Xinjiang Wang, Jiaqi Wang, Jiangmiao Pang, Kai Chen

As both sparse and dense queries are imperfect, what, then, are the expected queries in end-to-end object detection?

Instance Segmentation · Object Detection +2

Group R-CNN for Weakly Semi-supervised Object Detection with Points

1 code implementation · CVPR 2022 · Shilong Zhang, Zhuoran Yu, Liyang Liu, Xinjiang Wang, Aojun Zhou, Kai Chen

The core of this task is to train a point-to-box regressor on well-labeled images that can be used to predict credible bounding boxes for each point annotation.

Object Detection · Representation Learning +1

Temporal RoI Align for Video Object Recognition

1 code implementation · 8 Sep 2021 · Tao Gong, Kai Chen, Xinjiang Wang, Qi Chu, Feng Zhu, Dahua Lin, Nenghai Yu, Huamin Feng

In this work, observing that features of the same object instance are highly similar across frames in a video, a novel Temporal RoI Align operator is proposed to extract features from other frames' feature maps for the current frame's proposals by exploiting feature similarity.

Instance Segmentation · Object Detection +4
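The idea in the snippet above (borrowing features for a current-frame proposal from similar features in other frames) can be loosely sketched in pure Python. The `top_k` selection and softmax weighting below are illustrative choices, not the paper's exact operator:

```python
import math

def cosine(a, b):
    """Cosine similarity between two non-zero feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def temporal_aggregate(proposal_feat, support_feats, top_k=2):
    """Aggregate the support-frame features most similar to the current
    proposal feature, weighted by softmax-normalized similarities."""
    scored = sorted(support_feats,
                    key=lambda f: cosine(proposal_feat, f),
                    reverse=True)[:top_k]
    sims = [cosine(proposal_feat, f) for f in scored]
    z = sum(math.exp(s) for s in sims)
    weights = [math.exp(s) / z for s in sims]
    dim = len(proposal_feat)
    return [sum(w * f[i] for w, f in zip(weights, scored))
            for i in range(dim)]
```

In the real operator the "support features" come from RoI-aligned feature maps of neighboring frames; here they are plain vectors to keep the sketch self-contained.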

WSSOD: A New Pipeline for Weakly- and Semi-Supervised Object Detection

no code implementations · 21 May 2021 · Shijie Fang, Yuhang Cao, Xinjiang Wang, Kai Chen, Dahua Lin, Wayne Zhang

The performance of object detection, to a great extent, depends on the availability of large annotated datasets.

Object Detection +2

Rethinking the Pruning Criteria for Convolutional Neural Network

no code implementations · NeurIPS 2021 · Zhongzhan Huang, Xinjiang Wang, Ping Luo

Channel pruning is a popular technique for compressing convolutional neural networks (CNNs), and various pruning criteria have been proposed to remove the redundant filters of CNNs.
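One common baseline among such criteria is ranking filters by the L1 norm of their weights and discarding the smallest; the sketch below illustrates that baseline only (the function name and `keep_ratio` parameter are hypothetical, and this is not necessarily one of the criteria the paper studies):

```python
def l1_prune(filters, keep_ratio=0.5):
    """Rank convolutional filters by the L1 norm of their weights and
    keep only the highest-scoring fraction; small-norm filters are
    treated as redundant.  Each filter is a flat list of weights."""
    scores = [(sum(abs(w) for w in f), i) for i, f in enumerate(filters)]
    scores.sort(reverse=True)
    n_keep = max(1, int(len(filters) * keep_ratio))
    return sorted(i for _, i in scores[:n_keep])
```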

Understanding the wiring evolution in differentiable neural architecture search

1 code implementation · 2 Sep 2020 · Sirui Xie, Shoukang Hu, Xinjiang Wang, Chunxiao Liu, Jianping Shi, Xunying Liu, Dahua Lin

To this end, we pose questions that future differentiable methods for neural wiring discovery need to confront, hoping to evoke a discussion and rethinking on how much bias has been enforced implicitly in existing NAS methods.

Neural Architecture Search

Scale-Equalizing Pyramid Convolution for Object Detection

2 code implementations · CVPR 2020 · Xinjiang Wang, Shilong Zhang, Zhuoran Yu, Litong Feng, Wayne Zhang

Inspired by this, a convolution across pyramid levels, termed pyramid convolution, is proposed in this study; it is a modified 3-D convolution.

Object Detection
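A minimal way to picture a convolution along the pyramid axis is to reduce each level to a single scalar response and slide a size-3 kernel across levels, zero-padding at the ends. The real pyramid convolution also convolves spatially within each level and accounts for the scale difference between levels; the kernel weights here are made up for illustration:

```python
def pyramid_conv(levels, kernel=(0.25, 0.5, 0.25)):
    """Toy 'pyramid convolution': a kernel-size-3 convolution along the
    pyramid-level axis, zero-padded at the lowest and highest levels.
    `levels` holds one scalar response per pyramid level."""
    w_lo, w_mid, w_hi = kernel
    out = []
    for i, x in enumerate(levels):
        lo = levels[i - 1] if i > 0 else 0.0
        hi = levels[i + 1] if i + 1 < len(levels) else 0.0
        out.append(w_lo * lo + w_mid * x + w_hi * hi)
    return out
```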

Convolution-Weight-Distribution Assumption: Rethinking the Criteria of Channel Pruning

no code implementations · 24 Apr 2020 · Zhongzhan Huang, Wenqi Shao, Xinjiang Wang, Liang Lin, Ping Luo

Channel pruning is a popular technique for compressing convolutional neural networks (CNNs), where various pruning criteria have been proposed to remove the redundant filters.

AdaX: Adaptive Gradient Descent with Exponential Long Term Memory

1 code implementation · 21 Apr 2020 · Wenjie Li, Zhaoyang Zhang, Xinjiang Wang, Ping Luo

Although adaptive optimization algorithms such as Adam converge quickly on many machine learning tasks, this paper identifies a problem with Adam by analyzing its performance on a simple non-convex synthetic problem, showing that Adam's fast convergence can drive the algorithm into poor local minima.

How Does BN Increase Collapsed Neural Network Filters?

no code implementations · 30 Jan 2020 · Sheng Zhou, Xinjiang Wang, Ping Luo, Litong Feng, Wenjie Li, Wei Zhang

This phenomenon is caused by the normalization effect of BN, which induces a non-trainable region in the parameter space and reduces the network capacity as a result.

Object Detection

Kalman Normalization: Normalizing Internal Representations Across Network Layers

no code implementations · NeurIPS 2018 · Guangrun Wang, Jiefeng Peng, Ping Luo, Xinjiang Wang, Liang Lin

In this paper, we present a novel normalization method, called Kalman Normalization (KN), for improving and accelerating the training of DNNs, particularly under the context of micro-batches.

Object Detection

Towards Understanding Regularization in Batch Normalization

1 code implementation · ICLR 2019 · Ping Luo, Xinjiang Wang, Wenqi Shao, Zhanglin Peng

Batch Normalization (BN) improves both convergence and generalization in training neural networks.
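For reference, the standard BN transform standardizes each feature with its mini-batch mean and variance and then applies a learned affine map. A minimal single-feature sketch (inference-time details such as running statistics are omitted):

```python
import math

def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    """Batch Normalization for one feature: standardize the mini-batch
    with its mean and (biased) variance, then scale by gamma and
    shift by beta."""
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta
            for x in batch]
```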

Batch Kalman Normalization: Towards Training Deep Neural Networks with Micro-Batches

no code implementations · 9 Feb 2018 · Guangrun Wang, Jiefeng Peng, Ping Luo, Xinjiang Wang, Liang Lin

As an indispensable component, Batch Normalization (BN) has successfully improved the training of deep neural networks (DNNs) with mini-batches, by normalizing the distribution of the internal representation for each hidden layer.

Image Classification
