On Dropping Clusters to Regularize Graph Convolutional Neural Networks

ECCV 2020 · Xikun Zhang, Chang Xu, Dacheng Tao

Dropout has been widely adopted to regularize graph convolutional networks (GCNs) by randomly zeroing entries of the node feature vectors, and it achieves promising performance on various tasks. However, the information of an individually zeroed entry can still be present in other, correlated entries, because information propagates (1) spatially, between entries of different node feature vectors, and (2) depth-wise, between different entries of the same node feature vector; this propagation essentially weakens the effectiveness of dropout. The correlation arises because in a GCN, neighboring node feature vectors are linearly transformed and then aggregated to produce the node feature vectors of the subsequent layer. To regularize GCNs effectively, we devise DropCluster, which first randomly zeros some seed entries and then zeros the entries that are spatially or depth-wise correlated with those seeds. In this way, the information of the seed entries is removed thoroughly and cannot flow to subsequent layers via correlated entries. We validate the effectiveness of DropCluster by comprehensively comparing it with dropout and its representative variants, such as SpatialDropout, Gaussian dropout, and DropEdge, on skeleton-based action recognition.
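To make the clustered-dropping idea concrete, below is a minimal, hypothetical PyTorch sketch. The function name `drop_cluster`, the seed rate `p_seed`, and the fixed grouping of consecutive channels used as a stand-in depth-wise correlation rule are all assumptions for illustration; the paper defines its own correlation structure, and this is not the authors' implementation.

```python
import torch

def drop_cluster(x, adj, p_seed=0.1, channel_group=4, training=True):
    """Hypothetical sketch of cluster-style dropout on node features.

    x:   (N, C) node feature matrix
    adj: (N, N) binary adjacency matrix (1 where nodes are neighbors)
    """
    if not training or p_seed == 0.0:
        return x
    N, C = x.shape
    assert C % channel_group == 0, "C must be divisible by channel_group"

    # 1) Randomly select seed entries, as in standard dropout.
    seeds = torch.rand(N, C, device=x.device) < p_seed

    # 2) Spatial expansion: a seed entry (i, c) also zeros channel c of
    #    node i's neighbors, since aggregation would otherwise propagate
    #    its information to the next layer.
    spatial = (adj.float() @ seeds.float()) > 0
    mask = seeds | spatial

    # 3) Depth-wise expansion: zero all channels in the same (assumed)
    #    correlated group of `channel_group` consecutive channels.
    grouped = mask.view(N, C // channel_group, channel_group).any(dim=2, keepdim=True)
    mask = grouped.expand(N, C // channel_group, channel_group).reshape(N, C)

    # Zero the clustered entries and rescale survivors by their empirical
    # keep rate, in the spirit of inverted dropout.
    keep = (~mask).float()
    return x * keep / keep.mean().clamp(min=1e-6)

# Usage on toy data:
x = torch.randn(5, 8)
adj = (torch.rand(5, 5) < 0.4).float()
out = drop_cluster(x, adj, p_seed=0.2, channel_group=4)
```

Here the spatial expansion follows the graph adjacency, mirroring the aggregation path through which a zeroed entry's information would otherwise leak, and the rescaling keeps the expected feature magnitude roughly unchanged between training and inference.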
