Search Results for author: Ling Zhou

Found 11 papers, 2 papers with code

Region attention and graph embedding network for occlusion objective class-based micro-expression recognition

no code implementations 13 Jul 2021 Qirong Mao, Ling Zhou, Wenming Zheng, Xiuyan Shao, Xiaohua Huang

More specifically, the backbone network extracts feature representations from different facial regions; the RI module computes an adaptive weight for each region via an attention mechanism, reflecting the region's unobstructedness and importance so as to suppress the influence of occlusion; and the RR module exploits the progressive interactions among these regions by performing graph convolutions.

Graph Embedding · Micro-Expression Recognition
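
A minimal PyTorch sketch of the two ideas named in the abstract above, per-region attention weights (the RI module) and a graph-convolution step over region features (the RR module). The shapes, module names, and the fixed uniform adjacency are illustrative assumptions, not the authors' implementation:

import torch
import torch.nn as nn

class RegionAttentionGraph(nn.Module):
    # Illustrative sketch: per-region attention weights (the RI idea) followed by
    # one graph-convolution step over a fixed region adjacency (the RR idea).
    def __init__(self, num_regions, feat_dim):
        super().__init__()
        self.attn = nn.Linear(feat_dim, 1)                 # score each region from its own feature
        self.gcn = nn.Linear(feat_dim, feat_dim)           # shared transform for graph convolution
        # Assumed fixed, row-normalized adjacency over facial regions.
        self.register_buffer("adj", torch.full((num_regions, num_regions), 1.0 / num_regions))

    def forward(self, region_feats):                       # (batch, num_regions, feat_dim)
        w = torch.sigmoid(self.attn(region_feats))         # adaptive weight per region, in [0, 1]
        weighted = w * region_feats                        # suppress occluded / unimportant regions
        return torch.relu(self.adj @ self.gcn(weighted))   # propagate information among regions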

Feature refinement: An expression-specific feature learning and fusion method for micro-expression recognition

no code implementations 13 Jan 2021 Ling Zhou, Qirong Mao, Xiaohua Huang, Feifei Zhang, Zhihong Zhang

It aims to obtain salient and discriminative features for specific expressions and to predict the expression by fusing these expression-specific features.

Micro-Expression Recognition · Optical Flow Estimation
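
A hedged illustration of learning expression-specific features and fusing them for prediction. The per-class branches, the soft-weight fusion, and all dimensions are assumptions for the sketch, not the paper's architecture:

import torch
import torch.nn as nn

class ExpressionSpecificFusion(nn.Module):
    # Illustrative sketch: one small branch per expression class learns
    # expression-specific features; a learned soft weighting fuses them.
    def __init__(self, in_dim, num_classes, branch_dim=64):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Sequential(nn.Linear(in_dim, branch_dim), nn.ReLU()) for _ in range(num_classes)]
        )
        self.fusion_weights = nn.Linear(in_dim, num_classes)   # soft weights over branches
        self.classifier = nn.Linear(branch_dim, num_classes)

    def forward(self, x):                                       # x: (batch, in_dim)
        feats = torch.stack([b(x) for b in self.branches], dim=1)        # (batch, C, branch_dim)
        w = torch.softmax(self.fusion_weights(x), dim=-1).unsqueeze(-1)  # (batch, C, 1)
        fused = (w * feats).sum(dim=1)                           # weighted fusion of branch features
        return self.classifier(fused)                            # expression logits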

Objective Class-based Micro-Expression Recognition through Simultaneous Action Unit Detection and Feature Aggregation

no code implementations 24 Dec 2020 Ling Zhou, Qirong Mao, Ming Dong

Specifically, we propose two new strategies in our AU detection module for more effective AU feature learning: an attention mechanism and a balanced detection loss function.

Action Unit Detection · Micro-Expression Recognition
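
A hedged sketch of what a balanced detection loss for multi-label AU detection can look like: positives of rare AUs are up-weighted through a standard weighted binary cross-entropy. This is a generic technique chosen for illustration, not necessarily the paper's exact loss:

import torch
import torch.nn.functional as F

def balanced_au_detection_loss(logits, targets, au_pos_rate):
    # logits, targets: (batch, num_aus) with float 0/1 targets;
    # au_pos_rate: (num_aus,) fraction of positive labels per AU in the training set.
    # Rarer AUs receive a larger positive weight so frequent AUs do not dominate training.
    pos_weight = (1.0 - au_pos_rate) / au_pos_rate.clamp(min=1e-6)
    return F.binary_cross_entropy_with_logits(logits, targets, pos_weight=pos_weight)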

Pattern-Structure Diffusion for Multi-Task Learning

no code implementations CVPR 2020 Ling Zhou, Zhen Cui, Chunyan Xu, Zhenyu Zhang, Chaoqun Wang, Tong Zhang, Jian Yang

Inspired by the observation that pattern structures frequently recur both within a task and across tasks, we propose a pattern-structure diffusion (PSD) framework to mine and propagate task-specific and cross-task pattern structures in the task-level space for joint depth estimation, segmentation, and surface normal prediction.

Depth Estimation · Multi-Task Learning
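
A generic sketch of the cross-task intuition: pixel-affinity structure mined from one task's feature map is used to diffuse another task's features. The tensor shapes, the softmax affinity, and the temperature are illustrative assumptions, not the PSD framework itself:

import torch
import torch.nn.functional as F

def cross_task_structure_diffusion(src_feat, dst_feat, temperature=0.1):
    # src_feat, dst_feat: (batch, channels, H, W) feature maps from two tasks.
    b, c, h, w = src_feat.shape
    src = F.normalize(src_feat.flatten(2), dim=1)                    # unit-norm feature per pixel
    affinity = torch.softmax(src.transpose(1, 2) @ src / temperature, dim=-1)  # (b, hw, hw) structure
    dst = dst_feat.flatten(2)                                        # (b, c, hw)
    diffused = dst @ affinity.transpose(1, 2)                        # propagate dst along src structure
    return diffused.view(b, -1, h, w)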

Persistent Homotopy Groups of Metric Spaces

1 code implementation 28 Dec 2019 Facundo Mémoli, Ling Zhou

We study notions of persistent homotopy groups of compact metric spaces together with their stability properties in the Gromov-Hausdorff sense.

Algebraic Topology · Computational Geometry
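
Stability results of this kind are usually phrased as a Lipschitz-type bound of an interleaving-type distance between persistent invariants in terms of the Gromov-Hausdorff distance between the underlying spaces. The display below only illustrates the shape of such a statement; the precise invariant, hypotheses, and constant C are those established in the paper:

% Schematic Gromov--Hausdorff stability statement for a persistent invariant PI(X)
% (e.g., persistent homotopy groups of a filtration built on X); illustrative only.
\[
  d_{\mathrm{I}}\bigl(\mathrm{PI}(X),\, \mathrm{PI}(Y)\bigr)
  \;\le\; C \cdot d_{\mathrm{GH}}(X, Y),
  \qquad X, Y \text{ compact metric spaces.}
\]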

Method of Contraction-Expansion (MOCE) for Simultaneous Inference in Linear Models

no code implementations 4 Aug 2019 Fei Wang, Ling Zhou, Lu Tang, Peter X. -K. Song

To establish simultaneous post-model-selection inference, we propose a method of contraction and expansion (MOCE) along the lines of debiasing estimation, which enables us to balance the bias-variance trade-off so that the super-sparsity assumption may be relaxed.

Model Selection
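
For context, a minimal sketch of debiasing estimation (a debiased-lasso-style one-step bias correction), the line of work MOCE builds on. The ridge-regularized inverse used for Theta below is a stand-in assumption, and this is not the MOCE procedure itself:

import numpy as np
from sklearn.linear_model import Lasso

def debiased_lasso(X, y, alpha=0.1):
    # Returns a debiased coefficient estimate suitable for Wald-type inference.
    n, p = X.shape
    beta_hat = Lasso(alpha=alpha, fit_intercept=False).fit(X, y).coef_
    # Theta approximates the inverse of the sample covariance X^T X / n;
    # a ridge-regularized inverse stands in for the usual node-wise construction.
    sigma_hat = X.T @ X / n
    theta_hat = np.linalg.inv(sigma_hat + 0.05 * np.eye(p))
    # One-step bias correction of the shrunken lasso estimate.
    return beta_hat + theta_hat @ X.T @ (y - X @ beta_hat) / n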

Understanding over-parameterized deep networks by geometrization

no code implementations 11 Feb 2019 Xiao Dong, Ling Zhou

This can be regarded as strong support for our proposal that geometrization is not only the bible of physics but also the key idea for understanding deep learning systems.

Geometrization of deep networks for the interpretability of deep learning systems

no code implementations 6 Jan 2019 Xiao Dong, Ling Zhou

By comparing the geometry of image matching with that of deep networks, we show that geometrization of deep networks can be used to understand existing deep learning systems, and that it may also help to solve their interpretability problem.

Reducing Parameter Space for Neural Network Training

1 code implementation 22 May 2018 Tong Qin, Ling Zhou, Dongbin Xiu

For neural networks (NNs) with rectified linear unit (ReLU) or binary activation functions, we show that their training can be accomplished in a reduced parameter space.
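A small demonstration of the positive homogeneity of ReLU that underlies reduced-parameter-space ideas: scaling a hidden unit's incoming weights by c > 0 and its outgoing weights by 1/c leaves the network function unchanged, so incoming weights can be normalized without loss of expressivity. This is a generic illustration, not necessarily the paper's specific reduction:

import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(16, 8)), rng.normal(size=(3, 16))   # one hidden layer, no biases
x = rng.normal(size=8)

# Rescale each hidden unit's incoming weights to unit norm and compensate
# in the outgoing weights; ReLU(c*z) = c*ReLU(z) for c > 0 keeps the output identical.
c = np.linalg.norm(W1, axis=1, keepdims=True)                 # per-unit scale
W1_reduced, W2_reduced = W1 / c, W2 * c.T                     # reduced (normalized) parameters

original = W2 @ relu(W1 @ x)
reduced = W2_reduced @ relu(W1_reduced @ x)
assert np.allclose(original, reduced)                         # same network function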

Demystifying AlphaGo Zero as AlphaGo GAN

no code implementations 24 Nov 2017 Xiao Dong, Jiasong Wu, Ling Zhou

The astonishing success of AlphaGo Zero (Silver et al.) has sparked a worldwide discussion of the future of human society, with a mixed mood of hope, anxiety, excitement and fear.

How deep learning works -- The geometry of deep learning

no code implementations 30 Oct 2017 Xiao Dong, Jiasong Wu, Ling Zhou

Why and how deep learning works well on different tasks remains a mystery from a theoretical perspective.

Template Matching
