Search Results for author: Linguo Li

Found 5 papers, 2 papers with code

Representation-Agnostic Shape Fields

1 code implementation • ICLR 2022 • Xiaoyang Huang, Jiancheng Yang, Yanjun Wang, Ziyu Chen, Linguo Li, Teng Li, Bingbing Ni, Wenjun Zhang

In this study, we present Representation-Agnostic Shape Fields (RASF), a generalizable and computation-efficient shape embedding module for 3D deep learning.

3D Human Action Representation Learning via Cross-View Consistency Pursuit

1 code implementation • CVPR 2021 • Linguo Li, Minsi Wang, Bingbing Ni, Hang Wang, Jiancheng Yang, Wenjun Zhang

In this work, we propose a Cross-view Contrastive Learning framework for unsupervised 3D skeleton-based action Representation (CrosSCLR) by leveraging multi-view complementary supervision signals.

Action Recognition • Contrastive Learning • +1
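
To make the cross-view contrastive idea behind CrosSCLR more concrete, here is a minimal sketch that pairs embeddings of the same skeleton sequence from two views (e.g. a joint stream and a motion stream) with an InfoNCE-style loss. The encoder inputs, projection dimension, temperature, and the use of in-batch negatives are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.07):
    """InfoNCE-style loss between two views of the same batch.

    z1, z2: (batch, dim) embeddings of the same skeleton sequences
    encoded from two views (e.g. joint vs. motion streams).
    Matching rows are positives; all other rows serve as negatives.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature           # (batch, batch) similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)

# toy usage: random embeddings stand in for the outputs of two view encoders
z_joint, z_motion = torch.randn(8, 128), torch.randn(8, 128)
loss = info_nce(z_joint, z_motion)
```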

Probabilistic Radiomics: Ambiguous Diagnosis with Controllable Shape Analysis

no code implementations • 20 Oct 2019 • Jiancheng Yang, Rongyao Fang, Bingbing Ni, Yamin Li, Yi Xu, Linguo Li

The final diagnosis is obtained by combining the ambiguity prior sample and the lesion representation, and the whole network, named $DenseSharp^{+}$, is end-to-end trainable.

Probabilistic Deep Learning
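
The snippet above suggests a diagnosis head that fuses a latent sample drawn from an ambiguity prior with a lesion feature vector. Below is one plausible reading of such a fusion in PyTorch; the standard-normal prior, the concatenation, and the layer sizes are assumptions for illustration, not the actual $DenseSharp^{+}$ architecture.

```python
import torch
import torch.nn as nn

class AmbiguityAwareHead(nn.Module):
    """Illustrative head: combine a sampled ambiguity code with lesion features."""

    def __init__(self, feat_dim=256, latent_dim=16, num_classes=2):
        super().__init__()
        self.latent_dim = latent_dim
        self.classifier = nn.Sequential(
            nn.Linear(feat_dim + latent_dim, 128),
            nn.ReLU(inplace=True),
            nn.Linear(128, num_classes),
        )

    def forward(self, lesion_feat):
        # assumed: draw the ambiguity sample from a standard normal prior
        z = torch.randn(lesion_feat.size(0), self.latent_dim, device=lesion_feat.device)
        return self.classifier(torch.cat([lesion_feat, z], dim=1))

head = AmbiguityAwareHead()
logits = head(torch.randn(4, 256))  # features from a 3D CNN backbone (assumed)
```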

Evaluating and Boosting Uncertainty Quantification in Classification

no code implementations • 13 Sep 2019 • Xiaoyang Huang, Jiancheng Yang, Linguo Li, Haoran Deng, Bingbing Ni, Yi Xu

The emergence of artificial intelligence techniques in biomedical applications urges researchers to pay more attention to uncertainty quantification (UQ) in machine-assisted medical decision making.

Classification • Decision Making • +2

Modeling Point Clouds with Self-Attention and Gumbel Subset Sampling

no code implementations • CVPR 2019 • Jiancheng Yang, Qiang Zhang, Bingbing Ni, Linguo Li, Jinxian Liu, Mengdie Zhou, Qi Tian

We thereby propose, for the first time, an end-to-end learnable and task-agnostic sampling operation, named Gumbel Subset Sampling (GSS), to select a representative subset of input points.
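
To make the idea of a learnable, task-agnostic point-sampling operation more concrete, here is a minimal Gumbel-softmax sketch in PyTorch. The per-point scoring logits, the straight-through hard selection, and the repeated independent draws (which can pick the same point more than once) are simplifying assumptions; this is not the exact GSS operator from the paper.

```python
import torch
import torch.nn.functional as F

def gumbel_subset(points, scores, k, tau=1.0, hard=True):
    """Differentiable selection of k points via repeated Gumbel-softmax draws.

    points: (batch, n, 3) point cloud
    scores: (batch, n) learnable per-point logits (e.g. from a small scoring MLP)
    Returns (batch, k, 3) softly (or straight-through hard) selected points.
    """
    picks = []
    for _ in range(k):
        # one-hot-like weights over the n points; gradients flow through scores
        w = F.gumbel_softmax(scores, tau=tau, hard=hard, dim=-1)   # (batch, n)
        picks.append(torch.einsum('bn,bnd->bd', w, points))
    return torch.stack(picks, dim=1)

pts = torch.randn(2, 1024, 3)
logits = torch.randn(2, 1024, requires_grad=True)  # would come from a scoring network
subset = gumbel_subset(pts, logits, k=64)
```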
