SPU-PMD: Self-Supervised Point Cloud Upsampling via Progressive Mesh Deformation

CVPR 2024  ·  Yanzhe Liu, Rong Chen, Yushi Li, Yixi Li, Xuehou Tan

Despite the success of recent upsampling approaches, generating high-resolution point sets with uniform distribution and meticulous structures is still challenging. Unlike existing methods that only take the spatial information of the raw data into account, we regard point cloud upsampling as generating dense point clouds from a deformable topology. Motivated by this, we present SPU-PMD, a self-supervised topological mesh deformation network for 3D densification. As a cascaded framework, our architecture is formulated as a series of coarse mesh interpolators and mesh deformers. At each stage, the mesh interpolator first produces an initial dense point cloud via mesh interpolation, which allows the model to better perceive the primitive topology. Meanwhile, the deformer infers the morphing by estimating the movements of mesh nodes and reconstructs the descriptive topology structure. By associating mesh deformation with feature expansion, this module progressively refines the point cloud's surface uniformity and structural details. To demonstrate the effectiveness of the proposed method, extensive quantitative and qualitative experiments are conducted on synthetic and real-scanned 3D data. We also compare it with state-of-the-art techniques to further illustrate the superiority of our network. The project page is: https://github.com/lyz21/SPU-PMD
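The cascaded interpolate-then-deform pipeline described in the abstract can be sketched in plain NumPy. This is a minimal illustration, not the paper's implementation: it assumes k-nearest-neighbor midpoint insertion as a stand-in for the coarse mesh interpolator, and a caller-supplied `offset_fn` (hypothetical) in place of the learned deformer that estimates mesh-node movements.

```python
import numpy as np

def knn_indices(points, k):
    # Pairwise distances; exclude each point's distance to itself.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]

def coarse_interpolate(points, k=4):
    """Densify by inserting midpoints of k-NN edges: a crude stand-in
    for the paper's mesh interpolation stage."""
    idx = knn_indices(points, k)
    mids = (points[:, None, :] + points[idx]) / 2.0   # (N, k, 3)
    dense = np.concatenate([points, mids.reshape(-1, 3)], axis=0)
    # Mutual neighbors produce the same midpoint twice; drop duplicates.
    return np.unique(dense.round(6), axis=0)

def deform(points, offset_fn):
    """Deformer stand-in: move each mesh node by a predicted offset.
    In SPU-PMD this offset comes from a learned network."""
    return points + offset_fn(points)

def upsample(points, stages=2, k=4,
             offset_fn=lambda p: np.zeros_like(p)):
    # Cascade: each stage interpolates a coarse dense set, then
    # deforms it toward a more descriptive surface.
    for _ in range(stages):
        points = coarse_interpolate(points, k)
        points = deform(points, offset_fn)
    return points
```

With the identity (zero-offset) deformer, each stage only densifies; plugging in a trained offset predictor is where the progressive refinement of surface uniformity would happen.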

