Search Results for author: Kailin Li

Found 6 papers, 5 papers with code

CHORD: Category-level Hand-held Object Reconstruction via Shape Deformation

no code implementations • ICCV 2023 • Kailin Li, Lixin Yang, Haoyu Zhen, Zenan Lin, Xinyu Zhan, Licheng Zhong, Jian Xu, Kejian Wu, Cewu Lu

This can be attributed to the fact that humans have mastered the shape prior of the 'mug' category and can quickly establish correspondences between different mug instances and the prior, such as where the rim and handle are located.

Object Reconstruction

Color-NeuS: Reconstructing Neural Implicit Surfaces with Color

1 code implementation • 14 Aug 2023 • Licheng Zhong, Lixin Yang, Kailin Li, Haoyu Zhen, Mei Han, Cewu Lu

A mesh is extracted from the signed distance function (SDF) network for the surface, and the color of each surface vertex is drawn from the global color network.
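
That extraction step is concrete enough to sketch. Below is a minimal, hypothetical Python/PyTorch sketch of the decoupled pipeline the sentence describes: sample an SDF on a grid, run marching cubes for the surface mesh, then query a global color network once per vertex. The `sdf_net` here is a placeholder analytic sphere and `color_net` an untrained MLP; neither is the paper's trained model.

```python
import torch
import torch.nn as nn
from skimage.measure import marching_cubes

# Stand-ins for the trained networks: an analytic sphere SDF and an
# untrained color MLP (hypothetical; Color-NeuS uses trained NeuS-style MLPs).
def sdf_net(p):  # (N, 3) points -> (N,) signed distances
    return p.norm(dim=-1) - 0.5

color_net = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 3), nn.Sigmoid())

# 1. Sample the SDF on a dense grid over [-1, 1]^3.
res = 64
axis = torch.linspace(-1.0, 1.0, res)
grid = torch.stack(torch.meshgrid(axis, axis, axis, indexing="ij"), dim=-1)
sdf_vals = sdf_net(grid.reshape(-1, 3)).reshape(res, res, res)

# 2. Extract the zero level set as a triangle mesh with marching cubes.
verts, faces, _, _ = marching_cubes(sdf_vals.numpy(), level=0.0)
verts = verts / (res - 1) * 2.0 - 1.0  # map voxel indices back to world coords

# 3. Query the global color network once per surface vertex.
with torch.no_grad():
    vert_colors = color_net(torch.from_numpy(verts).float()).numpy()  # (V, 3) RGB
```

Decoupling appearance from geometry this way means the SDF is evaluated densely only once, while color is evaluated just at the extracted surface vertices.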

DART: Articulated Hand Model with Diverse Accessories and Rich Textures

1 code implementation • 14 Oct 2022 • Daiheng Gao, Yuliang Xiu, Kailin Li, Lixin Yang, Feng Wang, Peng Zhang, Bang Zhang, Cewu Lu, Ping Tan

A Unity GUI is also provided to generate synthetic hand data with user-defined settings, e.g., pose, camera, background, lighting, textures, and accessories (a hypothetical settings sketch follows this entry).

Hand Pose Estimation • Unity
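
As a rough illustration of those user-defined settings, here is a hypothetical configuration in Python; every field name is illustrative, not the DART tool's actual schema.

```python
# Hypothetical generation settings mirroring the option categories the DART
# Unity GUI exposes (pose, camera, background, lighting, textures, accessories).
render_config = {
    "pose": "random_grasp",                        # hand articulation to sample
    "camera": {"fov_deg": 60, "distance_m": 0.6},  # viewpoint parameters
    "background": "indoor_scene_01",               # backdrop to composite
    "lighting": {"type": "directional", "intensity": 1.2},
    "textures": "skin_tone_03",                    # hand texture variant
    "accessories": ["watch", "ring"],              # attachments on the hand
    "num_images": 1000,                            # size of the synthetic batch
}
```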

OakInk: A Large-scale Knowledge Repository for Understanding Hand-Object Interaction

1 code implementation • CVPR 2022 • Lixin Yang, Kailin Li, Xinyu Zhan, Fei Wu, Anran Xu, Liu Liu, Cewu Lu

We begin by collecting 1,800 common household objects and annotating their affordances to construct the first knowledge base: Oak (a hypothetical record layout is sketched below).

Grasp Generation • Pose Estimation
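
To make the affordance annotation concrete, here is a hypothetical Python record layout for one object in a knowledge base like Oak; the actual OakInk schema may differ.

```python
from dataclasses import dataclass, field

# Hypothetical entry for an affordance-annotated household object;
# field names and values are illustrative, not OakInk's real format.
@dataclass
class ObjectEntry:
    object_id: str                  # unique identifier in the knowledge base
    category: str                   # object category, e.g. "mug"
    mesh_path: str                  # path to the object's 3D model
    affordances: list[str] = field(default_factory=list)  # annotated affordances

entry = ObjectEntry(
    object_id="oak_0001",
    category="mug",
    mesh_path="objects/oak_0001.obj",
    affordances=["hold", "contain", "pour"],
)
```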

CPF: Learning a Contact Potential Field to Model the Hand-Object Interaction

1 code implementation • ICCV 2021 • Lixin Yang, Xinyu Zhan, Kailin Li, Wenqiang Xu, Jiefeng Li, Cewu Lu

In this paper, we present an explicit contact representation, namely the Contact Potential Field (CPF), and a learning-fitting hybrid framework, namely MIHO, to Model the Interaction of Hand and Object (a toy contact-energy sketch follows below).

Pose Estimation
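
In the spirit of CPF's spring-like contact pairs, here is a toy Python/PyTorch sketch that minimizes a contact energy over a global hand translation. The vertex-to-point pairing, the stiffness, and the fitting loop are illustrative assumptions, not the paper's exact energy or MIHO's optimization scheme.

```python
import torch

# Hypothetical contact pairs: each hand vertex is attracted to one object point.
hand_verts = torch.randn(778, 3)                       # MANO hands have 778 vertices
obj_points = hand_verts + 0.05 * torch.randn(778, 3)   # assumed paired targets

offset = torch.zeros(3, requires_grad=True)  # fit only a global hand translation
opt = torch.optim.Adam([offset], lr=1e-2)

for step in range(200):
    opt.zero_grad()
    # Potential energy: each contact pair acts like a zero-rest-length spring,
    # pulling the (translated) hand vertex toward its object point.
    energy = 0.5 * ((hand_verts + offset - obj_points) ** 2).sum(dim=-1).mean()
    energy.backward()
    opt.step()

# After fitting, hand_verts + offset has settled onto the contact targets.
```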
