Search Results for author: Diogo Luvizon

Found 10 papers, 4 papers with code

EventEgo3D: 3D Human Motion Capture from Egocentric Event Streams

2 code implementations • 12 Apr 2024 • Christen Millerdurai, Hiroyasu Akada, Jian Wang, Diogo Luvizon, Christian Theobalt, Vladislav Golyanik

In response to the existing limitations, this paper 1) introduces a new problem, i.e., 3D human motion capture from an egocentric monocular event camera with a fisheye lens, and 2) proposes the first approach to it, called EventEgo3D (EE3D).

3D Pose Estimation 3D Reconstruction +1

3D Pose Estimation of Two Interacting Hands from a Monocular Event Camera

1 code implementation • 21 Dec 2023 • Christen Millerdurai, Diogo Luvizon, Viktor Rudnev, André Jonas, Jiayi Wang, Christian Theobalt, Vladislav Golyanik

3D hand tracking from a monocular video is a very challenging problem due to hand interactions, occlusions, left-right hand ambiguity, and fast motion.

3D Pose Estimation 3D Reconstruction

Relightable Neural Actor with Intrinsic Decomposition and Pose Control

no code implementations • 18 Dec 2023 • Diogo Luvizon, Vladislav Golyanik, Adam Kortylewski, Marc Habermann, Christian Theobalt

Creating a digital human avatar that is relightable, drivable, and photorealistic is a challenging and important problem in Vision and Graphics.

Egocentric Whole-Body Motion Capture with FisheyeViT and Diffusion-Based Motion Refinement

no code implementations • 28 Nov 2023 • Jian Wang, Zhe Cao, Diogo Luvizon, Lingjie Liu, Kripasindhu Sarkar, Danhang Tang, Thabo Beeler, Christian Theobalt

In this work, we explore egocentric whole-body motion capture using a single fisheye camera, which simultaneously estimates human body and hand motion.

Ranked #1 on Egocentric Pose Estimation on GlobalEgoMocap Test Dataset (using extra training data)

Egocentric Pose Estimation Hand Detection +2

Scene-Aware 3D Multi-Human Motion Capture from a Single Camera

1 code implementation • 12 Jan 2023 • Diogo Luvizon, Marc Habermann, Vladislav Golyanik, Adam Kortylewski, Christian Theobalt

In this work, we consider the problem of estimating the 3D position of multiple humans in a scene as well as their body shape and articulation from a single RGB video recorded with a static camera.

Position

Scene-aware Egocentric 3D Human Pose Estimation

1 code implementation • CVPR 2023 • Jian Wang, Lingjie Liu, Weipeng Xu, Kripasindhu Sarkar, Diogo Luvizon, Christian Theobalt

To this end, we propose an egocentric depth estimation network to predict the scene depth map from a wide-view egocentric fisheye camera while mitigating the occlusion of the human body with a depth-inpainting network.

Ranked #3 on Egocentric Pose Estimation on GlobalEgoMocap Test Dataset (using extra training data)

Depth Estimation Egocentric Pose Estimation

HandFlow: Quantifying View-Dependent 3D Ambiguity in Two-Hand Reconstruction with Normalizing Flow

no code implementations • 4 Oct 2022 • Jiayi Wang, Diogo Luvizon, Franziska Mueller, Florian Bernard, Adam Kortylewski, Dan Casas, Christian Theobalt

Through this, we demonstrate the quality of our probabilistic reconstruction and show that explicit ambiguity modeling is better suited to this challenging problem.


Estimating Egocentric 3D Human Pose in the Wild with External Weak Supervision

no code implementations • CVPR 2022 • Jian Wang, Lingjie Liu, Weipeng Xu, Kripasindhu Sarkar, Diogo Luvizon, Christian Theobalt

Specifically, we first generate pseudo-labels for the EgoPW dataset with a spatio-temporal optimization method that incorporates the external-view supervision.

Ranked #4 on Egocentric Pose Estimation on GlobalEgoMocap Test Dataset (using extra training data)

Egocentric Pose Estimation

SSP-Net: Scalable Sequential Pyramid Networks for Real-Time 3D Human Pose Regression

no code implementations • 4 Sep 2020 • Diogo Luvizon, Hedi Tabia, David Picard

In this paper, we propose a highly scalable, end-to-end-trainable convolutional neural network for real-time 3D human pose regression from still RGB images.

3D Human Pose Estimation 3D Pose Estimation +1
