Search Results for author: Muhammed Kocabas

Found 8 papers, 6 papers with code

Learning to Regress Bodies from Images using Differentiable Semantic Rendering

1 code implementation ICCV 2021 Sai Kumar Dwivedi, Nikos Athanasiou, Muhammed Kocabas, Michael J. Black

For minimally-clothed regions, we define the DSR-MC loss, which encourages a tight match between the rendered SMPL body and the minimally-clothed regions of the image.

Ranked #8 on 3D Human Pose Estimation on 3DPW (using extra training data)

3D Human Pose Estimation
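The DSR-MC idea above can be illustrated with a toy silhouette loss. This is a hypothetical numpy sketch, not the paper's implementation: the function name, mask shapes, and the plain per-pixel L2 are all stand-ins, and the actual DSR-MC loss is computed through a differentiable semantic renderer.

```python
import numpy as np

def dsr_mc_loss(rendered_silhouette, min_clothed_mask):
    """Hypothetical sketch: per-pixel L2 between a rendered body
    silhouette and the minimally-clothed region mask, both HxW arrays
    in [0, 1]. The real DSR-MC loss backpropagates through a
    differentiable semantic renderer rather than fixed masks."""
    diff = rendered_silhouette - min_clothed_mask
    return float(np.mean(diff ** 2))

# toy 4x4 example: the rendered body misses two mask pixels
render = np.zeros((4, 4)); render[1:3, 1:3] = 1.0
mask = np.zeros((4, 4)); mask[1:3, 1:4] = 1.0
loss = dsr_mc_loss(render, mask)  # 2 mismatched pixels / 16 = 0.125
```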

SPEC: Seeing People in the Wild with an Estimated Camera

1 code implementation ICCV 2021 Muhammed Kocabas, Chun-Hao P. Huang, Joachim Tesch, Lea Müller, Otmar Hilliges, Michael J. Black

We then train a novel network that concatenates the camera calibration to the image features and uses these together to regress 3D body shape and pose.

3D Multi-Person Pose Estimation
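The camera-conditioned regression described above can be sketched in a few lines. This is a minimal stand-in, assuming a linear head for clarity: the feature size, the (pitch, roll, field-of-view) calibration vector, and the 85-dimensional output (a common size for SMPL pose + shape + camera parameters) are illustrative assumptions, not SPEC's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def regress_body(image_features, camera_params, weights, bias):
    """Hypothetical sketch of the SPEC idea: concatenate the estimated
    camera calibration to the image features and regress body
    parameters from the joint vector (here with a linear head standing
    in for the network's regression layers)."""
    x = np.concatenate([image_features, camera_params])  # feature fusion
    return weights @ x + bias

feat = rng.standard_normal(2048)         # backbone features (assumed size)
cam = np.array([0.1, -0.05, 55.0])       # pitch, roll, vertical FoV (assumed)
W = rng.standard_normal((85, 2048 + 3))  # 85 ~ SMPL pose + shape + camera
b = np.zeros(85)
params = regress_body(feat, cam, W, b)
```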

PARE: Part Attention Regressor for 3D Human Body Estimation

1 code implementation ICCV 2021 Muhammed Kocabas, Chun-Hao P. Huang, Otmar Hilliges, Michael J. Black

Despite significant progress, we show that state-of-the-art 3D human pose and shape estimation methods remain sensitive to partial occlusion and can produce dramatically wrong predictions even though much of the body is observable.

3D Multi-Person Pose Estimation

Analytical Moment Regularizer for Training Robust Networks

no code implementations ICLR 2020 Modar Alfadly, Adel Bibi, Muhammed Kocabas, Bernard Ghanem

In this work, we propose a new training regularizer that aims to minimize the probabilistic expected training loss of a DNN subject to a generic Gaussian input.

Data Augmentation
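The quantity the regularizer above targets can be illustrated with a Monte Carlo estimate. Note the hedge: the paper derives this expectation in closed form (analytically) for the networks it studies; the sampling below is only a stand-in to show what is being minimized, and the toy squared-norm loss is an assumption for checkability.

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_loss_mc(loss_fn, x, sigma, n_samples=256):
    """Monte Carlo stand-in for the analytical moment regularizer:
    estimate E[loss(x + n)] with n ~ N(0, sigma^2 I). The paper
    computes this expectation in closed form instead of sampling."""
    noise = rng.standard_normal((n_samples, x.size)) * sigma
    return float(np.mean([loss_fn(x + eps) for eps in noise]))

# toy loss: squared norm; its exact Gaussian expectation is
# ||x||^2 + d * sigma^2 = 5 + 2 * 0.1^2 = 5.02
x = np.array([1.0, 2.0])
est = expected_loss_mc(lambda z: np.sum(z ** 2), x, sigma=0.1, n_samples=20000)
```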

MultiPoseNet: Fast Multi-Person Pose Estimation using Pose Residual Network

4 code implementations ECCV 2018 Muhammed Kocabas, Salih Karagoz, Emre Akbas

In this paper, we present MultiPoseNet, a novel bottom-up multi-person pose estimation architecture that combines a multi-task model with a novel assignment method.

Human Detection, Keypoint Detection +1
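The assignment step named above, matching detected keypoints to detected people, can be sketched with a simple geometric heuristic. This is a hypothetical stand-in: MultiPoseNet's Pose Residual Network *learns* the assignment, whereas the function below just picks the nearest box center.

```python
import numpy as np

def assign_keypoints(keypoints, boxes):
    """Hypothetical stand-in for the Pose Residual Network: assign each
    keypoint (x, y) to the person box (x1, y1, x2, y2) whose center is
    closest. The actual PRN learns this assignment from data."""
    centers = (boxes[:, :2] + boxes[:, 2:]) / 2.0  # (N, 2) box centers
    # pairwise distances: (K keypoints) x (N people)
    d = np.linalg.norm(keypoints[:, None, :] - centers[None, :, :], axis=-1)
    return d.argmin(axis=1)  # owning person index per keypoint

kps = np.array([[10.0, 10.0], [52.0, 48.0]])
boxes = np.array([[0.0, 0.0, 20.0, 20.0],     # person 0
                  [40.0, 40.0, 60.0, 60.0]])  # person 1
owners = assign_keypoints(kps, boxes)
```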
