Search Results for author: Ryo Hachiuma

Found 7 papers, 1 paper with code

A Two-Block RNN-based Trajectory Prediction from Incomplete Trajectory

no code implementations · 14 Mar 2022 · Ryo Fujii, Jayakorn Vongkulbhisal, Ryo Hachiuma, Hideo Saito

However, most works rely on a key assumption that each video is successfully preprocessed by detection and tracking algorithms and the complete observed trajectory is always available.

Imputation · Trajectory Prediction

RGB-D Image Inpainting Using Generative Adversarial Network with a Late Fusion Approach

no code implementations · 14 Oct 2021 · Ryo Fujii, Ryo Hachiuma, Hideo Saito

We extend conventional image inpainting methods to RGB-D image inpainting, jointly restoring the texture and geometry of missing regions from a pair of RGB and depth images.
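
The title points to a generative adversarial network with a late-fusion design. As an illustration only, the sketch below shows one way such a late-fusion generator could be organized: separate RGB and depth encoder branches whose features are merged shortly before joint RGB and depth output heads. All layer sizes, the mask convention, and the names LateFusionGenerator and conv_block are assumptions for this sketch, not the paper's architecture.

```python
# Hypothetical sketch of a late-fusion inpainting generator: RGB and depth
# are encoded by separate branches and merged only near the output, so the
# texture and geometry of missing regions are restored jointly. Layer sizes
# and names are illustrative, not taken from the paper.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class LateFusionGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        # Separate encoders: 3-channel RGB and 1-channel depth, each with the mask appended.
        self.rgb_enc = nn.Sequential(conv_block(4, 32), conv_block(32, 64))
        self.depth_enc = nn.Sequential(conv_block(2, 32), conv_block(32, 64))
        # Late fusion: features are concatenated only after both branches.
        self.fusion = conv_block(128, 64)
        # Joint heads predict the completed RGB image and depth map.
        self.rgb_head = nn.Conv2d(64, 3, kernel_size=1)
        self.depth_head = nn.Conv2d(64, 1, kernel_size=1)

    def forward(self, rgb, depth, mask):
        # mask: 1 where pixels are missing, 0 where observed.
        f_rgb = self.rgb_enc(torch.cat([rgb * (1 - mask), mask], dim=1))
        f_depth = self.depth_enc(torch.cat([depth * (1 - mask), mask], dim=1))
        fused = self.fusion(torch.cat([f_rgb, f_depth], dim=1))
        return self.rgb_head(fused), self.depth_head(fused)

# Example: inpaint a 256x256 RGB-D pair with a random missing-region mask.
rgb = torch.rand(1, 3, 256, 256)
depth = torch.rand(1, 1, 256, 256)
mask = (torch.rand(1, 1, 256, 256) > 0.8).float()
out_rgb, out_depth = LateFusionGenerator()(rgb, depth, mask)
```

In a full GAN setup, a discriminator and reconstruction plus adversarial losses would sit on top of a generator like this; the sketch only covers the late-fusion structure.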

Image Inpainting · Object Detection +1

Dynamics-Regulated Kinematic Policy for Egocentric Pose Estimation

1 code implementation · NeurIPS 2021 · Zhengyi Luo, Ryo Hachiuma, Ye Yuan, Kris Kitani

By comparing the pose instructed by the kinematic model against the pose generated by the dynamics model, we can use their misalignment to further improve the kinematic model.
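
As a rough illustration of that feedback loop, the sketch below uses a placeholder kinematic model, a stand-in dynamics step, and a small correction network; the names simulate_dynamics, correction_net, and POSE_DIM, as well as all dimensions, are hypothetical and do not reflect the authors' implementation.

```python
# Hypothetical sketch of the dynamics-regulated idea: the kinematic model
# proposes a pose, a physics (dynamics) model executes it, and the residual
# between the two is fed back to refine the kinematic prediction at the next
# step. The models here are stand-ins, not the authors' networks.
import torch
import torch.nn as nn

POSE_DIM = 69  # e.g. SMPL-style joint angles; dimensionality is illustrative

kinematic_model = nn.GRUCell(POSE_DIM, POSE_DIM)    # proposes the next pose
correction_net = nn.Linear(2 * POSE_DIM, POSE_DIM)  # maps misalignment to a pose update

def simulate_dynamics(target_pose):
    """Stand-in for a physics simulator tracking the target pose."""
    return target_pose + 0.01 * torch.randn_like(target_pose)

pose = torch.zeros(1, POSE_DIM)
for t in range(10):
    kin_pose = kinematic_model(pose, pose)   # pose instructed by the kinematic model
    dyn_pose = simulate_dynamics(kin_pose)   # pose generated by the dynamics model
    misalignment = dyn_pose - kin_pose       # their disagreement
    # Use the misalignment to refine the kinematic prediction for the next step.
    pose = kin_pose + correction_net(torch.cat([kin_pose, misalignment], dim=-1))
```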

Egocentric Pose Estimation · Human-Object Interaction Detection +1

Kinematics-Guided Reinforcement Learning for Object-Aware 3D Ego-Pose Estimation

no code implementations · 10 Nov 2020 · Zhengyi Luo, Ryo Hachiuma, Ye Yuan, Shun Iwase, Kris M. Kitani

We propose a method for incorporating object interaction and human body dynamics into the task of 3D ego-pose estimation using a head-mounted camera.

Human-Object Interaction Detection · Pose Estimation +1

DetectFusion: Detecting and Segmenting Both Known and Unknown Dynamic Objects in Real-time SLAM

no code implementations · 22 Jul 2019 · Ryo Hachiuma, Christian Pirchheim, Dieter Schmalstieg, Hideo Saito

We present DetectFusion, an RGB-D SLAM system that runs in real-time and can robustly handle semantically known and unknown objects that can move dynamically in the scene.
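
One part of such a system that a short sketch can convey is the exclusion of detected dynamic objects from camera tracking. The example below, with a hypothetical mask_dynamic_objects helper, assumes per-object segmentation masks are available and simply invalidates those depth pixels before the tracking step; it illustrates the general idea, not DetectFusion's actual pipeline.

```python
# Hypothetical sketch of excluding dynamic objects from SLAM camera tracking:
# pixels covered by detected object masks (known classes or class-agnostic
# "unknown" segments) are invalidated in the depth map before pose estimation,
# so moving objects do not corrupt tracking. Everything except the masking
# step itself is a placeholder.
import numpy as np

def mask_dynamic_objects(depth, object_masks):
    """Zero out depth pixels belonging to any detected dynamic object."""
    depth = depth.copy()
    for mask in object_masks:   # each mask: boolean H x W array
        depth[mask] = 0.0       # depth of 0 is treated as invalid by tracking
    return depth

# Example frame: 480x640 depth map with two detected object masks.
depth = np.random.uniform(0.5, 5.0, size=(480, 640)).astype(np.float32)
known_mask = np.zeros((480, 640), dtype=bool)
known_mask[100:200, 150:300] = True        # e.g. a detected object of a known class
unknown_mask = np.zeros((480, 640), dtype=bool)
unknown_mask[300:400, 400:500] = True      # e.g. a class-agnostic moving segment

static_depth = mask_dynamic_objects(depth, [known_mask, unknown_mask])
# static_depth would then be handed to the camera-tracking step (e.g. ICP),
# while the masked objects are tracked and reconstructed separately.
```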

2D Object Detection · Frame +4
