Search Results for author: Mariko Isogawa

Found 5 papers, 1 paper with code

Scapegoat Generation for Privacy Protection from Deepfake

no code implementations • 6 Mar 2023 • Gido Kato, Yoshihiro Fukuhara, Mariko Isogawa, Hideki Tsunashima, Hirokatsu Kataoka, Shigeo Morishima

To protect privacy and prevent malicious use of deepfakes, current studies propose methods that interfere with the generation process, such as detection and destruction approaches.

Face Swapping

Listening Human Behavior: 3D Human Pose Estimation With Acoustic Signals

no code implementations • CVPR 2023 • Yuto Shibata, Yutaka Kawashima, Mariko Isogawa, Go Irie, Akisato Kimura, Yoshimitsu Aoki

Aiming to capture subtle sound changes to reveal detailed pose information, we explicitly extract phase features from the acoustic signals together with typical spectrum features and feed them into our human pose estimation network.
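As a rough illustration of how phase features can be extracted alongside typical spectrum features, here is a minimal NumPy sketch of a short-time Fourier analysis. The window choice, frame sizes, and feature layout are assumptions for illustration, not the authors' exact pipeline.

```python
import numpy as np

def stft_features(signal, frame_len=512, hop=128):
    """Extract magnitude (spectrum) and phase features from an acoustic signal
    via a short-time FFT. Returns two (n_frames, n_bins) arrays."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    mags, phases = [], []
    for i in range(n_frames):
        frame = signal[i * hop : i * hop + frame_len] * window
        spec = np.fft.rfft(frame)
        mags.append(np.abs(spec))      # typical spectrum feature
        phases.append(np.angle(spec))  # phase feature, in radians
    return np.stack(mags), np.stack(phases)

# Example: a 1 kHz tone sampled at 16 kHz for one second
t = np.arange(16000) / 16000.0
mag, phase = stft_features(np.sin(2 * np.pi * 1000 * t))
```

Both feature maps could then be concatenated and fed into a pose estimation network; the phase carries timing information that magnitude alone discards.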

3D Human Pose Estimation

Bilateral Video Magnification Filter

no code implementations • CVPR 2022 • Shoichiro Takeda, Kenta Niwa, Mariko Isogawa, Shinya Shimizu, Kazuki Okami, Yushi Aono

Eulerian video magnification (EVM) has progressed to magnify subtle motions with a target frequency even under the presence of large motions of objects.
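To make the EVM idea concrete, here is a minimal sketch of its core step: temporally bandpass-filter each pixel's intensity signal and add the amplified band back. This uses an ideal FFT bandpass on a grayscale clip; the published method's filter design, spatial pyramid, and the paper's bilateral extension are omitted, and all parameter values are illustrative assumptions.

```python
import numpy as np

def eulerian_magnify(frames, fps, f_lo, f_hi, alpha):
    """Amplify subtle temporal variations within [f_lo, f_hi] Hz.

    frames: grayscale video as an array of shape (T, H, W).
    Applies an ideal temporal bandpass per pixel, then adds the
    filtered signal back scaled by alpha (the magnification factor).
    """
    T = frames.shape[0]
    freqs = np.fft.rfftfreq(T, d=1.0 / fps)
    spec = np.fft.rfft(frames, axis=0)        # temporal FFT per pixel
    band = (freqs >= f_lo) & (freqs <= f_hi)  # ideal bandpass mask
    filtered = np.fft.irfft(spec * band[:, None, None], n=T, axis=0)
    return frames + alpha * filtered          # magnified video

# Tiny synthetic example: one pixel oscillating subtly at 2 Hz, 30 fps
T = 90
t = np.arange(T) / 30.0
video = np.full((T, 4, 4), 0.5)
video[:, 2, 2] += 0.01 * np.sin(2 * np.pi * 2.0 * t)
out = eulerian_magnify(video, fps=30, f_lo=1.0, f_hi=3.0, alpha=10.0)
```

The oscillating pixel's motion is boosted roughly (1 + alpha)-fold, while static pixels (whose energy sits at DC, outside the band) are left unchanged, which is what makes the approach fragile under large object motions that the abstract above refers to.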

Unity

Efficient Non-Line-of-Sight Imaging from Transient Sinograms

no code implementations • ECCV 2020 • Mariko Isogawa, Dorian Chan, Ye Yuan, Kris Kitani, Matthew O'Toole

Non-line-of-sight (NLOS) imaging techniques use light that diffusely reflects off of visible surfaces (e.g., walls) to see around corners.

Optical Non-Line-of-Sight Physics-based 3D Human Pose Estimation

1 code implementation • CVPR 2020 • Mariko Isogawa, Ye Yuan, Matthew O'Toole, Kris Kitani

We bring together a diverse set of technologies from NLOS imaging, human pose estimation and deep reinforcement learning to construct an end-to-end data processing pipeline that converts a raw stream of photon measurements into a full 3D human pose sequence estimate.

3D Human Pose Estimation • Humanoid Control • +1
