Search Results for author: Takuya Kiyokawa

Found 3 papers, 0 papers with code

Bounding Box Annotation with Visible Status

no code implementations · 11 Apr 2023 · Takuya Kiyokawa, Naoki Shirakura, Hiroki Katayama, Keita Tomochika, Jun Takamatsu

However, because the previous method relied on moving the object within the capturing range using a fixed-point camera, the collected image dataset was limited in terms of capturing viewpoints.


Robotic Waste Sorter with Agile Manipulation and Quickly Trainable Detector

no code implementations · 2 Apr 2021 · Takuya Kiyokawa, Hiroki Katayama, Yuya Tatsuta, Jun Takamatsu, Tsukasa Ogasawara

Via experiments in an indoor experimental workplace for waste sorting, we confirm that the proposed methods enable quick collection of training image sets for three classes of waste items (i.e., aluminum cans, glass bottles, and plastic bottles) and achieve higher detection performance than methods that do not consider the differences.

Learning-from-Observation Framework: One-Shot Robot Teaching for Grasp-Manipulation-Release Household Operations

no code implementations · 4 Aug 2020 · Naoki Wake, Riku Arakawa, Iori Yanokura, Takuya Kiyokawa, Kazuhiro Sasabuchi, Jun Takamatsu, Katsushi Ikeuchi

In the context of one-shot robot teaching, the contribution of the paper is to propose a framework that 1) covers various tasks in the grasp-manipulation-release class of household operations and 2) mimics human postures during the operations.

Robotics · Human-Computer Interaction
