Search Results for author: Hiroshi Ito

Found 4 papers, 0 papers with code

Visual Spatial Attention and Proprioceptive Data-Driven Reinforcement Learning for Robust Peg-in-Hole Task Under Variable Conditions

no code implementations • 27 Dec 2023 • André Yuji Yasutomi, Hideyuki Ichiwara, Hiroshi Ito, Hiroki Mori, Tetsuya OGATA

In this study, we introduce a vision and proprioceptive data-driven robot control model for this task that is robust to challenging lighting and hole surface conditions.
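
As a rough illustration of how visual spatial attention can be fused with proprioceptive input in such a control model, the sketch below is an assumption rather than the authors' implementation: the layer sizes, the spatial-softmax attention, and the input names `image` and `joint_state` are all made up for illustration.

```python
# Minimal sketch (assumed architecture, not the paper's): image features are
# reduced to soft attention coordinates and fused with proprioception.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialSoftmax(nn.Module):
    """Reduce each feature map to an expected (x, y) attention coordinate."""
    def forward(self, feat):                      # feat: (B, C, H, W)
        b, c, h, w = feat.shape
        probs = F.softmax(feat.view(b, c, -1), dim=-1).view(b, c, h, w)
        xs = torch.linspace(-1, 1, w, device=feat.device)
        ys = torch.linspace(-1, 1, h, device=feat.device)
        ex = (probs.sum(dim=2) * xs).sum(dim=-1)  # expected x per channel
        ey = (probs.sum(dim=3) * ys).sum(dim=-1)  # expected y per channel
        return torch.cat([ex, ey], dim=-1)        # (B, 2*C) attention points

class PegInHolePolicy(nn.Module):
    """Hypothetical policy: visual attention points + joint state -> action."""
    def __init__(self, n_joints=7, n_actions=7):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
        )
        self.attend = SpatialSoftmax()
        self.policy = nn.Sequential(
            nn.Linear(2 * 32 + n_joints, 128), nn.ReLU(),
            nn.Linear(128, n_actions),
        )

    def forward(self, image, joint_state):
        points = self.attend(self.encoder(image))  # where the model looks
        return self.policy(torch.cat([points, joint_state], dim=-1))
```

Attending to a few image coordinates instead of the full frame is one common way such models stay robust to lighting and surface changes, since only the attended regions influence the action.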

Realtime Motion Generation with Active Perception Using Attention Mechanism for Cooking Robot

no code implementations • 26 Sep 2023 • Namiko Saito, Mayu Hiramoto, Ayuna Kubo, Kanata Suzuki, Hiroshi Ito, Shigeki SUGANO, Tetsuya OGATA

We tackled the task of cooking scrambled eggs with real ingredients, in which the robot needs to perceive the state of the egg and adjust its stirring movements in real time, while the egg is heated and its state changes continuously.

Deep Active Visual Attention for Real-time Robot Motion Generation: Emergence of Tool-body Assimilation and Adaptive Tool-use

no code implementations • 29 Jun 2022 • Hyogo Hiruma, Hiroshi Ito, Hiroki Mori, Tetsuya OGATA

The model incorporates a state-driven active top-down visual attention module, which acquires attention points whose targets can actively shift depending on the task state.
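
A minimal sketch of what a state-driven top-down attention module could look like is given below; this is an assumption for illustration, not the paper's architecture. A task-state vector (e.g., an RNN hidden state, here just `state`) predicts an attention point, and image features are re-weighted by a Gaussian mask centered on that point, so attention targets follow the task state rather than the image alone.

```python
# Hypothetical state-driven top-down attention (assumed design, not the paper's).
import torch
import torch.nn as nn

class TopDownAttention(nn.Module):
    def __init__(self, state_dim=64):
        super().__init__()
        self.to_point = nn.Linear(state_dim, 2)   # task state -> attention (x, y)

    def forward(self, feat, state, sigma=0.3):    # feat: (B, C, H, W)
        b, c, h, w = feat.shape
        xy = torch.tanh(self.to_point(state))     # attention target in [-1, 1]^2
        xs = torch.linspace(-1, 1, w, device=feat.device).view(1, 1, w)
        ys = torch.linspace(-1, 1, h, device=feat.device).view(1, h, 1)
        d2 = (xs - xy[:, 0].view(-1, 1, 1)) ** 2 + (ys - xy[:, 1].view(-1, 1, 1)) ** 2
        mask = torch.exp(-d2 / (2 * sigma ** 2)).unsqueeze(1)  # (B, 1, H, W)
        return feat * mask, xy                    # gated features + attention point
```

Because the attention point is a function of the task state, the same image can be attended differently at different phases of the task, which is one way tool-use behavior like tool-body assimilation could emerge.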

Enhanced precision of circadian rhythm by output system

no code implementations • 2 Jun 2022 • Hotaka Kaji, Fumito Mori, Hiroshi Ito

Circadian rhythms are biological rhythms of approximately 24 h that persist even under constant conditions without environmental daily cues.

