Search Results for author: Walterio W. Mayol-Cuevas

Found 4 papers, 0 papers with code

Rebellion and Disobedience as Useful Tools in Human-Robot Interaction Research -- The Handheld Robotics Case

no code implementations • 8 May 2022 • Walterio W. Mayol-Cuevas

This position paper argues for the utility of rebellion and disobedience (RaD) in human-robot interaction (HRI).

Position

The Object at Hand: Automated Editing for Mixed Reality Video Guidance from Hand-Object Interactions

no code implementations • 29 Sep 2021 • Yao Lu, Walterio W. Mayol-Cuevas

In this paper, we address the problem of automatically extracting the steps that compose real-life hand activities.

Mixed Reality • Object

Egocentric Hand-object Interaction Detection and Application

no code implementations • 29 Sep 2021 • Yao Lu, Walterio W. Mayol-Cuevas

We compare our method with the most recent work from Shan et al. \cite{Shan20} on selected images from the EPIC-KITCHENS \cite{damen2018scaling} dataset and achieve $89\%$ accuracy on HOI (hand-object interaction) detection, which is comparable to Shan's ($92\%$).

Object

Understanding Egocentric Hand-Object Interactions from Hand Pose Estimation

no code implementations • 29 Sep 2021 • Yao Lu, Walterio W. Mayol-Cuevas

To show that our method preserves semantic information, we also report the performance of grasp type classification on the GUN-71 dataset and outperform the benchmark using only the predicted 3D hand pose.

Hand Pose Estimation • Object
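
To illustrate the kind of pipeline this last abstract describes (classifying grasp type from predicted 3D hand pose alone), the following is a minimal sketch, not the authors' implementation: the 21-joint hand layout, the pairwise-distance features, and the SVM classifier are assumptions, and the data here is a stand-in for GUN-71 labels and predicted poses.

```python
# Minimal sketch (assumed, not the authors' method): grasp-type
# classification using only predicted 3D hand joints.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def pose_features(joints_3d: np.ndarray) -> np.ndarray:
    """Turn a (21, 3) array of predicted 3D hand joints into a
    translation/scale-normalised vector of pairwise joint distances."""
    centered = joints_3d - joints_3d.mean(axis=0, keepdims=True)
    centered /= np.linalg.norm(centered) + 1e-8
    dists = np.linalg.norm(centered[:, None, :] - centered[None, :, :], axis=-1)
    iu = np.triu_indices(dists.shape[0], k=1)
    return dists[iu]  # 210-dimensional descriptor

# Stand-in data: replace with predicted hand poses and GUN-71 grasp labels.
rng = np.random.default_rng(0)
poses = rng.normal(size=(200, 21, 3))    # predicted 3D hand poses
labels = rng.integers(0, 71, size=200)   # 71 grasp categories in GUN-71

X = np.stack([pose_features(p) for p in poses])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X[:150], labels[:150])
print("held-out accuracy:", clf.score(X[150:], labels[150:]))
```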
