A Neuro-Symbolic Humanlike Arm Controller for Sophia the Robot

27 Oct 2020 · David Hanson, Alishba Imran, Abhinandan Vellanki, Sanjeew Kanagaraj

We outline the design and construction of novel robotic arms using machine perception, convolutional neural networks, and symbolic AI for logical control and affordance indexing. We describe our robotic arms, built with a humanlike mechanical configuration and aesthetic, with 28 degrees of freedom, touch sensors, and series elastic actuators. The arms were modelled in Roodle and Gazebo using URDF models, as well as in Unity, and implement motion control solutions for live games of Baccarat (the casino card game) and rock paper scissors, as well as for handshaking and drawing. This includes live interactions with people, incorporating both social control of the hands and facial gestures, and physical inverse kinematics (IK) for grasping and manipulation tasks. The resulting framework is an integral part of the Sophia 2020 alpha platform, which is being used in ongoing research in the authors' work with team AHAM, an ANA Avatar XPRIZE effort towards human-AI hybrid telepresence. These results are available on the broadly released Hanson Robotics Sophia 2020 robot platform for users to test and extend.
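As a rough sketch of the neuro-symbolic control flow described above (a perception stage labelling objects, a symbolic affordance index mapping labels to candidate actions, and an IK solve producing joint targets), the following Python snippet illustrates the idea. All names here (Percept, AFFORDANCES, solve_ik, control_step) and the affordance table itself are illustrative assumptions, not part of the released Sophia 2020 codebase; the CNN perception and the 28-DoF IK solver are stubbed out.

# Hypothetical sketch of a neuro-symbolic arm control loop: a perception
# stand-in labels an object, a symbolic affordance table maps the label to
# candidate actions, and a stub IK solver returns joint targets.
# Names and structure are illustrative only.
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class Percept:
    label: str       # e.g. "playing_card", "human_hand", "pen" (from the CNN)
    position: tuple  # (x, y, z) in the arm's base frame

# Symbolic affordance index: object label -> ordered candidate actions.
AFFORDANCES: Dict[str, List[str]] = {
    "playing_card": ["pinch_grasp", "slide"],
    "human_hand":   ["handshake"],
    "pen":          ["pinch_grasp", "draw"],
}

def solve_ik(target_xyz: tuple) -> List[float]:
    """Placeholder for the physical IK solver (28 DoF per the paper)."""
    return [0.0] * 28  # joint angles, stubbed out

def choose_action(percept: Percept) -> str:
    # Logical layer: pick the highest-priority affordance for the label.
    candidates = AFFORDANCES.get(percept.label, [])
    return candidates[0] if candidates else "idle"

def control_step(percept: Percept) -> Dict[str, Optional[object]]:
    action = choose_action(percept)
    joints = solve_ik(percept.position) if action != "idle" else None
    return {"action": action, "joint_targets": joints}

if __name__ == "__main__":
    # Example: a detected human hand triggers the handshake behaviour.
    print(control_step(Percept("human_hand", (0.4, 0.1, 0.9))))

In the actual system, choose_action would also gate social behaviours (facial gestures, gaze) alongside the manipulation primitive; this sketch only shows the object-to-affordance-to-IK path.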
