To combat this, our approach uses a simple yet effective rule-based fallback layer that performs sanity checks on an ML planner's decisions (e.g., avoiding collisions, ensuring physical feasibility).
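As a rough illustration, such a fallback layer can be reduced to a handful of deterministic checks. The sketch below assumes 2-D waypoint trajectories; the function names, clearance margin, and acceleration limit are illustrative assumptions, not taken from the paper.

    import numpy as np

    def passes_sanity_checks(trajectory, obstacles, clearance=1.5,
                             max_accel=3.0, dt=0.1):
        """Rule-based checks on a planned trajectory of (x, y) waypoints."""
        # Collision check: every waypoint keeps a minimum clearance from obstacles.
        for point in trajectory:
            if any(np.linalg.norm(point - obs) < clearance for obs in obstacles):
                return False
        # Physical feasibility: finite-difference acceleration stays within limits.
        vel = np.diff(trajectory, axis=0) / dt
        acc = np.diff(vel, axis=0) / dt
        return not np.any(np.linalg.norm(acc, axis=1) > max_accel)

    def plan_with_fallback(ml_trajectory, fallback_trajectory, obstacles):
        """Accept the ML planner's output only if it passes the sanity checks."""
        if passes_sanity_checks(ml_trajectory, obstacles):
            return ml_trajectory
        return fallback_trajectory

The key design point is that the checks are deterministic and auditable, so the learned planner can be validated independently of its training distribution.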
Despite the numerous successes of machine learning over the past decade (image recognition, decision-making, NLP, image synthesis), self-driving technology has not yet followed the same trend.
Motivated by the impact of large-scale datasets on ML systems, we present the largest self-driving dataset for motion prediction to date, containing over 1,000 hours of data.
We present PointFusion, a generic 3D object detection method that leverages both image and 3D point cloud information.
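A minimal sketch of this kind of two-stream fusion follows. It is not PointFusion's actual architecture (the paper uses a CNN image encoder and a PointNet point backbone with a dense fusion head); here stand-in MLP encoders and a simple concatenation head illustrate the idea, and all dimensions are arbitrary assumptions.

    import torch
    import torch.nn as nn

    class TwoStreamFusion(nn.Module):
        """Illustrative fusion of image-crop features with raw point features."""
        def __init__(self, img_pixels=3 * 32 * 32, img_dim=128, pt_dim=64):
            super().__init__()
            self.img_encoder = nn.Sequential(nn.Linear(img_pixels, img_dim),
                                             nn.ReLU())   # stand-in for a CNN
            self.point_encoder = nn.Sequential(nn.Linear(3, pt_dim),
                                               nn.ReLU()) # stand-in for PointNet
            self.box_head = nn.Linear(img_dim + pt_dim, 7)  # (x, y, z, w, l, h, yaw)

        def forward(self, img_crop, points):
            img_feat = self.img_encoder(img_crop.flatten(1))         # (B, img_dim)
            pt_feat = self.point_encoder(points).amax(dim=1)         # (B, pt_dim), global max-pool
            return self.box_head(torch.cat([img_feat, pt_feat], 1)) # (B, 7) box parameters

    # Usage: a batch of 2 image crops (3x32x32) with 256 LiDAR points each.
    boxes = TwoStreamFusion()(torch.rand(2, 3, 32, 32), torch.rand(2, 256, 3))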
We consider the problem of learning preferences over trajectories for mobile manipulators such as personal robots and assembly line robots.
We introduce a diverse data set with 1,180 miles of natural freeway and city driving, and show that we can anticipate maneuvers 3.5 seconds before they occur, in real time, with a precision and recall of 90.5% and 87.4%, respectively.
The proposed method is generic and principled: it can transform any spatio-temporal graph by applying a well-defined set of steps.
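As a hedged sketch of those steps, the snippet below instantiates one RNN per edge type and one per node type, and sums each edge RNN's output into the destination node's input. This follows the spirit of a structural-RNN factorization but omits details such as temporal edges; all names, the "type:id" node-id convention, and the dimensions are assumptions.

    import torch
    import torch.nn as nn

    class StructuralRNNSketch(nn.Module):
        """One GRU per edge type and per node type; edge outputs are summed
        into the destination node's input (a simplified structural RNN)."""
        def __init__(self, node_types, edge_types, feat_dim=8, hid=16):
            super().__init__()
            self.hid = hid
            self.edge_rnns = nn.ModuleDict(
                {e: nn.GRU(2 * feat_dim, hid, batch_first=True) for e in edge_types})
            self.node_rnns = nn.ModuleDict(
                {n: nn.GRU(feat_dim + hid, hid, batch_first=True) for n in node_types})

        def forward(self, node_feats, edges):
            # node_feats: {"type:id": (T, feat_dim)}; edges: [(src, dst, edge_type)]
            T = next(iter(node_feats.values())).shape[0]
            msgs = {nid: torch.zeros(T, self.hid) for nid in node_feats}
            for src, dst, etype in edges:
                pair = torch.cat([node_feats[src], node_feats[dst]], 1).unsqueeze(0)
                out, _ = self.edge_rnns[etype](pair)     # (1, T, hid)
                msgs[dst] = msgs[dst] + out.squeeze(0)   # accumulate spatial messages
            hidden = {}
            for nid, feats in node_feats.items():
                inp = torch.cat([feats, msgs[nid]], 1).unsqueeze(0)
                out, _ = self.node_rnns[nid.split(":")[0]](inp)
                hidden[nid] = out.squeeze(0)             # (T, hid) per-node representation
            return hidden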
We introduce a sensory-fusion architecture which jointly learns to anticipate and fuse information from multiple sensory streams.
We evaluate our approach on a diverse data set with 1,180 miles of natural freeway and city driving, and show that we can anticipate maneuvers 3.5 seconds before they occur, in real time, with an F1-score of over 80%.
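To make the fusion idea concrete, here is a minimal per-stream-LSTM sketch. It is not the paper's exact architecture (which, among other refinements, trains with a loss that increasingly penalizes late predictions); the stream names and dimensions are assumptions.

    import torch
    import torch.nn as nn

    class FusionAnticipationRNN(nn.Module):
        """One LSTM per sensory stream; a fusion LSTM predicts the maneuver."""
        def __init__(self, stream_dims=(12, 2, 1), hid=64, n_maneuvers=5):
            super().__init__()
            self.stream_rnns = nn.ModuleList(
                [nn.LSTM(d, hid, batch_first=True) for d in stream_dims])
            self.fusion_rnn = nn.LSTM(hid * len(stream_dims), hid, batch_first=True)
            self.classifier = nn.Linear(hid, n_maneuvers)

        def forward(self, streams):
            # streams: one (B, T, d_i) tensor per sensor, e.g. driver-face
            # features, GPS/map features, vehicle speed.
            encoded = [rnn(x)[0] for rnn, x in zip(self.stream_rnns, streams)]
            fused, _ = self.fusion_rnn(torch.cat(encoded, dim=2))
            return self.classifier(fused[:, -1])  # maneuver logits at the last step

    # Usage: batch of 4 sequences, 20 time steps per stream.
    logits = FusionAnticipationRNN()([torch.rand(4, 20, 12),
                                      torch.rand(4, 20, 2),
                                      torch.rand(4, 20, 1)])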
In this paper we introduce a knowledge engine, which learns and shares knowledge representations, for robots to carry out a variety of tasks.
We represent trajectory preferences using a cost function that the robot learns and then uses to generate good trajectories in new environments.
In this paper, we propose a co-active online learning framework for teaching robots the preferences of their users for object manipulation tasks.
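The core of co-active learning admits a very small sketch: the robot scores candidate trajectories with a learned linear function, the user nudges the proposal toward a slightly better candidate, and a perceptron-style update shifts the weights toward that improvement. The feature dimension and the simulated user below are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    w_true = np.array([1.0, -0.5, 0.3, 0.2])  # hidden user preference (simulation only)
    w = np.zeros(4)                           # robot's learned preference weights

    def coactive_update(w, phi_proposed, phi_improved, lr=0.1):
        """Perceptron-style step: move weights toward the user's improvement."""
        return w + lr * (phi_improved - phi_proposed)

    for _ in range(200):
        phi = rng.normal(size=(10, 4))            # features of 10 candidate trajectories
        proposed = phi[np.argmax(phi @ w)]        # robot's best under current weights
        improved = phi[np.argmax(phi @ w_true)]   # simulated user correction
        w = coactive_update(w, proposed, improved)

    # Cosine similarity between learned and true weights approaches 1.
    print(np.dot(w, w_true) / (np.linalg.norm(w) * np.linalg.norm(w_true)))

Notably, the update only requires the user to supply a slightly better trajectory, not an optimal one, which is what makes the feedback loop practical for non-expert users.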