Search Results for author: Thomas Kollar

Found 10 papers, 4 papers with code

Language-Driven Representation Learning for Robotics

2 code implementations • 24 Feb 2023 • Siddharth Karamcheti, Suraj Nair, Annie S. Chen, Thomas Kollar, Chelsea Finn, Dorsa Sadigh, Percy Liang

First, we demonstrate that existing representations yield inconsistent results across these tasks: masked autoencoding approaches pick up on low-level spatial features at the cost of high-level semantics, while contrastive learning approaches capture the opposite.

Contrastive Learning • Imitation Learning +2
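The trade-off noted in the abstract can be made concrete with a minimal InfoNCE-style contrastive loss, the standard objective behind the contrastive approaches compared in the paper. This is a generic sketch, not the authors' implementation; the function name and toy data are hypothetical:

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE contrastive loss: each anchor should match its own positive
    against all other positives in the batch (illustrative sketch)."""
    # L2-normalize embeddings so dot products are cosine similarities
    anchors = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    positives = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = anchors @ positives.T / temperature  # (N, N) similarity matrix
    # Cross-entropy with the diagonal (the matching pairs) as targets
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 16))
# Aligned anchor/positive pairs should score a lower loss than random pairs
loss_matched = info_nce_loss(emb, emb + 0.01 * rng.normal(size=(8, 16)))
loss_random = info_nce_loss(emb, rng.normal(size=(8, 16)))
print(loss_matched < loss_random)
```

The loss rewards embeddings that pull matched pairs together relative to the rest of the batch, which is why such objectives tend to capture high-level semantics rather than low-level spatial detail.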

SimNet: Enabling Robust Unknown Object Manipulation from Pure Synthetic Data via Stereo

1 code implementation • 30 Jun 2021 • Thomas Kollar, Michael Laskey, Kevin Stone, Brijen Thananjeyan, Mark Tjersland

However, the RGB-D baseline only grasps 35% of the hard (e.g., transparent) objects, while SimNet grasps 95%, suggesting that SimNet can enable robust manipulation of unknown objects, including transparent objects, in unknown environments.

Keypoint Detection • Object Detection +4

A Mobile Manipulation System for One-Shot Teaching of Complex Tasks in Homes

no code implementations • 30 Sep 2019 • Max Bajracharya, James Borders, Dan Helmick, Thomas Kollar, Michael Laskey, John Leichty, Jeremy Ma, Umashankar Nagarajan, Akiyoshi Ochiai, Josh Petersen, Krishna Shankar, Kevin Stone, Yutaka Takaoka

We describe a mobile manipulation hardware and software system capable of autonomously performing complex human-level tasks in real homes, after being taught the task with a single demonstration from a person in virtual reality.

Jointly Learning to Parse and Perceive: Connecting Natural Language to the Physical World

no code implementations • TACL 2013 • Jayant Krishnamurthy, Thomas Kollar

LSP learns physical representations for both categorical ("blue," "mug") and relational ("on") language, and also learns to compose these representations to produce the referents of entire statements.

Language Acquisition • Question Answering +1
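The composition described in the abstract, combining categorical and relational predicates to produce the referents of a statement, can be sketched with set denotations. This is a toy illustration; the scene, predicate names, and composition function are hypothetical, not LSP's learned model:

```python
# Toy scene: three objects with hypothetical attributes.
scene = [
    {"id": 1, "color": "blue", "type": "mug", "on": 3},
    {"id": 2, "color": "red", "type": "mug", "on": 3},
    {"id": 3, "color": "brown", "type": "table", "on": None},
]

# Categorical predicates map a word to the set of objects it denotes.
def blue(objs):  return {o["id"] for o in objs if o["color"] == "blue"}
def mug(objs):   return {o["id"] for o in objs if o["type"] == "mug"}
def table(objs): return {o["id"] for o in objs if o["type"] == "table"}

# Relational predicate: pairs (x, y) where x rests on y.
def on(objs):    return {(o["id"], o["on"]) for o in objs if o["on"] is not None}

def referents(cat_fns, rel_fn, target_fn, objs):
    """Compose predicates: intersect categories, then filter by the relation."""
    candidates = set.intersection(*(f(objs) for f in cat_fns))
    targets = target_fn(objs)
    return {x for (x, y) in rel_fn(objs) if x in candidates and y in targets}

# "the blue mug on the table"
print(referents([blue, mug], on, table, scene))  # → {1}
```

Here the predicates are hand-written; in LSP they are learned from perception, but the set-intersection style of composition conveys how word-level representations combine into the referent of a full statement.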
