Search Results for author: Patrick Yin

Found 5 papers, 1 paper with code

ASID: Active Exploration for System Identification in Robotic Manipulation

no code implementations • 18 Apr 2024 • Marius Memmel, Andrew Wagenmaker, Chuning Zhu, Patrick Yin, Dieter Fox, Abhishek Gupta

In this work, we propose a learning system that can leverage a small amount of real-world data to autonomously refine a simulation model and then plan an accurate control strategy that can be deployed in the real world.
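As a rough illustration of the pipeline described above (real data → refined simulator → planning), here is a minimal, hypothetical sketch of the system-identification step; the function names, the finite-difference update, and the `simulate(state, action, params)` interface are illustrative assumptions, not the authors' code.

```python
import numpy as np

def one_step_error(real_transitions, simulate, params):
    """Mean squared error between simulated and observed next states
    over a small set of real (state, action, next_state) transitions."""
    errors = [np.sum((simulate(s, a, params) - s_next) ** 2)
              for (s, a, s_next) in real_transitions]
    return float(np.mean(errors))

def fit_sim_params(real_transitions, simulate, init_params,
                   lr=1e-2, steps=100, h=1e-4):
    """Refine simulator parameters so that one-step simulated predictions
    match the real transitions (central-difference gradient descent)."""
    params = np.array(init_params, dtype=float)
    for _ in range(steps):
        grad = np.zeros_like(params)
        for i in range(len(params)):
            delta = np.zeros_like(params)
            delta[i] = h
            grad[i] = (one_step_error(real_transitions, simulate, params + delta)
                       - one_step_error(real_transitions, simulate, params - delta)) / (2 * h)
        params -= lr * grad
    return params
```

The refined `params` would then be handed to whatever planner or policy-learning method is used in simulation before deployment.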

Stabilizing Contrastive RL: Techniques for Robotic Goal Reaching from Offline Data

1 code implementation • 6 Jun 2023 • Chongyi Zheng, Benjamin Eysenbach, Homer Walke, Patrick Yin, Kuan Fang, Ruslan Salakhutdinov, Sergey Levine

Robotic systems that rely primarily on self-supervised learning have the potential to decrease the amount of human annotation and engineering effort required to learn control strategies.

Contrastive Learning • Data Augmentation • +2
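For readers unfamiliar with the general idea behind contrastive goal-reaching, the following sketch shows one common formulation: a critic is trained to score whether a goal is a future state of a given state-action pair, using an InfoNCE-style loss over a batch. This is a generic illustration under assumed network sizes and names, not the implementation released with the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContrastiveCritic(nn.Module):
    """Scores (state, action, goal) triples; a high score means the goal is
    likely reached in the future of (state, action). Architecture is illustrative."""
    def __init__(self, state_dim, action_dim, goal_dim, repr_dim=64):
        super().__init__()
        self.sa_encoder = nn.Sequential(
            nn.Linear(state_dim + action_dim, 256), nn.ReLU(),
            nn.Linear(256, repr_dim))
        self.g_encoder = nn.Sequential(
            nn.Linear(goal_dim, 256), nn.ReLU(),
            nn.Linear(256, repr_dim))

    def forward(self, state, action, goal):
        phi = self.sa_encoder(torch.cat([state, action], dim=-1))  # (B, d)
        psi = self.g_encoder(goal)                                 # (B, d)
        return phi @ psi.T  # (B, B) pairwise logits

def infonce_loss(critic, state, action, future_state):
    """InfoNCE-style objective: each (s_i, a_i) should score its own observed
    future state higher than the other goals in the batch."""
    logits = critic(state, action, future_state)           # (B, B)
    labels = torch.arange(logits.shape[0], device=logits.device)
    return F.cross_entropy(logits, labels)
```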

Generalization with Lossy Affordances: Leveraging Broad Offline Data for Learning Visuomotor Tasks

no code implementations • 12 Oct 2022 • Kuan Fang, Patrick Yin, Ashvin Nair, Homer Walke, Gengchen Yan, Sergey Levine

The utilization of broad datasets has proven to be crucial for generalization across a wide range of fields.

Planning to Practice: Efficient Online Fine-Tuning by Composing Goals in Latent Space

no code implementations • 17 May 2022 • Kuan Fang, Patrick Yin, Ashvin Nair, Sergey Levine

Our experimental results show that PTP can generate feasible sequences of subgoals that enable the policy to efficiently solve the target tasks.

reinforcement-learning • Reinforcement Learning (RL)
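To give a sense of what "composing goals" for a policy can look like in practice, here is a minimal, hypothetical subgoal-chaining loop: a goal-conditioned policy is pointed at each subgoal in turn until it gets close enough, then moves on. The `env`, `policy`, and distance check are placeholders, not PTP's actual interface.

```python
import numpy as np

def follow_subgoals(env, policy, subgoals, max_steps_per_goal=50, tol=0.05):
    """Execute a goal-conditioned policy against a sequence of subgoals.
    `policy(obs, goal)` returns an action; `env` follows the classic
    reset/step interface. Both are assumed, illustrative components."""
    obs = env.reset()
    for subgoal in subgoals:
        for _ in range(max_steps_per_goal):
            action = policy(obs, subgoal)
            obs, reward, done, info = env.step(action)
            if np.linalg.norm(obs - subgoal) < tol:  # close enough: next subgoal
                break
    return obs
```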

Bisimulation Makes Analogies in Goal-Conditioned Reinforcement Learning

no code implementations • 27 Apr 2022 • Philippe Hansen-Estruch, Amy Zhang, Ashvin Nair, Patrick Yin, Sergey Levine

We learn this representation using a metric form of this abstraction, and show its ability to generalize to new goals in simulated manipulation tasks.

reinforcement-learning • Reinforcement Learning (RL)
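For context, the classical bisimulation metric that such "metric forms" of state abstraction build on can be written as a fixed point; this is the standard formulation, not necessarily the exact goal-conditioned variant used in the paper:

```latex
d(s_i, s_j) \;=\; \max_{a \in \mathcal{A}} \Big( \left| r(s_i, a) - r(s_j, a) \right|
  \;+\; \gamma \, W_1\!\big( P(\cdot \mid s_i, a),\, P(\cdot \mid s_j, a);\, d \big) \Big)
```

Here $W_1(\cdot, \cdot; d)$ is the 1-Wasserstein distance between next-state distributions measured under the metric $d$ itself, and $\gamma$ is the discount factor; states that are close under $d$ receive similar rewards and transition to similar distributions over states.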
