Search Results for author: Kyle Vedder

Found 6 papers, 3 papers with code

Articulate-Anything: Automatic Modeling of Articulated Objects via a Vision-Language Foundation Model

no code implementations · 3 Oct 2024 · Long Le, Jason Xie, William Liang, Hung-Ju Wang, Yue Yang, Yecheng Jason Ma, Kyle Vedder, Arjun Krishna, Dinesh Jayaraman, Eric Eaton

Interactive 3D simulated objects are crucial in AR/VR, animations, and robotics, driving immersive experiences and advanced automation.

Neural Eulerian Scene Flow Fields

no code implementations · 2 Oct 2024 · Kyle Vedder, Neehar Peri, Ishan Khatri, Siyi Li, Eric Eaton, Mehmet Kocamaz, Yue Wang, Zhiding Yu, Deva Ramanan, Joachim Pehserl

We reframe scene flow as the task of estimating a continuous space-time ODE that describes motion for an entire observation sequence, represented with a neural prior.
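A minimal sketch of the idea described above: a small network maps a point and a time to an instantaneous velocity, and integrating that velocity field over the interval yields the flow. This is not the paper's implementation; the random-feature "prior", weight shapes, and Euler step count are all illustrative assumptions.

```python
import numpy as np

# Illustrative only: a tiny random-feature network f(x, t) -> velocity,
# standing in for a learned neural prior, integrated with forward Euler.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 32)) * 0.1   # input: (x, y, z, t)
W2 = rng.normal(size=(32, 3)) * 0.1   # output: velocity (vx, vy, vz)

def velocity(x, t):
    h = np.tanh(np.concatenate([x, [t]]) @ W1)  # hidden features
    return h @ W2                               # instantaneous 3D velocity

def flow(x0, t0, t1, steps=16):
    """Integrate the velocity field from t0 to t1 (forward Euler)."""
    x, dt = x0.astype(float), (t1 - t0) / steps
    for i in range(steps):
        x = x + dt * velocity(x, t0 + i * dt)
    return x - x0  # scene flow = displacement over the interval

print(flow(np.array([1.0, 2.0, 0.5]), 0.0, 1.0))
```

Because the ODE is defined for any (x, t), the same fitted field describes motion at every time in the observation sequence, not just between one pair of frames.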

Autonomous Driving · Point Tracking · +1

I Can't Believe It's Not Scene Flow!

2 code implementations · 7 Mar 2024 · Ishan Khatri, Kyle Vedder, Neehar Peri, Deva Ramanan, James Hays

Current scene flow methods broadly fail to describe motion on small objects, and current scene flow evaluation protocols hide this failure by averaging over many points, with most drawn from larger objects.
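A toy illustration of the averaging problem described above, with entirely made-up numbers: a car contributes thousands of points while a pedestrian contributes a handful, so a points-averaged error looks fine even when motion on the pedestrian is completely missed, whereas averaging per class exposes the failure.

```python
import numpy as np

# Made-up errors: 5 cm per point on a car, a total 1 m failure on a pedestrian.
car_epe = np.full(5000, 0.05)   # 5000 points from one large object
ped_epe = np.full(20, 1.0)      # 20 points from one small object

global_mean = np.mean(np.concatenate([car_epe, ped_epe]))     # points-averaged
per_class_mean = np.mean([car_epe.mean(), ped_epe.mean()])    # class-averaged

print(f"points-averaged EPE: {global_mean:.3f}")   # ~0.054, looks fine
print(f"class-averaged EPE:  {per_class_mean:.3f}")  # 0.525, exposes failure
```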

ZeroFlow: Scalable Scene Flow via Distillation

1 code implementation · 17 May 2023 · Kyle Vedder, Neehar Peri, Nathaniel Chodosh, Ishan Khatri, Eric Eaton, Dinesh Jayaraman, Yang Liu, Deva Ramanan, James Hays

Scene flow estimation is the task of describing the 3D motion field between temporally successive point clouds.
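The task definition above can be sketched in a few lines: scene flow is the per-point 3D displacement between successive point clouds, and end-point error (EPE, the standard metric) is the distance between a predicted and a true displacement. The two points and the static baseline here are illustrative.

```python
import numpy as np

# Two points at time t and their true positions at t+1 (toy values, metres).
pts_t  = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 0.0]])
pts_t1 = np.array([[0.5, 0.0, 0.0], [1.0, 2.0, 0.0]])

gt_flow = pts_t1 - pts_t              # ground-truth 3D motion field
pred_flow = np.zeros_like(gt_flow)    # trivial "everything is static" guess

epe = np.linalg.norm(pred_flow - gt_flow, axis=1)  # per-point error (metres)
print(epe.mean())  # → 0.75 (mean EPE of the static baseline)
```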

Ranked #2 on Self-supervised Scene Flow Estimation on Argoverse 2 (using extra training data)

Self-supervised Scene Flow Estimation

Sparse PointPillars: Maintaining and Exploiting Input Sparsity to Improve Runtime on Embedded Systems

3 code implementations · 12 Jun 2021 · Kyle Vedder, Eric Eaton

Bird's Eye View (BEV) is a popular representation for processing 3D point clouds, and by its nature is fundamentally sparse.
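A rough sketch of the sparsity claim above, under assumed grid dimensions: binning a point cloud into BEV pillars and storing only occupied cells shows that most of a dense grid would be empty. Real LiDAR sweeps are far less uniform than the random points used here, so actual occupancy is typically even lower.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)
points = rng.uniform(0, 100, size=(10_000, 3))  # toy cloud: x, y, z in metres

CELL = 0.25  # 0.25 m pillars over a 100 m x 100 m area -> 160k cells
pillars = defaultdict(list)  # sparse map: only occupied cells are stored
for p in points:
    pillars[(int(p[0] / CELL), int(p[1] / CELL))].append(p)

occupied = len(pillars)
total = int(100 / CELL) ** 2
print(f"occupied pillars: {occupied} / {total} ({occupied / total:.1%})")
```

A dense BEV backbone still convolves over every one of the `total` cells; exploiting the sparse map instead is what lets runtime scale with occupied pillars rather than grid size.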

Birds Eye View Object Detection · Object Detection
