Search Results for author: Katherine J. Kuchenbecker

Found 7 papers, 2 papers with code

Reconstructing Signing Avatars From Video Using Linguistic Priors

no code implementations · CVPR 2023 · Maria-Paola Forte, Peter Kulits, Chun-Hao Huang, Vasileios Choutas, Dimitrios Tzionas, Katherine J. Kuchenbecker, Michael J. Black

A perceptual study shows that SGNify's 3D reconstructions are significantly more comprehensible and natural than those of previous methods and are on par with the source videos.

Multimodal Multi-User Surface Recognition with the Kernel Two-Sample Test

1 code implementation · 8 Mar 2023 · Behnam Khojasteh, Friedrich Solowjow, Sebastian Trimpe, Katherine J. Kuchenbecker

Machine learning and deep learning have been used extensively to classify physical surfaces through images and time-series contact data.

Benchmarking · Time Series · +3
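The entry above applies the kernel two-sample test to surface recognition. As a minimal sketch of that statistical tool (not the authors' implementation), the test compares two sample sets via the maximum mean discrepancy (MMD); the RBF kernel and bandwidth below are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Gaussian RBF kernel matrix between rows of x and rows of y
    d2 = np.sum(x**2, 1)[:, None] + np.sum(y**2, 1)[None, :] - 2.0 * x @ y.T
    return np.exp(-gamma * d2)

def mmd2_unbiased(x, y, gamma=1.0):
    # Unbiased estimate of the squared maximum mean discrepancy:
    # near zero when x and y come from the same distribution,
    # clearly positive when the distributions differ.
    m, n = len(x), len(y)
    kxx = rbf_kernel(x, x, gamma)
    kyy = rbf_kernel(y, y, gamma)
    kxy = rbf_kernel(x, y, gamma)
    np.fill_diagonal(kxx, 0.0)  # drop self-similarity terms (unbiased form)
    np.fill_diagonal(kyy, 0.0)
    return (kxx.sum() / (m * (m - 1))
            + kyy.sum() / (n * (n - 1))
            - 2.0 * kxy.mean())

# Toy stand-ins for contact-signal features from two surfaces (hypothetical data)
rng = np.random.default_rng(0)
same = mmd2_unbiased(rng.normal(0, 1, (200, 3)), rng.normal(0, 1, (200, 3)))
diff = mmd2_unbiased(rng.normal(0, 1, (200, 3)), rng.normal(2, 1, (200, 3)))
print(same, diff)
```

In a surface-recognition setting, a new recording would be attributed to whichever reference surface yields the smallest discrepancy; the paper's multimodal, multi-user pipeline is considerably richer than this sketch.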

Predicting knee adduction moment response to gait retraining with minimal clinical data

1 code implementation · PLOS Computational Biology 2022 · Nataliya Rokhmanova, Katherine J. Kuchenbecker, Peter B. Shull, Reed Ferber, Eni Halilaj

Insights learned from a ground-truth dataset with both baseline and toe-in gait trials (N = 12) enabled the creation of a large (N = 138) synthetic dataset for training the predictive model.

A soft thumb-sized vision-based sensor with accurate all-round force perception

no code implementations · 10 Nov 2021 · Huanbo Sun, Katherine J. Kuchenbecker, Georg Martius

Insight has an overall spatial resolution of 0.4 mm, force magnitude accuracy around 0.03 N, and force direction accuracy around 5 degrees over a range of 0.03--2 N for numerous distinct contacts with varying contact area.

The Six Hug Commandments: Design and Evaluation of a Human-Sized Hugging Robot with Visual and Haptic Perception

no code implementations · 19 Jan 2021 · Alexis E. Block, Sammy Christen, Roger Gassert, Otmar Hilliges, Katherine J. Kuchenbecker

We followed all six tenets to create a new robotic platform, HuggieBot 2.0, that has a soft, warm, inflated body (HuggieChest) and uses visual and haptic sensing to deliver closed-loop hugging.

Robotics

Deep Learning for Tactile Understanding From Visual and Haptic Data

no code implementations · 19 Nov 2015 · Yang Gao, Lisa Anne Hendricks, Katherine J. Kuchenbecker, Trevor Darrell

Robots that interact with the physical world will benefit from a fine-grained tactile understanding of objects and surfaces.
