Search Results for author: Tony Martinez

Found 20 papers, 2 papers with code

Ego2HandsPose: A Dataset for Egocentric Two-hand 3D Global Pose Estimation

no code implementations • 10 Jun 2022 • Fanqing Lin, Tony Martinez

The proposed composition-based data generation technique can create two-hand instances with quality, quantity and diversity that generalize well to unseen domains.

3D Pose Estimation Hand Segmentation

Generalizing Interactive Backpropagating Refinement for Dense Prediction Networks

no code implementations • CVPR 2022 • Fanqing Lin, Brian Price, Tony Martinez

Recently, feature backpropagating refinement scheme (f-BRS) has been proposed for the task of interactive segmentation, which enables efficient optimization of a small set of auxiliary variables inserted into the pretrained network to produce object segmentation that better aligns with user inputs.

Image Matting Interactive Segmentation +2

TRACE: A Differentiable Approach to Line-level Stroke Recovery for Offline Handwritten Text

no code implementations • 24 May 2021 • Taylor Archibald, Mason Poggemann, Aaron Chan, Tony Martinez

We demonstrate that temporal stroke information recovered by TRACE from offline data can be used for handwriting synthesis and establish the first benchmarks for a stroke trajectory recovery system trained on the IAM online handwriting dataset.

Dynamic Time Warping Handwriting Recognition

Ego2Hands: A Dataset for Egocentric Two-hand Segmentation and Detection

1 code implementation • 14 Nov 2020 • Fanqing Lin, Tony Martinez

Hand segmentation and detection in truly unconstrained RGB-based settings is important for many applications.

Domain Adaptation Hand Segmentation

Two-hand Global 3D Pose Estimation Using Monocular RGB

no code implementations • 1 Jun 2020 • Fanqing Lin, Connor Wilhelm, Tony Martinez

We tackle the challenging task of estimating global 3D joint locations for both hands via only monocular RGB input images.

3D Canonical Hand Pose Estimation 3D Pose Estimation

Language Model Supervision for Handwriting Recognition Model Adaptation

no code implementations • 4 Aug 2018 • Chris Tensmeyer, Curtis Wigington, Brian Davis, Seth Stewart, Tony Martinez, William Barrett

Training state-of-the-art offline handwriting recognition (HWR) models requires large labeled datasets, but unfortunately such datasets are not available in all languages and domains due to the high cost of manual labeling. We address this problem by showing how high resource languages can be leveraged to help train models for low resource languages. We propose a transfer learning methodology where we adapt HWR models trained on a source language to a target language that uses the same writing script. This methodology only requires labeled data in the source language, unlabeled data in the target language, and a language model of the target language.

Handwriting Recognition Language Modelling +1
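The adaptation recipe above lends itself to a toy sketch. The following is a minimal, hypothetical illustration (not the authors' code; all names are invented) of one ingredient of such a pipeline: scoring a source-trained recognizer's hypotheses on unlabeled target-language data with a character bigram language model, and keeping only plausible transcriptions as pseudo-labels.

```python
import math
from collections import Counter

def train_char_bigram_lm(corpus, alpha=1.0):
    """Add-alpha smoothed character bigram LM trained on target-language text."""
    bigrams, unigrams = Counter(), Counter()
    for text in corpus:
        padded = "^" + text + "$"           # start/end sentinels
        for a, b in zip(padded, padded[1:]):
            bigrams[(a, b)] += 1
            unigrams[a] += 1
    vocab_size = len(set(unigrams) | {"$"})

    def logprob(text):
        """Length-normalised log-probability of a candidate transcription."""
        padded = "^" + text + "$"
        lp = sum(math.log((bigrams[(a, b)] + alpha) /
                          (unigrams[a] + alpha * vocab_size))
                 for a, b in zip(padded, padded[1:]))
        return lp / max(len(padded) - 1, 1)

    return logprob

def select_pseudo_labels(hypotheses, logprob, threshold):
    """Keep (image_id, transcript) pairs whose LM score clears the threshold."""
    return [(img, hyp) for img, hyp in hypotheses if logprob(hyp) >= threshold]

# Tiny target-language corpus and two recognizer hypotheses for unlabeled images.
lm = train_char_bigram_lm(["the cat sat", "the hat", "a cat"])
hyps = [("img1", "the cat"), ("img2", "qzxv")]
kept = select_pseudo_labels(hyps, lm, threshold=-2.0)  # drops the gibberish guess
```

The threshold here is an arbitrary toy value; in practice it would be tuned on held-out source-language data.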

Convolutional Neural Networks for Font Classification

no code implementations • 11 Aug 2017 • Chris Tensmeyer, Daniel Saunders, Tony Martinez

This same method also achieves the highest reported accuracy of 86.6% in predicting paleographic scribal script classes at the page level on medieval Latin manuscripts.

Classification Data Augmentation +2

Document Image Binarization with Fully Convolutional Neural Networks

no code implementations • 10 Aug 2017 • Chris Tensmeyer, Tony Martinez

Binarization of degraded historical manuscript images is an important pre-processing step for many document processing tasks.

Binarization General Classification

A Hierarchical Multi-Output Nearest Neighbor Model for Multi-Output Dependence Learning

no code implementations • 17 Oct 2014 • Richard G. Morris, Tony Martinez, Michael R. Smith

Multi-Output Dependence (MOD) learning is a generalization of standard classification problems that allows for multiple outputs that are dependent on each other.

General Classification

Recommending Learning Algorithms and Their Associated Hyperparameters

no code implementations • 7 Jul 2014 • Michael R. Smith, Logan Mitchell, Christophe Giraud-Carrier, Tony Martinez

The success of machine learning on a given task depends on, among other things, the learning algorithm selected and its associated hyperparameters.

Collaborative Filtering Meta-Learning

A Hybrid Latent Variable Neural Network Model for Item Recommendation

no code implementations • 9 Jun 2014 • Michael R. Smith, Tony Martinez, Michael Gashler

Collaborative filtering is used to recommend items to a user without requiring knowledge of the item itself, and tends to outperform other techniques.

Collaborative Filtering

Reducing the Effects of Detrimental Instances

no code implementations • 9 Jun 2014 • Michael R. Smith, Tony Martinez

We examine RIDL on a set of 54 data sets and 5 learning algorithms, and compare it with other weighting and filtering approaches.

The Potential Benefits of Filtering Versus Hyper-Parameter Optimization

no code implementations • 13 Mar 2014 • Michael R. Smith, Tony Martinez, Christophe Giraud-Carrier

We find that, while both significantly improve the induced model, improving the quality of the training set has a greater potential effect than hyper-parameter optimization.

Becoming More Robust to Label Noise with Classifier Diversity

no code implementations • 7 Mar 2014 • Michael R. Smith, Tony Martinez

We compare NICD with several other noise handling techniques that do not consider classifier diversity on a set of 54 data sets and 5 learning algorithms.

Missing Value Imputation With Unsupervised Backpropagation

no code implementations • 19 Dec 2013 • Michael S. Gashler, Michael R. Smith, Richard Morris, Tony Martinez

We evaluate UBP with the task of imputing missing values in datasets, and show that UBP is able to predict missing values with significantly lower sum-squared error than other collaborative filtering and imputation techniques.

Collaborative Filtering General Classification +1
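The UBP idea above can be illustrated with a linear toy. The paper trains a multilayer perceptron; this sketch (with invented names) uses the rank-r linear special case: both the latent row vectors and the decoder weights are learned by gradient descent on the squared error over observed entries only, and the trained model fills in the missing cells.

```python
import numpy as np

def ubp_linear_impute(X, rank=2, lr=0.01, epochs=3000, seed=0):
    """Impute NaNs in X by jointly learning latent row vectors V and a linear
    decoder W so that V @ W reconstructs the observed entries (a linear
    special case of unsupervised backpropagation)."""
    rng = np.random.default_rng(seed)
    mask = ~np.isnan(X)
    n, d = X.shape
    V = rng.normal(scale=0.1, size=(n, rank))   # learned "inputs"
    W = rng.normal(scale=0.1, size=(rank, d))   # learned decoder weights
    target = np.where(mask, X, 0.0)
    for _ in range(epochs):
        pred = V @ W
        err = np.where(mask, pred - target, 0.0)  # error only on observed cells
        V, W = V - lr * (err @ W.T), W - lr * (V.T @ err)
    return np.where(mask, X, V @ W)             # keep observed, fill missing

# Toy rank-1 matrix with two missing values.
X = np.array([[1.0, 2.0, np.nan],
              [2.0, 4.0, 6.0],
              [3.0, np.nan, 9.0]])
filled = ubp_linear_impute(X, rank=1)
```

On this rank-1 toy the recovered entries land close to the values 3 and 6 implied by the observed rows and columns; replacing the linear decoder with a small nonlinear network is the step toward UBP proper.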

An Extensive Evaluation of Filtering Misclassified Instances in Supervised Classification Tasks

no code implementations • 13 Dec 2013 • Michael R. Smith, Tony Martinez

Additionally, we find that a majority voting ensemble is robust to noise as filtering with a voting ensemble does not increase the classification accuracy of the voting ensemble.

General Classification
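The filtering mechanism evaluated above can be sketched in a few lines. This is a minimal, hypothetical illustration (not the authors' code; the paper uses ensembles of diverse learning algorithms, whereas this toy uses k-NN members with different k): an instance is flagged as noise, and removed, when a majority of ensemble members misclassify it under leave-one-out prediction.

```python
import numpy as np

def knn_loo_predict(X, y, k):
    """Leave-one-out k-NN predictions (each point excluded from its own vote)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)          # a point may not vote for itself
    preds = np.empty(len(y), dtype=y.dtype)
    for i in range(len(y)):
        nn = np.argsort(d[i])[:k]
        vals, counts = np.unique(y[nn], return_counts=True)
        preds[i] = vals[np.argmax(counts)]
    return preds

def voting_filter(X, y, ks=(1, 3)):
    """Keep an instance only if no majority of the ensemble misclassifies it."""
    votes = np.stack([knn_loo_predict(X, y, k) != y for k in ks])
    noisy = votes.sum(axis=0) > len(ks) / 2
    return ~noisy                        # boolean mask of instances to keep

# Two tight clusters plus one mislabeled point.
X = np.array([[0.0, 0], [0.1, 0], [0.2, 0], [5.0, 5], [5.1, 5], [5.2, 5]])
y = np.array([0, 0, 1, 1, 1, 1])        # third point's label is flipped
keep = voting_filter(X, y)               # flags only the mislabeled instance
```

Whether removing the flagged instances actually helps a downstream learner is exactly the question the paper studies empirically.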

Quantum Associative Memory

2 code implementations • 18 Jul 1998 • Dan Ventura, Tony Martinez

The unique characteristics of quantum theory may also be used to create a quantum associative memory with a capacity exponential in the number of neurons.

Quantum Physics
