Search Results for author: Tyler M. Tomita

Found 5 papers, 3 papers with code

Deep Discriminative to Kernel Density Networks for Calibrated Inference

1 code implementation • 31 Jan 2022 • Jayanta Dey, Will LeVine, Haoyin Xu, Ashwin De Silva, Tyler M. Tomita, Ali Geisa, Tiffany Chu, Jacob Desman, Joshua T. Vogelstein

In this paper, we leverage the fact that deep models, including both random forests and deep nets, learn internal representations that are unions of polytopes with affine activation functions, and use this to conceptualize both as partitioning rules over the feature space.

Out-of-Distribution Detection • regression
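The excerpt's framing, that a ReLU-style network partitions feature space into polytopes much as a forest's leaves do, can be illustrated with a toy sketch. The function below is a hypothetical illustration (not the paper's kernel-density construction): each hidden unit's on/off state defines a half-space, and the pattern of states identifies the polytope containing a point.

```python
import numpy as np

def activation_pattern(W, b, x):
    """Identify which polytope (linear region) of a one-hidden-layer
    ReLU net contains the point x.

    Each hidden unit w_i . x + b_i > 0 defines a half-space; the tuple
    of on/off states names the polytope, so the net acts as a
    partitioning rule over feature space, analogous to a forest's
    leaves. Toy illustration only.
    """
    return tuple((W @ np.asarray(x, dtype=float) + b > 0).astype(int))
```

Two points with the same activation pattern lie in the same polytope, so the network restricted to that region is a single affine map.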

Robust Similarity and Distance Learning via Decision Forests

no code implementations • 27 Jul 2020 • Tyler M. Tomita, Joshua T. Vogelstein

Many algorithms have been proposed for automated learning of suitable distances, most of which employ linear methods to learn a global metric over the feature space.

regression
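The "global metric" baseline the excerpt contrasts against can be made concrete with a short sketch. The function below (illustrative names, not the paper's API) builds a Mahalanobis-style distance from a single linear map W applied everywhere in feature space, which is exactly the family of linear metrics the forest-based approach aims to generalize.

```python
import numpy as np

def mahalanobis_metric(W):
    """Return the global distance induced by a linear map W:

        d(x, y) = ||W x - W y||_2 = sqrt((x - y)^T M (x - y)),  M = W^T W

    One transform is applied uniformly across the whole feature space,
    in contrast to locally adaptive, forest-based similarities.
    """
    M = W.T @ W
    def dist(x, y):
        diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
        return float(np.sqrt(diff @ M @ diff))
    return dist
```

With W the identity this reduces to plain Euclidean distance; a non-identity W reweights and mixes feature dimensions, but still does so identically at every point.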

Manifold Oblique Random Forests: Towards Closing the Gap on Convolutional Deep Networks

1 code implementation • 25 Sep 2019 • Adam Li, Ronan Perry, Chester Huynh, Tyler M. Tomita, Ronak Mehta, Jesus Arroyo, Jesse Patsolic, Benjamin Falk, Joshua T. Vogelstein

In particular, forests dominate other methods on tabular data, that is, when the feature space is unstructured and the signal is invariant to permuting the feature indices.

EEG • Electroencephalogram (EEG) +2

Manifold Forests: Closing the Gap on Neural Networks

no code implementations • 25 Sep 2019 • Ronan Perry, Tyler M. Tomita, Jesse Patsolic, Benjamin Falk, Joshua Vogelstein

In particular, DFs dominate other methods on tabular data, that is, when the feature space is unstructured and the signal is invariant to permuting the feature indices.

Image Classification • Time Series Analysis

Sparse Projection Oblique Randomer Forests

2 code implementations • 10 Jun 2015 • Tyler M. Tomita, James Browne, Cencheng Shen, Jaewon Chung, Jesse L. Patsolic, Benjamin Falk, Jason Yim, Carey E. Priebe, Randal Burns, Mauro Maggioni, Joshua T. Vogelstein

Unfortunately, these extensions forfeit one or more of the favorable properties of decision forests based on axis-aligned splits, such as robustness to many noise dimensions, interpretability, or computational efficiency.

Computational Efficiency
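The core idea behind sparse projection oblique splits, retaining much of the efficiency of axis-aligned trees while allowing oblique decision boundaries, can be sketched in a few lines. This is a hypothetical toy implementation under assumed names and parameters (`n_proj`, `density`), not the paper's or the rerf library's actual API: candidate split directions are sparse vectors with nonzero entries in {-1, +1}, and the best one is chosen by weighted Gini impurity.

```python
import numpy as np

def sparse_projections(n_features, n_proj, density, rng):
    """Candidate split directions: sparse vectors with entries in {-1, 0, +1}.

    Splitting on sparse random linear combinations of features, rather
    than single features, is the key idea sketched here (illustrative
    only; parameters and names are assumptions, not the paper's API).
    """
    mask = rng.random((n_proj, n_features)) < density
    signs = rng.choice([-1.0, 1.0], size=(n_proj, n_features))
    return mask * signs

def gini(y):
    """Gini impurity of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - float(np.sum(p ** 2))

def best_sparse_oblique_split(X, y, n_proj=20, density=0.5, seed=0):
    """Search candidate sparse projections for the lowest weighted-Gini split."""
    rng = np.random.default_rng(seed)
    A = sparse_projections(X.shape[1], n_proj, density, rng)
    best_score, best_a, best_t = np.inf, None, None
    for a in A:
        z = X @ a  # project samples onto the candidate direction
        for t in np.unique(z)[:-1]:  # thresholds keeping both sides nonempty
            left, right = y[z <= t], y[z > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if score < best_score:
                best_score, best_a, best_t = score, a, t
    return best_score, best_a, best_t
```

Because each candidate direction has only a few nonzero entries, projecting and thresholding stays cheap and the chosen split remains interpretable as a small signed combination of features, the properties the abstract says fully dense oblique extensions forfeit.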
