Search Results for author: Thomas Fu

Found 4 papers, 1 paper with code

Subquadratic Kronecker Regression with Applications to Tensor Decomposition

1 code implementation • 11 Sep 2022 • Matthew Fahrbach, Thomas Fu, Mehrdad Ghadiri

By extending our approach to block-design matrices where one block is a Kronecker product, we also achieve subquadratic-time algorithms for (1) Kronecker ridge regression and (2) updating the factor matrices of a Tucker decomposition in ALS, which is not a pure Kronecker regression problem, thereby improving the running time of all steps of Tucker ALS.

regression • Tensor Decomposition
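
As a point of reference for the Kronecker-structured least-squares problems the abstract above refers to, here is a minimal sketch of Kronecker ridge regression via the normal equations, assuming the standard identity $(A \otimes B)^\top (A \otimes B) = (A^\top A) \otimes (B^\top B)$. The function name `kron_ridge_solve` is illustrative, and this is not the paper's subquadratic algorithm: it avoids materializing the Kronecker product but uses none of the paper's sampling machinery.

```python
import numpy as np

def kron_ridge_solve(A, B, Y, lam=1e-3):
    """Solve min_X ||(A kron B) vec(X) - vec(Y)||^2 + lam * ||X||_F^2
    without ever forming the Kronecker product explicitly."""
    # Gram matrices of the two factors.
    GA, GB = A.T @ A, B.T @ B
    # Their eigendecompositions jointly diagonalize (GA kron GB + lam * I).
    sA, UA = np.linalg.eigh(GA)
    sB, UB = np.linalg.eigh(GB)
    # Right-hand side of the normal equations, vec(B^T Y A), kept as a matrix.
    C = B.T @ Y @ A
    # Rotate into the eigenbasis, divide by the eigenvalues, rotate back.
    M = UB.T @ C @ UA
    M = M / (np.outer(sB, sA) + lam)
    return UB @ M @ UA.T

# Sanity check against the explicit Kronecker system on a tiny instance.
rng = np.random.default_rng(0)
A, B = rng.normal(size=(8, 3)), rng.normal(size=(6, 4))
Y = rng.normal(size=(6, 8))
X = kron_ridge_solve(A, B, Y, lam=0.1)
K = np.kron(A, B)
x_ref = np.linalg.solve(K.T @ K + 0.1 * np.eye(K.shape[1]),
                        K.T @ Y.flatten(order="F"))
assert np.allclose(X.flatten(order="F"), x_ref)
```

The eigendecomposition trick makes the cost depend on the factor dimensions rather than on the full Kronecker dimension, which is the same structure a Tucker ALS factor update exploits.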

Fast Low-Rank Tensor Decomposition by Ridge Leverage Score Sampling

no code implementations • 22 Jul 2021 • Matthew Fahrbach, Mehrdad Ghadiri, Thomas Fu

Low-rank tensor decomposition generalizes low-rank matrix approximation and is a powerful technique for discovering low-dimensional structure in high-dimensional data.

regression • Tensor Decomposition
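
The sampling technique named in the title can be illustrated, in a hedged way, on an ordinary matrix: the ridge leverage score of row $a_i$ is $\tau_i = a_i^\top (A^\top A + \lambda I)^{-1} a_i$, and sampling rows proportionally to these scores (with rescaling) approximately preserves the Gram matrix. The sketch below covers only this standard matrix case, not the paper's tensor decomposition algorithm; `leverage_sample` is an illustrative name.

```python
import numpy as np

def ridge_leverage_scores(A, lam):
    """Classical ridge leverage scores: tau_i = a_i^T (A^T A + lam*I)^{-1} a_i."""
    G_inv = np.linalg.inv(A.T @ A + lam * np.eye(A.shape[1]))
    return np.einsum("ij,jk,ik->i", A, G_inv, A)

def leverage_sample(A, lam, k, seed=0):
    """Sample k rows with probability proportional to their ridge leverage
    scores, rescaled so the sketched Gram matrix is unbiased in expectation."""
    rng = np.random.default_rng(seed)
    tau = ridge_leverage_scores(A, lam)
    p = tau / tau.sum()
    idx = rng.choice(A.shape[0], size=k, p=p)
    return A[idx] / np.sqrt(k * p[idx, None])

rng = np.random.default_rng(1)
A = rng.normal(size=(5000, 20))
S = leverage_sample(A, lam=1.0, k=400)
# Relative error of the sketched Gram matrix against the full one.
print(np.linalg.norm(S.T @ S - A.T @ A) / np.linalg.norm(A.T @ A))
```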

Locality-Sensitive Hashing for f-Divergences: Mutual Information Loss and Beyond

no code implementations • NeurIPS 2019 • Lin Chen, Hossein Esfandiari, Thomas Fu, Vahab S. Mirrokni

In this paper, we aim to develop LSH schemes for distance functions that measure the distance between two probability distributions, particularly for f-divergences as well as a generalization to capture mutual information loss.

Model Compression
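
To make the setting above concrete, here is a toy sketch of locality-sensitive hashing for probability distributions under the Hellinger distance, whose square is an f-divergence: since Hellinger distance is, up to a constant, the Euclidean distance between square-rooted distributions, standard p-stable Euclidean LSH applies after that feature map. This only illustrates the general idea and is not one of the schemes developed in the paper; the class name is hypothetical.

```python
import numpy as np

class SqrtEuclideanLSH:
    """Toy LSH for distributions under Hellinger distance: map P -> sqrt(P),
    which turns Hellinger distance into (scaled) Euclidean distance, then
    apply standard p-stable Euclidean LSH to the mapped vectors."""

    def __init__(self, dim, n_hashes=8, bucket_width=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.a = rng.normal(size=(n_hashes, dim))              # Gaussian projections
        self.b = rng.uniform(0, bucket_width, size=n_hashes)   # random offsets
        self.w = bucket_width

    def hash(self, p):
        x = np.sqrt(np.asarray(p, dtype=float))                # sqrt feature map
        return tuple(np.floor((self.a @ x + self.b) / self.w).astype(int))

# Two nearby distributions and one distant distribution over 4 outcomes.
lsh = SqrtEuclideanLSH(dim=4)
print(lsh.hash([0.25, 0.25, 0.25, 0.25]))
print(lsh.hash([0.24, 0.26, 0.25, 0.25]))   # likely the same bucket tuple
print(lsh.hash([0.97, 0.01, 0.01, 0.01]))   # likely a different bucket tuple
```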

Categorical Feature Compression via Submodular Optimization

no code implementations • 30 Apr 2019 • Mohammadhossein Bateni, Lin Chen, Hossein Esfandiari, Thomas Fu, Vahab S. Mirrokni, Afshin Rostamizadeh

To achieve this, we introduce a novel re-parametrization of the mutual information objective, which we prove is submodular, and design a data structure to query the submodular function in amortized $O(\log n)$ time (where $n$ is the input vocabulary size).

Feature Compression
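
A hedged sketch of the overall recipe the abstract describes: greedily choose which vocabulary values keep their own bucket (the rest are merged into a catch-all) so that the compressed feature retains as much mutual information with the label as possible. The code below uses a naive greedy loop on the raw mutual-information objective, not the paper's submodular re-parametrization or its amortized $O(\log n)$ data structure; `greedy_compress` and the toy counts are illustrative.

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) from a joint count table (rows: feature buckets, cols: labels)."""
    p = joint / joint.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

def greedy_compress(counts, k):
    """Greedily keep k vocabulary values as singleton buckets, merging the
    remaining values into one catch-all bucket, heuristically maximizing the
    mutual information between the compressed feature and the label."""
    n = counts.shape[0]
    kept = []
    for _ in range(k):
        best, best_mi = None, -np.inf
        for v in range(n):
            if v in kept:
                continue
            buckets = kept + [v]
            rest = counts[[u for u in range(n) if u not in buckets]].sum(axis=0)
            mi = mutual_information(np.vstack([counts[buckets], rest]))
            if mi > best_mi:
                best, best_mi = v, mi
        kept.append(best)
    return kept

# Toy example: 6 vocabulary values, binary label.
counts = np.array([[90, 10], [80, 20], [10, 90], [20, 80], [50, 50], [49, 51]])
print(greedy_compress(counts, k=2))  # indices of the values kept as singletons
```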
