Search Results for author: Matthew Fahrbach

Found 12 papers, 5 papers with code

Greedy PIG: Adaptive Integrated Gradients

no code implementations • 10 Nov 2023 • Kyriakos Axiotis, Sami Abu-El-Haija, Lin Chen, Matthew Fahrbach, Gang Fu

We demonstrate the success of Greedy PIG on a wide variety of tasks, including image feature attribution, graph compression/explanation, and post-hoc feature selection on tabular data.

feature selection
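
For context, a minimal sketch of vanilla integrated gradients, the attribution primitive that Greedy PIG adapts; the toy model, weights, and step count below are illustrative assumptions, not taken from the paper.

```python
# Plain integrated gradients (not the adaptive Greedy PIG procedure):
# IG_i(x) = (x_i - x'_i) * integral_0^1 dF/dx_i(x' + a(x - x')) da,
# approximated here with a midpoint Riemann sum.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def integrated_gradients(x, baseline, grad_fn, steps=64):
    alphas = (np.arange(steps) + 0.5) / steps            # midpoint rule on [0, 1]
    path = baseline + alphas[:, None] * (x - baseline)   # straight-line path
    grads = np.stack([grad_fn(p) for p in path])         # gradient at each path point
    return (x - baseline) * grads.mean(axis=0)

# Toy model f(x) = sigmoid(w . x), whose gradient is f(x) * (1 - f(x)) * w.
w = np.array([2.0, -1.0, 0.5])
grad_fn = lambda x: sigmoid(w @ x) * (1 - sigmoid(w @ x)) * w

x = np.array([1.0, 2.0, -1.0])
baseline = np.zeros_like(x)
attributions = integrated_gradients(x, baseline, grad_fn)
print(attributions)                                      # per-feature attributions
print(attributions.sum(), sigmoid(w @ x) - sigmoid(w @ baseline))  # completeness check
```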

Pipeline Parallelism for DNN Inference with Practical Performance Guarantees

no code implementations • 7 Nov 2023 • Aaron Archer, Matthew Fahrbach, Kuikui Liu, Prakash Prabhu

We optimize pipeline parallelism for deep neural network (DNN) inference by partitioning model graphs into $k$ stages and minimizing the running time of the bottleneck stage, including communication.
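
A minimal sketch of the bottleneck objective for the special case where the model is a simple chain of layers: binary-search the bottleneck value and greedily check whether at most $k$ contiguous stages suffice. The layer costs are made up, and communication costs and general model graphs, which the paper handles, are ignored here.

```python
# Textbook chain partitioning: split per-layer costs into at most k contiguous
# stages so the most expensive stage (the bottleneck) is as cheap as possible.
def min_bottleneck_partition(costs, k):
    def feasible(cap):
        stages, cur = 1, 0.0
        for c in costs:
            if c > cap:
                return False          # a single layer already exceeds the cap
            if cur + c > cap:         # close the current stage, open a new one
                stages += 1
                cur = 0.0
            cur += c
        return stages <= k

    lo, hi = max(costs), float(sum(costs))   # bottleneck lies in [max, sum]
    while hi - lo > 1e-6:                    # binary search on the bottleneck value
        mid = (lo + hi) / 2
        lo, hi = (lo, mid) if feasible(mid) else (mid, hi)
    return hi

layer_costs = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0]   # illustrative per-layer times
print(min_bottleneck_partition(layer_costs, k=3))    # ~11.0: stages (3,1,4,1), (5), (9,2)
```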

Learning Rate Schedules in the Presence of Distribution Shift

1 code implementation • 27 Mar 2023 • Matthew Fahrbach, Adel Javanmard, Vahab Mirrokni, Pratik Worah

We design learning rate schedules that minimize regret for SGD-based online learning in the presence of a changing data distribution.

regression
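
Purely as an illustration of the setting (not the schedules derived in the paper), a sketch of online SGD on a slowly drifting least-squares stream with a hypothetical step size that decays like $1/\sqrt{t}$ but stays bounded away from zero, since a fully decaying rate cannot keep tracking a shifting distribution:

```python
# Online SGD under distribution shift; the drift model and the floored
# 1/sqrt(t) schedule are placeholder assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
d, T = 5, 2000
w_true = rng.normal(size=d)       # drifting ground-truth regressor
w = np.zeros(d)                   # SGD iterate
eta0, eta_min = 0.5, 0.02

for t in range(1, T + 1):
    w_true += 0.01 * rng.normal(size=d)       # slow distribution shift
    x = rng.normal(size=d)
    y = x @ w_true + 0.1 * rng.normal()
    grad = (x @ w - y) * x                    # gradient of the squared loss
    eta = max(eta0 / np.sqrt(t), eta_min)     # decay, but never all the way to zero
    w -= eta * grad

print(np.linalg.norm(w - w_true))             # tracking error at time T
```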

Approximately Optimal Core Shapes for Tensor Decompositions

no code implementations • 8 Feb 2023 • Mehrdad Ghadiri, Matthew Fahrbach, Gang Fu, Vahab Mirrokni

This work studies the combinatorial optimization problem of finding an optimal core tensor shape, also called multilinear rank, for a size-constrained Tucker decomposition.

Combinatorial Optimization
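
To make the objective concrete, a naive greedy heuristic (not the paper's approximation algorithm): grow the multilinear rank one mode at a time, always taking the largest next singular value of a mode unfolding, as long as the core tensor still fits the size budget.

```python
# Greedy core-shape heuristic for a size-constrained Tucker decomposition.
# The tensor, budget, and greedy rule are illustrative assumptions.
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def greedy_core_shape(T, budget):
    # Singular values of each mode-n unfolding (the HOSVD spectra).
    spectra = [np.linalg.svd(unfold(T, n), compute_uv=False) for n in range(T.ndim)]
    shape = [1] * T.ndim
    while True:
        best_mode, best_gain = None, 0.0
        for n in range(T.ndim):
            r = shape[n]
            if r >= T.shape[n]:
                continue
            new_size = np.prod(shape) // shape[n] * (r + 1)   # core size if mode n grows
            if new_size <= budget and spectra[n][r] > best_gain:
                best_mode, best_gain = n, spectra[n][r]
        if best_mode is None:
            return tuple(shape)
        shape[best_mode] += 1

rng = np.random.default_rng(0)
T = rng.normal(size=(8, 10, 12))
print(greedy_core_shape(T, budget=64))   # a core shape with at most 64 entries
```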

Sequential Attention for Feature Selection

1 code implementation • 29 Sep 2022 • Taisuke Yasuda, Mohammadhossein Bateni, Lin Chen, Matthew Fahrbach, Gang Fu, Vahab Mirrokni

Feature selection is the problem of selecting a subset of features for a machine learning model that maximizes model quality subject to a budget constraint.

Feature Importance • feature selection
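
As a point of reference, a sketch of the classical greedy forward-selection baseline for this budgeted problem, which retrains a small model per candidate feature at every step; Sequential Attention is designed to approximate this kind of greedy selection without the per-candidate retraining. The least-squares scoring and synthetic data are illustrative assumptions.

```python
# Greedy forward selection: add the feature that most reduces the residual
# of a least-squares fit until k features are chosen.
import numpy as np

def forward_selection(X, y, k):
    n, d = X.shape
    selected = []
    for _ in range(k):
        best_j, best_err = None, np.inf
        for j in range(d):
            if j in selected:
                continue
            cols = selected + [j]
            coef, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            err = np.linalg.norm(X[:, cols] @ coef - y) ** 2
            if err < best_err:
                best_j, best_err = j, err
        selected.append(best_j)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 3 * X[:, 2] - 2 * X[:, 7] + 0.1 * rng.normal(size=200)
print(forward_selection(X, y, k=2))    # should recover features 2 and 7
```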

Subquadratic Kronecker Regression with Applications to Tensor Decomposition

1 code implementation • 11 Sep 2022 • Matthew Fahrbach, Thomas Fu, Mehrdad Ghadiri

By extending our approach to block-design matrices where one block is a Kronecker product, we also achieve subquadratic-time algorithms for (1) Kronecker ridge regression and (2) updating the factor matrices of a Tucker decomposition in ALS, which is not a pure Kronecker regression problem, thereby improving the running time of all steps of Tucker ALS.

regression • Tensor Decomposition
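
A small sketch of the structure being exploited, not the paper's subquadratic algorithm: a Kronecker least-squares problem collapses to two small factor solves through the identity $(A \otimes B)\,\mathrm{vec}(X) = \mathrm{vec}(B X A^\top)$, so the full Kronecker matrix never needs to be materialized. The matrices below are random placeholders.

```python
# Kronecker-structured least squares: naive dense solve vs. factored solve.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(6, 3))     # tall factor matrices (overdetermined system)
B = rng.normal(size=(5, 2))
Y = rng.normal(size=(5, 6))     # responses arranged as an n x m matrix

# Naive: materialize the Kronecker product and solve the big least-squares problem.
K = np.kron(A, B)                                          # (6*5) x (3*2)
x_naive, *_ = np.linalg.lstsq(K, Y.flatten(order='F'), rcond=None)

# Factored: (A (x) B)^+ = A^+ (x) B^+, so the minimizer is X* = B^+ Y (A^+)^T.
X = np.linalg.pinv(B) @ Y @ np.linalg.pinv(A).T
print(np.allclose(x_naive, X.flatten(order='F')))          # True
```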

Fast Low-Rank Tensor Decomposition by Ridge Leverage Score Sampling

no code implementations • 22 Jul 2021 • Matthew Fahrbach, Mehrdad Ghadiri, Thomas Fu

Low-rank tensor decomposition generalizes low-rank matrix approximation and is a powerful technique for discovering low-dimensional structure in high-dimensional data.

regression • Tensor Decomposition
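
The snippet above does not spell out the sampling scheme, but for background, the ridge leverage score of row $a_i$ of a matrix $A$ is commonly defined as $\tau_i(\lambda) = a_i^\top (A^\top A + \lambda I)^{-1} a_i$. A minimal dense sketch of computing these scores and sampling rows proportionally (the matrix, $\lambda$, and sketch size are arbitrary assumptions):

```python
# Dense ridge leverage scores and row sampling; the paper's contribution is a
# fast approximate version of this idea tailored to tensor decomposition.
import numpy as np

def ridge_leverage_scores(A, lam):
    # tau_i = a_i^T (A^T A + lam I)^{-1} a_i for each row a_i of A.
    G = np.linalg.inv(A.T @ A + lam * np.eye(A.shape[1]))
    return np.einsum('ij,jk,ik->i', A, G, A)

rng = np.random.default_rng(0)
A = rng.normal(size=(1000, 20))
tau = ridge_leverage_scores(A, lam=1.0)

# Sample s rows with probability proportional to the scores and reweight so
# the sketch is an unbiased estimator of A^T A.
s = 100
p = tau / tau.sum()
idx = rng.choice(len(tau), size=s, p=p)
A_sketch = A[idx] / np.sqrt(s * p[idx, None])
print(np.linalg.norm(A_sketch.T @ A_sketch - A.T @ A) / np.linalg.norm(A.T @ A))
```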

Faster Graph Embeddings via Coarsening

1 code implementation • ICML 2020 • Matthew Fahrbach, Gramoz Goranci, Richard Peng, Sushant Sachdeva, Chi Wang

As computing Schur complements is expensive, we give a nearly-linear time algorithm that generates a coarsened graph on the relevant vertices that provably matches the Schur complement in expectation in each iteration.

Link Prediction • Node Classification
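
For reference, the exact (and expensive) object that the coarsened graph matches in expectation: the Schur complement of the graph Laplacian onto the kept vertices. The dense computation below is the baseline the paper's nearly-linear time algorithm avoids; the toy weighted graph is made up.

```python
# Exact Schur complement of a graph Laplacian onto a subset of vertices.
import numpy as np

def laplacian(adj):
    return np.diag(adj.sum(axis=1)) - adj

def schur_complement(L, keep):
    keep = np.asarray(keep)
    elim = np.setdiff1d(np.arange(L.shape[0]), keep)   # vertices to eliminate
    L_kk = L[np.ix_(keep, keep)]
    L_ke = L[np.ix_(keep, elim)]
    L_ee = L[np.ix_(elim, elim)]
    return L_kk - L_ke @ np.linalg.solve(L_ee, L_ke.T)

# Weighted graph on 5 vertices; keep {0, 1, 2} and eliminate {3, 4}.
adj = np.array([[0, 2, 0, 1, 0],
                [2, 0, 3, 0, 0],
                [0, 3, 0, 0, 4],
                [1, 0, 0, 0, 5],
                [0, 0, 4, 5, 0]], dtype=float)
S = schur_complement(laplacian(adj), keep=[0, 1, 2])
print(S)   # the Laplacian of a smaller weighted graph on the kept vertices
```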

Edge-Weighted Online Bipartite Matching

2 code implementations • 5 May 2020 • Matthew Fahrbach, Zhiyi Huang, Runzhou Tao, Morteza Zadimoghaddam

Online bipartite matching and its variants are among the most fundamental problems in the online algorithms literature.

Data Structures and Algorithms • Computer Science and Game Theory

Slow Mixing of Glauber Dynamics for the Six-Vertex Model in the Ordered Phases

no code implementations • 2 Apr 2019 • Matthew Fahrbach, Dana Randall

We analyze the Glauber dynamics for the six-vertex model with free boundary conditions in the antiferroelectric phase and significantly extend the region for which local Markov chains are known to be slow mixing.

Data Structures and Algorithms • Mathematical Physics • Probability
