no code implementations • 7 Feb 2024 • Adel Javanmard, Matthew Fahrbach, Vahab Mirrokni
This work studies algorithms for learning from aggregate responses.
no code implementations • 10 Nov 2023 • Kyriakos Axiotis, Sami Abu-El-Haija, Lin Chen, Matthew Fahrbach, Gang Fu
We demonstrate the success of Greedy PIG on a wide variety of tasks, including image feature attribution, graph compression/explanation, and post-hoc feature selection on tabular data.
no code implementations • 7 Nov 2023 • Aaron Archer, Matthew Fahrbach, Kuikui Liu, Prakash Prabhu
We optimize pipeline parallelism for deep neural network (DNN) inference by partitioning model graphs into $k$ stages and minimizing the running time of the bottleneck stage, including communication.
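The bottleneck objective above can be illustrated with a simplified sketch: assuming the model is a linear chain of layers with known per-layer costs (the paper handles general model graphs and communication costs), minimizing the heaviest of $k$ contiguous stages can be done by binary searching on the bottleneck value.

```python
def min_bottleneck_partition(costs, k):
    """Split a chain of layer costs into at most k contiguous stages,
    minimizing the cost of the heaviest stage (binary search on the answer).
    A simplified sketch: chain-shaped model, no communication cost."""
    def stages_needed(cap):
        # Greedily pack layers into stages whose total cost is <= cap.
        count, current = 1, 0
        for c in costs:
            if c > cap:
                return float("inf")  # a single layer exceeds the cap
            if current + c > cap:
                count += 1
                current = c
            else:
                current += c
        return count

    lo, hi = max(costs), sum(costs)
    while lo < hi:
        mid = (lo + hi) // 2
        if stages_needed(mid) <= k:
            hi = mid
        else:
            lo = mid + 1
    return lo
```

For example, `min_bottleneck_partition([4, 2, 3, 5, 1], 3)` yields stages `[4, 2] | [3] | [5, 1]` with bottleneck cost 6.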
no code implementations • NeurIPS 2023 • Benjamin Coleman, Wang-Cheng Kang, Matthew Fahrbach, Ruoxi Wang, Lichan Hong, Ed H. Chi, Derek Zhiyuan Cheng
Learning high-quality feature embeddings efficiently and effectively is critical for the performance of web-scale machine learning systems.
1 code implementation • 27 Mar 2023 • Matthew Fahrbach, Adel Javanmard, Vahab Mirrokni, Pratik Worah
We design learning rate schedules that minimize regret for SGD-based online learning in the presence of a changing data distribution.
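As a baseline for context, online gradient descent with the classic $\eta_t = \eta/\sqrt{t}$ schedule achieves $O(\sqrt{T})$ regret for convex losses; the paper designs schedules tailored to a changing data distribution, which this minimal sketch does not capture.

```python
import numpy as np

def online_gradient_descent(loss_grads, w0, eta=1.0):
    """Online gradient descent with the standard eta_t = eta / sqrt(t)
    learning rate schedule. `loss_grads` is a sequence of gradient
    functions, one per round; returns the iterate played each round."""
    w = np.asarray(w0, dtype=float)
    iterates = []
    for t, grad in enumerate(loss_grads, start=1):
        iterates.append(w.copy())          # play w_t, then observe loss
        w = w - (eta / np.sqrt(t)) * grad(w)
    return iterates
```

Usage: for the repeated quadratic loss $f_t(w) = (w - 1)^2$, the iterates converge toward the minimizer at 1.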
no code implementations • 8 Feb 2023 • Mehrdad Ghadiri, Matthew Fahrbach, Gang Fu, Vahab Mirrokni
This work studies the combinatorial optimization problem of finding an optimal core tensor shape, also called multilinear rank, for a size-constrained Tucker decomposition.
1 code implementation • 29 Sep 2022 • Taisuke Yasuda, Mohammadhossein Bateni, Lin Chen, Matthew Fahrbach, Gang Fu, Vahab Mirrokni
Feature selection is the problem of selecting a subset of features for a machine learning model that maximizes model quality subject to a budget constraint.
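The budget-constrained selection problem above can be sketched with a standard greedy baseline (not the paper's algorithm): repeatedly add the affordable feature with the largest quality gain, measured here by uncentered $R^2$ of a least-squares fit.

```python
import numpy as np

def greedy_feature_selection(X, y, costs, budget):
    """Greedy forward feature selection under a budget constraint.
    Each step adds the feature with the best quality gain (fraction of
    ||y||^2 explained by least squares) that still fits in the budget."""
    n, d = X.shape
    selected, spent = [], 0.0

    def quality(idx):
        if not idx:
            return 0.0
        A = X[:, idx]
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        return 1.0 - (resid @ resid) / (y @ y)

    while True:
        base = quality(selected)
        best, best_gain = None, 0.0
        for j in range(d):
            if j in selected or spent + costs[j] > budget:
                continue
            gain = quality(selected + [j]) - base
            if gain > best_gain:
                best, best_gain = j, gain
        if best is None:
            break
        selected.append(best)
        spent += costs[best]
    return selected
```

Each greedy step costs a least-squares solve per candidate feature, which is exactly the kind of per-step expense that motivates more scalable selection methods.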
1 code implementation • 11 Sep 2022 • Matthew Fahrbach, Thomas Fu, Mehrdad Ghadiri
By extending our approach to block-design matrices in which one block is a Kronecker product, we also achieve subquadratic-time algorithms for (1) Kronecker ridge regression and (2) updating the factor matrices of a Tucker decomposition in ALS. The latter is not a pure Kronecker regression problem, so this improves the running time of all steps of Tucker ALS.
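To illustrate why Kronecker structure makes regression cheap (this sketch is the textbook eigendecomposition route, not the paper's subquadratic algorithm): since $(B \otimes A)^\top (B \otimes A) = B^\top B \otimes A^\top A$, the ridge problem $\min_x \|(B \otimes A)x - \mathrm{vec}(Y)\|^2 + \lambda \|x\|^2$ can be solved entirely in factored form, never materializing the Kronecker product.

```python
import numpy as np

def kron_ridge(A, B, Y, lam):
    """Solve min_X ||A X B^T - Y||_F^2 + lam ||X||_F^2, i.e. Kronecker
    ridge regression (B kron A) vec(X) ~ vec(Y) with column-major vec,
    without ever forming the Kronecker product."""
    da, Ua = np.linalg.eigh(A.T @ A)   # A^T A = Ua diag(da) Ua^T
    db, Ub = np.linalg.eigh(B.T @ B)   # B^T B = Ub diag(db) Ub^T
    C = A.T @ Y @ B                    # right-hand side (B kron A)^T vec(Y)
    W = Ua.T @ C @ Ub                  # rotate into the eigenbasis
    W /= np.outer(da, db) + lam        # eigenvalues of the Gram matrix multiply
    return Ua @ W @ Ub.T               # rotate back: X with vec(X) = solution
```

The factored solve costs cubic time in each factor's dimension, versus solving a dense normal-equation system whose size is the product of the factor dimensions.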
no code implementations • 22 Jul 2021 • Matthew Fahrbach, Mehrdad Ghadiri, Thomas Fu
Low-rank tensor decomposition generalizes low-rank matrix approximation and is a powerful technique for discovering low-dimensional structure in high-dimensional data.
1 code implementation • ICML 2020 • Matthew Fahrbach, Gramoz Goranci, Richard Peng, Sushant Sachdeva, Chi Wang
As computing Schur complements is expensive, we give a nearly-linear time algorithm that generates a coarsened graph on the relevant vertices that provably matches the Schur complement in expectation in each iteration.
2 code implementations • 5 May 2020 • Matthew Fahrbach, Zhiyi Huang, Runzhou Tao, Morteza Zadimoghaddam
Online bipartite matching and its variants are among the most fundamental problems in the online algorithms literature.
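For readers new to the area, the canonical algorithm here is Karp–Vazirani–Vazirani's RANKING, which achieves the optimal $1 - 1/e$ competitive ratio for unweighted online bipartite matching (the paper above studies an edge-weighted variant; this sketch is the classic unweighted algorithm).

```python
import random

def ranking_online_matching(offline_nodes, online_arrivals, seed=0):
    """KVV RANKING: fix one random priority order over the offline nodes
    up front; match each arriving online node to its highest-priority
    (lowest-rank) unmatched neighbor."""
    rng = random.Random(seed)
    priority = {u: rng.random() for u in offline_nodes}
    matched = {}  # offline node -> online node
    for v, neighbors in online_arrivals:
        free = [u for u in neighbors if u not in matched]
        if free:
            matched[min(free, key=priority.get)] = v
    return matched
```

The single up-front ranking (rather than fresh randomness per arrival) is what drives the $1 - 1/e$ guarantee; greedy with arbitrary tie-breaking only guarantees $1/2$.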
Data Structures and Algorithms • Computer Science and Game Theory
no code implementations • 2 Apr 2019 • Matthew Fahrbach, Dana Randall
We analyze the Glauber dynamics for the six-vertex model with free boundary conditions in the antiferroelectric phase and significantly extend the region for which local Markov chains are known to be slow mixing.
Data Structures and Algorithms • Mathematical Physics • Probability