Search Results for author: Michael Shavlovsky

Found 3 papers, 0 papers with code

A Sinkhorn-type Algorithm for Constrained Optimal Transport

no code implementations · 8 Mar 2024 · Xun Tang, Holakou Rahmanian, Michael Shavlovsky, Kiran Koshy Thekumparampil, Tesi Xiao, Lexing Ying

We derive the corresponding entropy regularization formulation and introduce a Sinkhorn-type algorithm for such constrained OT problems supported by theoretical guarantees.

Scheduling
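For context on the snippet above: entropy regularization replaces the linear OT objective with a strictly convex one that can be solved by alternating matrix scaling. A minimal sketch of the classical (unconstrained) Sinkhorn iteration is given below; it illustrates the baseline the paper extends, not the authors' constrained variant, and all names and parameters here are illustrative.

```python
import numpy as np

def sinkhorn(C, a, b, eps=1.0, n_iter=500):
    """Classical Sinkhorn iteration for entropy-regularized OT.

    C: (m, n) cost matrix; a, b: source/target marginals (each summing to 1).
    Returns a transport plan P whose marginals approximate a and b.
    """
    K = np.exp(-C / eps)            # Gibbs kernel from the cost matrix
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)           # scale columns to match marginal b
        u = a / (K @ v)             # scale rows to match marginal a
    return u[:, None] * K * v[None, :]

# Toy problem: three points on a line, squared-distance cost, uniform marginals.
x = np.array([0.0, 1.0, 2.0])
C = (x[:, None] - x[None, :]) ** 2
a = np.ones(3) / 3
b = np.ones(3) / 3
P = sinkhorn(C, a, b)
```

Smaller `eps` gives a plan closer to the unregularized optimum but slows the scaling iterations, which is part of the motivation for the constrained and accelerated variants in these papers.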

Accelerating Sinkhorn Algorithm with Sparse Newton Iterations

no code implementations · 20 Jan 2024 · Xun Tang, Michael Shavlovsky, Holakou Rahmanian, Elisa Tardini, Kiran Koshy Thekumparampil, Tesi Xiao, Lexing Ying

To achieve possibly super-exponential convergence, we present Sinkhorn-Newton-Sparse (SNS), an extension to the Sinkhorn algorithm, by introducing early stopping for the matrix scaling steps and a second stage featuring a Newton-type subroutine.
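The two-stage structure described above can be sketched as follows: a few early-stopped Sinkhorn (matrix-scaling) iterations warm-start the dual potentials, then Newton steps are taken on the dual objective using a sparsified transport plan inside the Hessian. This is a simplified illustration under my own assumptions (dense linear solve with a small ridge term instead of the paper's sparse solver; all names hypothetical), not the authors' SNS implementation.

```python
import numpy as np

def logsumexp(Z, axis):
    # Numerically stable log-sum-exp along the given axis.
    M = Z.max(axis=axis, keepdims=True)
    return np.squeeze(M, axis=axis) + np.log(np.exp(Z - M).sum(axis=axis))

def sns_sketch(C, a, b, eps=1.0, warm_iters=20, newton_iters=10,
               sparse_tol=1e-10, ridge=1e-9):
    """Illustrative two-stage Sinkhorn + sparse-Hessian Newton solver."""
    m, n = C.shape
    f, g = np.zeros(m), np.zeros(n)

    # Stage 1: early-stopped log-domain Sinkhorn iterations (warm start).
    for _ in range(warm_iters):
        f = eps * (np.log(a) - logsumexp((g[None, :] - C) / eps, axis=1))
        g = eps * (np.log(b) - logsumexp((f[:, None] - C) / eps, axis=0))

    # Stage 2: Newton steps on the dual potentials (f, g). The Hessian of the
    # dual is built from the current plan P; zeroing its near-zero entries
    # mimics the sparsification idea (here solved densely with a tiny ridge,
    # since the exact Hessian is singular along the translation direction).
    for _ in range(newton_iters):
        P = np.exp((f[:, None] + g[None, :] - C) / eps)
        grad = np.concatenate([a - P.sum(axis=1), b - P.sum(axis=0)])
        Ps = np.where(P > sparse_tol, P, 0.0)       # sparsify small entries
        H = np.block([[np.diag(Ps.sum(axis=1)), Ps],
                      [Ps.T, np.diag(Ps.sum(axis=0))]]) / eps
        d = np.linalg.solve(H + ridge * np.eye(m + n), grad)
        f += d[:m]
        g += d[m:]
    return np.exp((f[:, None] + g[None, :] - C) / eps)

# Toy problem: squared-distance cost on a line, uniform marginals.
x = np.array([0.0, 1.0, 2.0])
C = (x[:, None] - x[None, :]) ** 2
a = np.ones(3) / 3
b = np.ones(3) / 3
P = sns_sketch(C, a, b)
```

Once the plan is nearly sparse, the Newton stage needs only a handful of steps to reach high accuracy, which is the intuition behind the super-exponential convergence claim.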

Pretrained deep models outperform GBDTs in Learning-To-Rank under label scarcity

no code implementations · 31 Jul 2023 · Charlie Hou, Kiran Koshy Thekumparampil, Michael Shavlovsky, Giulia Fanti, Yesh Dattatreya, Sujay Sanghavi

On tabular data, a significant body of literature has shown that current deep learning (DL) models perform at best comparably to Gradient Boosted Decision Trees (GBDTs), while significantly underperforming them on outlier data.

Learning-To-Rank
