Search Results for author: Bram Wasti

Found 5 papers, 2 papers with code

LoopTune: Optimizing Tensor Computations with Reinforcement Learning

no code implementations · 4 Sep 2023 · Dejan Grubisic, Bram Wasti, Chris Cummins, John Mellor-Crummey, Aleksandar Zlateski

Advanced compiler technology is crucial for enabling machine learning applications to run on novel hardware, but traditional compilers fail to deliver performance, popular auto-tuners suffer from long search times, and expert-optimized libraries introduce unsustainable costs.

reinforcement-learning

LoopStack: a Lightweight Tensor Algebra Compiler Stack

1 code implementation · 2 May 2022 · Bram Wasti, José Pablo Cambronero, Benoit Steiner, Hugh Leather, Aleksandar Zlateski

We present LoopStack, a domain specific compiler stack for tensor operations, composed of a frontend, LoopTool, and an efficient optimizing code generator, LoopNest.

BIG-bench Machine Learning
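The tensor operations such a stack compiles reduce to loop nests. As a rough, hypothetical illustration (not LoopStack's own API), a matrix multiply is a three-deep loop nest whose ordering, tiling, and vectorization a code generator like LoopNest would optimize:

```python
# Hypothetical sketch, not LoopStack code: a naive matrix multiply
# written as the kind of loop nest a tensor-algebra compiler lowers
# and optimizes.
def matmul(A, B):
    n, k = len(A), len(A[0])
    m = len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):          # output row loop
        for j in range(m):      # output column loop
            for p in range(k):  # reduction loop; reordering/tiling
                # these loops is what the optimizer tunes
                C[i][j] += A[i][p] * B[p][j]
    return C

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
# → [[19.0, 22.0], [43.0, 50.0]]
```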

CompilerGym: Robust, Performant Compiler Optimization Environments for AI Research

1 code implementation · 17 Sep 2021 · Chris Cummins, Bram Wasti, Jiadong Guo, Brandon Cui, Jason Ansel, Sahir Gomez, Somya Jain, Jia Liu, Olivier Teytaud, Benoit Steiner, Yuandong Tian, Hugh Leather

What is needed is an easy, reusable experimental infrastructure for real world compiler optimization tasks that can serve as a common benchmark for comparing techniques, and as a platform to accelerate progress in the field.

Compiler Optimization · OpenAI Gym
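CompilerGym exposes compiler optimization tasks through the familiar OpenAI Gym interface. As a schematic sketch only (not the real CompilerGym API), the agent–environment loop looks like this, with a hypothetical stub environment standing in for a real compiler:

```python
# Schematic Gym-style interaction loop. StubCompilerEnv is a made-up
# stand-in, NOT CompilerGym: its "state" is a fake instruction count
# that optimization actions try to shrink.
import random

class StubCompilerEnv:
    ACTIONS = ["inline", "unroll", "dce"]  # hypothetical passes

    def reset(self):
        self.ic = 1000  # pretend instruction count of the program
        return self.ic

    def step(self, action):
        reduction = random.randint(0, 50)  # toy effect of the pass
        self.ic -= reduction
        reward = reduction        # reward = instructions eliminated
        done = self.ic <= 0
        return self.ic, reward, done, {}

random.seed(0)
env = StubCompilerEnv()
obs = env.reset()
total_reward = 0
for _ in range(10):
    action = random.choice(env.ACTIONS)  # a real agent would choose here
    obs, reward, done, info = env.step(action)
    total_reward += reward
    if done:
        break
print(total_reward)
```

The real library swaps the stub for actual compiler environments with well-defined observation and reward spaces, which is what makes results comparable across techniques.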

Semisupervised Learning on Heterogeneous Graphs and its Applications to Facebook News Feed

no code implementations · 18 May 2018 · Cheng Ju, James Li, Bram Wasti, Shengbo Guo

We show that the HELP algorithm improves predictive performance across multiple tasks and produces semantically meaningful embeddings that are discriminative for downstream classification or regression tasks.

Classification · Domain Classification +2
