Search Results for author: Markus Weimer

Found 18 papers, 4 papers with code

Large-Scale Automatic Audiobook Creation

no code implementations • 7 Sep 2023 • Brendan Walsh, Mark Hamilton, Greg Newby, Xi Wang, Serena Ruan, Sheng Zhao, Lei He, Shaofei Zhang, Eric Dettinger, William T. Freeman, Markus Weimer

In this work, we present a system that can automatically generate high-quality audiobooks from online e-books.

A Tensor Compiler for Unified Machine Learning Prediction Serving

1 code implementation • 9 Oct 2020 • Supun Nakandala, Karla Saur, Gyeong-In Yu, Konstantinos Karanasos, Carlo Curino, Markus Weimer, Matteo Interlandi

Machine Learning (ML) adoption in the enterprise requires simpler and more efficient software infrastructure: the bespoke solutions typical of large web companies are simply untenable.

Tasks: BIG-bench Machine Learning
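
The core idea of compiling classical ML prediction into tensor operations can be sketched with a small decision tree: the comparisons at all internal nodes are evaluated at once as a vectorized operation, and a path-encoding matrix selects the leaf. This is an illustrative sketch in the spirit of the paper's tensor-compilation strategy (the tree, its thresholds, and the function names are invented here), not the system's actual API:

```python
import numpy as np

# Hypothetical depth-2 decision tree (illustrative only):
#   node 0: x[f0] < t0 ?   node 1: x[f1] < t1 ?   node 2: x[f2] < t2 ?
feat_ids    = np.array([0, 1, 2])
thresholds  = np.array([0.5, 0.3, 0.7])
leaf_values = np.array([10.0, 20.0, 30.0, 40.0])

# Each row encodes one root-to-leaf path over the 3 internal nodes:
# +1 = condition must be True, -1 = False, 0 = node not on the path.
paths = np.array([[ 1,  1,  0],
                  [ 1, -1,  0],
                  [-1,  0,  1],
                  [-1,  0, -1]])

def tree_predict_tensor(X):
    """Evaluate the tree for a whole batch with dense tensor ops."""
    cond = X[:, feat_ids] < thresholds          # (n, 3): all node tests at once
    signed = 2 * cond.astype(np.int64) - 1      # True -> +1, False -> -1
    scores = signed @ paths.T                   # (n, 4); the taken path scores 2
    return leaf_values[np.argmax(scores, axis=1)]

X = np.array([[0.1, 0.1, 0.0],   # root True, node1 True  -> leaf 0
              [0.1, 0.9, 0.0],   # root True, node1 False -> leaf 1
              [0.9, 0.0, 0.2],   # root False, node2 True -> leaf 2
              [0.9, 0.0, 0.9]])  # root False, node2 False -> leaf 3
print(tree_predict_tensor(X))    # [10. 20. 30. 40.]
```

Because prediction reduces to a comparison, a matrix product, and an argmax, it can run on any tensor runtime (CPU or GPU) without tree-specific kernels.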

MLOS: An Infrastructure for Automated Software Performance Engineering

no code implementations • 1 Jun 2020 • Carlo Curino, Neha Godwal, Brian Kroth, Sergiy Kuryata, Greg Lapinski, Si-Qi Liu, Slava Oks, Olga Poppe, Adam Smiechowski, Ed Thayer, Markus Weimer, Yiwen Zhu

In this paper we present MLOS, an ML-powered infrastructure and methodology to democratize and automate Software Performance Engineering.

Vamsa: Automated Provenance Tracking in Data Science Scripts

no code implementations • 7 Jan 2020 • Mohammad Hossein Namaki, Avrilia Floratou, Fotis Psallidas, Subru Krishnan, Ashvin Agrawal, Yinghui Wu, Yiwen Zhu, Markus Weimer

There has recently been a great deal of research into the fairness, bias, and explainability of machine learning (ML) models, driven by the self-evident or regulatory requirements of various ML applications.

Tasks: Fairness, Recommendation Systems
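
The kind of static provenance analysis described above can be illustrated with Python's `ast` module: walk a script's assignments and transitively mark every variable derived from a data-loading call. This is a hypothetical, heavily simplified sketch (the function name and taint rules are invented for illustration), not Vamsa itself:

```python
import ast

SCRIPT = """
df = pd.read_csv("train.csv")
features = df.drop("label", axis=1)
labels = df["label"]
model = clf.fit(features, labels)
"""

def variables_derived_from_data(source, data_loader="read_csv"):
    """Flag every variable whose assignment (transitively) mentions
    the output of a data-loading call. Simplified illustrative sketch."""
    tainted = set()
    for stmt in ast.parse(source).body:
        if not isinstance(stmt, ast.Assign):
            continue
        names_read = {n.id for n in ast.walk(stmt.value)
                      if isinstance(n, ast.Name)}
        attrs = {a.attr for a in ast.walk(stmt.value)
                 if isinstance(a, ast.Attribute)}
        if data_loader in attrs or names_read & tainted:
            for target in stmt.targets:
                if isinstance(target, ast.Name):
                    tainted.add(target.id)
    return tainted

print(sorted(variables_derived_from_data(SCRIPT)))
# ['df', 'features', 'labels', 'model']
```

A real system must also handle control flow, function calls, and aliasing; the sketch only shows why purely static analysis of a script can recover which models were trained on which datasets.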

Data Science through the looking glass and what we found there

no code implementations • 19 Dec 2019 • Fotis Psallidas, Yiwen Zhu, Bojan Karlas, Matteo Interlandi, Avrilia Floratou, Konstantinos Karanasos, Wentao Wu, Ce Zhang, Subru Krishnan, Carlo Curino, Markus Weimer

The recent success of machine learning (ML) has led to an explosive growth both in terms of new systems and algorithms built in industry and academia, and new applications built by an ever-growing community of data science (DS) practitioners.

FLAML: A Fast and Lightweight AutoML Library

2 code implementations • 12 Nov 2019 • Chi Wang, Qingyun Wu, Markus Weimer, Erkang Zhu

We study the problem of automating the choice of learners and hyperparameters for an ad-hoc training dataset and error metric at low computational cost, by conducting trials of different configurations on the given training data.

Tasks: Hyperparameter Optimization
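
A minimal sketch of the cost-frugal idea behind this line of work: order candidate configurations by estimated training cost, run the cheap ones first, and stop when a fixed cost budget is exhausted. The learners, cost model, and loss curve below are toy stand-ins, not FLAML's actual search algorithm or API:

```python
import random

# Toy stand-ins for real training runs: "cost" grows with model size,
# "loss" shrinks with it, plus a little noise.
def run_trial(learner, n_estimators, rng):
    cost = {"lgbm": 1.0, "rf": 2.0, "extra_tree": 3.0}[learner] * n_estimators
    loss = 1.0 / (1.0 + 0.01 * n_estimators) + rng.uniform(0, 0.05)
    return cost, loss

def frugal_search(budget=200.0, seed=0):
    """Spend a fixed cost budget, starting from the cheapest
    configurations and moving to costlier ones only as budget allows."""
    rng = random.Random(seed)
    spent, best = 0.0, (float("inf"), None)
    # Cheapest-first schedule over (learner, size) pairs.
    schedule = sorted(
        [(l, n) for l in ("lgbm", "rf", "extra_tree") for n in (10, 50, 100)],
        key=lambda cfg: {"lgbm": 1.0, "rf": 2.0, "extra_tree": 3.0}[cfg[0]] * cfg[1])
    for learner, n in schedule:
        cost, loss = run_trial(learner, n, rng)
        if spent + cost > budget:
            break                        # next trial would exceed the budget
        spent += cost
        if loss < best[0]:
            best = (loss, (learner, n))
    return best, spent

best, spent = frugal_search()
print(best, spent)
```

The point of the ordering is that many cheap trials establish a strong baseline before any expensive configuration is ever attempted, which is what keeps the total computational cost low.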

Extending Relational Query Processing with ML Inference

no code implementations • 1 Nov 2019 • Konstantinos Karanasos, Matteo Interlandi, Doris Xin, Fotis Psallidas, Rathijit Sen, Kwanghyun Park, Ivan Popivanov, Supun Nakandala, Subru Krishnan, Markus Weimer, Yuan Yu, Raghu Ramakrishnan, Carlo Curino

The broadening adoption of machine learning in the enterprise is increasing the pressure for strict governance and cost-effective performance, in particular for the common and consequential steps of model storage and inference.
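One way to make "inference inside the query" concrete is a scalar UDF that scores a model inside the database engine. The sketch below uses SQLite and a made-up two-feature linear model; it illustrates the general pattern of pushing prediction into relational queries, not the paper's actual system:

```python
import sqlite3

# Hypothetical linear model, scored inside the database via a scalar UDF.
WEIGHTS, BIAS = (0.8, -0.5), 0.1

def predict(f1, f2):
    score = WEIGHTS[0] * f1 + WEIGHTS[1] * f2 + BIAS
    return 1 if score > 0 else 0

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (id INTEGER, f1 REAL, f2 REAL)")
conn.executemany("INSERT INTO samples VALUES (?, ?, ?)",
                 [(1, 1.0, 0.2), (2, 0.1, 2.0)])
conn.create_function("predict", 2, predict)

# Model inference expressed as part of the relational query itself.
rows = conn.execute("SELECT id, predict(f1, f2) FROM samples ORDER BY id").fetchall()
print(rows)  # [(1, 1), (2, 0)]
```

Keeping inference in the engine lets the optimizer combine it with filters and joins, and keeps data governance in one place rather than shipping tables out to a separate serving system.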

PDP: A General Neural Framework for Learning SAT Solvers

no code implementations • 25 Sep 2019 • Saeed Amizadeh, Sergiy Matusevych, Markus Weimer

There have been recent efforts to incorporate Graph Neural Network models to learn fully neural solvers for constraint satisfaction problems (CSP), and particularly Boolean satisfiability (SAT).

Making Classical Machine Learning Pipelines Differentiable: A Neural Translation Approach

1 code implementation • 10 Jun 2019 • Gyeong-In Yu, Saeed Amizadeh, Sehoon Kim, Artidoro Pagnoni, Byung-Gon Chun, Markus Weimer, Matteo Interlandi

To this end, we propose a framework that translates a pre-trained ML pipeline into a neural network and fine-tunes the ML models within the pipeline jointly using backpropagation.

Tasks: BIG-bench Machine Learning, Translation
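
The translation idea can be sketched on a toy pipeline: a standard scaler is just an affine layer, and a logistic-regression head is a linear layer with a sigmoid, so the whole pipeline becomes a small network whose parameters (including the scaler's) are fine-tuned jointly by backpropagation. This is an assumption-level illustration, not the paper's actual translation rules:

```python
import numpy as np

# Toy "pipeline": scaler followed by logistic regression, both re-expressed
# as differentiable layers so backprop can fine-tune them jointly.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3)); y = (X[:, 0] > 0).astype(float)

# Layer 1: scaler  z = (x - mean) / std  ==  x * g + b
g = 1.0 / X.std(axis=0); b = -X.mean(axis=0) * g
# Layer 2: linear classifier, randomly initialised as if "pre-trained".
w = rng.normal(size=3) * 0.1; c = 0.0

def forward(X, g, b, w, c):
    z = X * g + b
    p = 1.0 / (1.0 + np.exp(-(z @ w + c)))
    return z, p

lr = 0.5
for _ in range(200):                      # joint fine-tuning by backprop
    z, p = forward(X, g, b, w, c)
    d = (p - y) / len(y)                  # dLoss/dlogit for log-loss
    dz = np.outer(d, w)                   # gradient flowing back into the scaler
    w -= lr * (z.T @ d); c -= lr * d.sum()
    g -= lr * (dz * X).sum(axis=0); b -= lr * dz.sum(axis=0)

_, p = forward(X, g, b, w, c)
acc = ((p > 0.5) == y).mean()
print(f"train accuracy after joint fine-tuning: {acc:.2f}")
```

The essential point is the line computing `dz`: once the scaler is an affine layer, gradients reach its parameters too, so preprocessing is no longer frozen during fine-tuning.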

Learning To Solve Circuit-SAT: An Unsupervised Differentiable Approach

no code implementations • ICLR 2019 • Saeed Amizadeh, Sergiy Matusevych, Markus Weimer

Recent efforts to combine Representation Learning with Formal Methods, commonly known as Neuro-Symbolic Methods, have given rise to a new trend of applying rich neural architectures to solve classical combinatorial optimization problems.

Tasks: Combinatorial Optimization, Reinforcement Learning (+2)
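
A common way to make circuit satisfiability differentiable, in the spirit of (though not identical to) this line of work, is to relax Boolean gates to products and complements on [0, 1] and push the circuit output toward 1 by gradient steps. The circuit, relaxation, and update rule below are invented for illustration:

```python
# Differentiable relaxation of Boolean gates: values live in [0, 1].
def AND(a, b): return a * b
def OR(a, b):  return a + b - a * b
def NOT(a):    return 1.0 - a

def circuit(x):
    # Target circuit: (x0 OR x1) AND (NOT x0 OR x2) AND (NOT x1)
    x0, x1, x2 = x
    return AND(AND(OR(x0, x1), OR(NOT(x0), x2)), NOT(x1))

def solve(steps=500, lr=0.5):
    """Gradient-ascend the soft circuit output toward 1 (satisfied),
    using finite-difference gradients to keep the sketch dependency-free."""
    x = [0.5, 0.5, 0.5]
    eps = 1e-4
    for _ in range(steps):
        grad = []
        for i in range(3):
            xp = x[:]; xp[i] += eps
            grad.append((circuit(xp) - circuit(x)) / eps)
        # Ascent step, clipped back into the relaxed domain [0, 1].
        x = [min(1.0, max(0.0, xi + lr * gi)) for xi, gi in zip(x, grad)]
    return [round(xi) for xi in x], circuit(x)

assignment, value = solve()
print(assignment, f"soft output ~ {value:.3f}")
```

Rounding the converged relaxed values yields a Boolean assignment; here the dynamics settle on x0=1, x1=0, x2=1, which satisfies all three conjuncts.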

PDP: A General Neural Framework for Learning Constraint Satisfaction Solvers

4 code implementations • 5 Mar 2019 • Saeed Amizadeh, Sergiy Matusevych, Markus Weimer

In this paper, we propose a generic neural framework for learning CSP solvers that can be described in terms of probabilistic inference and yet learn search strategies beyond greedy search.
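The "search strategies beyond greedy search" framing can be grounded with a classical baseline that learned solvers in this space are compared against: WalkSAT-style stochastic local search, which mixes greedy flips with random-walk moves. A minimal sketch (the clause encoding and parameters are illustrative choices):

```python
import random

# Minimal WalkSAT-style solver for CNF formulas. Clauses are lists of
# signed ints (DIMACS style): 2 means x2, -2 means NOT x2.
def walksat(clauses, n_vars, max_flips=10_000, p_random=0.5, seed=0):
    rng = random.Random(seed)
    assign = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}
    sat = lambda lit: assign[abs(lit)] == (lit > 0)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign                  # satisfying assignment found
        clause = rng.choice(unsat)
        if rng.random() < p_random:
            var = abs(rng.choice(clause))  # random-walk move
        else:                              # greedy move: flip the variable
            def broken(v):                 # that breaks the fewest clauses
                assign[v] = not assign[v]
                n = sum(not any(sat(l) for l in c) for c in clauses)
                assign[v] = not assign[v]
                return n
            var = min((abs(l) for l in clause), key=broken)
        assign[var] = not assign[var]
    return None                            # gave up within the flip budget

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
model = walksat([[1, 2], [-1, 3], [-2, -3]], n_vars=3)
print(model)
```

The contrast with a learned solver is exactly in the two branches of the flip rule: here the greedy/random mix is fixed by hand, whereas a neural framework learns the strategy from data.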

Batch-Expansion Training: An Efficient Optimization Framework

no code implementations • 22 Apr 2017 • Michał Dereziński, Dhruv Mahajan, S. Sathiya Keerthi, S. V. N. Vishwanathan, Markus Weimer

We propose Batch-Expansion Training (BET), a framework for running a batch optimizer on a gradually expanding dataset.
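The batch-expansion idea can be sketched in a few lines: run an inner optimizer on a small prefix of the data, then repeatedly double the prefix and continue from the current iterate. The least-squares task, stage sizes, and step counts below are illustrative choices, not the paper's:

```python
import random

def make_data(n, seed=0):
    rng = random.Random(seed)
    xs = [rng.uniform(-1, 1) for _ in range(n)]
    return [(x, 3.0 * x + rng.gauss(0, 0.1)) for x in xs]  # y ~ 3x + noise

def batch_expansion_fit(data, steps_per_stage=50, lr=0.1):
    """Optimize on a gradually expanding dataset: start on a small
    prefix, double it each stage, keep the current iterate."""
    w, m = 0.0, 32                        # start with a 32-example prefix
    while True:
        batch = data[:m]
        for _ in range(steps_per_stage):  # inner optimizer: plain GD on prefix
            grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
            w -= lr * grad
        if m >= len(data):
            return w
        m *= 2                            # expand: double the working set

data = make_data(1024)
w = batch_expansion_fit(data)
print(f"fitted slope: {w:.2f}")           # close to the true slope 3.0
```

Early stages are cheap because gradients are computed on a small prefix, yet they already move the iterate close to the solution before the full dataset is ever touched.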

Towards Geo-Distributed Machine Learning

no code implementations • 30 Mar 2016 • Ignacio Cano, Markus Weimer, Dhruv Mahajan, Carlo Curino, Giovanni Matteo Fumarola

Current solutions to learning from geo-distributed data sources revolve around the idea of first centralizing the data in one data center, and then training locally.

Tasks: BIG-bench Machine Learning
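
A minimal sketch of the geo-distributed alternative to centralization: each simulated data center trains on its own local data, and only model parameters cross data-center boundaries, at periodic synchronization points. The averaging scheme and toy regression task are assumptions for illustration, not the paper's exact algorithm:

```python
import random

def local_step(w, batch, lr=0.05):
    # One full-batch gradient step on a data center's local shard.
    g = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
    return w - lr * g

rng = random.Random(7)
# Two "data centers", each holding its own shard of y ~ 4x data.
dcs = [[(x, 4.0 * x + rng.gauss(0, 0.1))
        for x in (rng.uniform(-1, 1) for _ in range(200))] for _ in range(2)]

weights = [0.0, 0.0]
for round_no in range(20):                # 20 synchronization rounds
    for _ in range(10):                   # 10 purely local steps per round
        weights = [local_step(w, dc) for w, dc in zip(weights, dcs)]
    w_sync = sum(weights) / len(weights)  # only the model crosses DC boundaries
    weights = [w_sync, w_sync]

print(f"synchronized model: {w_sync:.2f}")  # close to the true slope 4.0
```

Compared with centralizing the raw data, only a model-sized message moves between data centers per round, which is the communication pattern the geo-distributed setting calls for.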

Parallelized Stochastic Gradient Descent

no code implementations • NeurIPS 2010 • Martin Zinkevich, Markus Weimer, Lihong Li, Alex J. Smola

With the increase in available data, parallel machine learning has become an increasingly pressing problem.
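The paper's one-shot scheme (often called SimuParallelSGD) is simple enough to sketch directly: each worker runs SGD over its own shard with no communication, and the resulting models are averaged once at the end. The toy regression task below is invented for illustration:

```python
import random

def sgd_on_shard(shard, epochs=5, lr=0.05, seed=0):
    # Plain SGD on one worker's shard of a y ~ 2x least-squares task.
    rng = random.Random(seed)
    w = 0.0
    for _ in range(epochs):
        rng.shuffle(shard)
        for x, y in shard:
            w -= lr * 2 * (w * x - y) * x
    return w

rng = random.Random(1)
data = [(x, 2.0 * x + rng.gauss(0, 0.1))
        for x in (rng.uniform(-1, 1) for _ in range(400))]

k = 4                                        # number of simulated workers
shards = [data[i::k] for i in range(k)]      # disjoint shards, no communication
w_avg = sum(sgd_on_shard(s, seed=i) for i, s in enumerate(shards)) / k
print(f"averaged model: {w_avg:.2f}")        # close to the true slope 2.0
```

Averaging at the end, rather than after every step, is what makes the scheme attractive for clusters where communication, not computation, is the bottleneck.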
