Search Results for author: Jan Pfeifer

Found 9 papers, 3 papers with code

Yggdrasil Decision Forests: A Fast and Extensible Decision Forests Library

1 code implementation 6 Dec 2022 Mathieu Guillame-Bert, Sebastian Bruch, Richard Stotz, Jan Pfeifer

Yggdrasil Decision Forests is a library for the training, serving, and interpretation of decision forest models, targeted at both research and production work. It is implemented in C++ and available in C++, through a command-line interface, in Python (under the name TensorFlow Decision Forests), JavaScript, Go, and Google Sheets (under the name Simple ML for Sheets).
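A minimal training sketch, assuming the TensorFlow Decision Forests Python package is installed; the "train.csv" file and its "label" column are hypothetical placeholders, not part of the library.

```python
# Minimal sketch: train a gradient-boosted trees model with TF-DF.
# "train.csv" and the "label" column are hypothetical placeholders.
import pandas as pd
import tensorflow_decision_forests as tfdf

# Load a tabular dataset into a pandas DataFrame.
train_df = pd.read_csv("train.csv")

# Convert the DataFrame into a TensorFlow dataset understood by TF-DF.
train_ds = tfdf.keras.pd_dataframe_to_tf_dataset(train_df, label="label")

# Train a gradient-boosted trees model with default hyper-parameters.
model = tfdf.keras.GradientBoostedTreesModel()
model.fit(train_ds)

# Inspect the trained forest (training logs, variable importances, etc.).
model.summary()
```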

Modeling Text with Decision Forests using Categorical-Set Splits

no code implementations 21 Sep 2020 Mathieu Guillame-Bert, Sebastian Bruch, Petr Mitrichev, Petr Mikheev, Jan Pfeifer

We define a condition specific to categorical-set features -- each an unordered set of categorical variables -- and present an algorithm to learn it, thereby equipping decision forests with the ability to model text directly, albeit without preserving sequential order.
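A toy sketch of the kind of condition described above: an example follows the positive branch when its unordered set of tokens intersects a learned item set. The item set here is hand-picked for illustration, not produced by the paper's learning algorithm.

```python
# Sketch of a categorical-set split condition: route the example to the
# positive branch when its (unordered) token set intersects a learned item
# set. The "mask" below is illustrative, not learned.
from typing import FrozenSet, Set


def categorical_set_condition(example_tokens: Set[str],
                              learned_items: FrozenSet[str]) -> bool:
    """Returns True if the example shares at least one token with the mask."""
    return not learned_items.isdisjoint(example_tokens)


# Example: a bag-of-words feature for a short text.
tokens = {"fast", "decision", "forests"}
mask = frozenset({"forests", "lattice"})
print(categorical_set_condition(tokens, mask))  # True: "forests" is shared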

Text Classification

Deep Lattice Networks and Partial Monotonic Functions

no code implementations NeurIPS 2017 Seungil You, David Ding, Kevin Canini, Jan Pfeifer, Maya Gupta

We propose learning deep models that are monotonic with respect to a user-specified set of inputs by alternating layers of linear embeddings, ensembles of lattices, and calibrators (piecewise linear functions), with appropriate constraints for monotonicity, and jointly training the resulting network.
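A minimal sketch of one of these building blocks, a piecewise-linear calibrator that is monotonically increasing by construction because its keypoint outputs are a cumulative sum of non-negative increments; the keypoints and parameters below are illustrative, not the paper's learned values.

```python
# Sketch of a monotone piecewise-linear calibrator, one DLN building block.
# Monotonicity holds by construction: output keypoints are a cumulative sum
# of non-negative (softplus-transformed) increments.
import numpy as np


def monotonic_pwl_calibrator(x, input_keypoints, raw_deltas):
    """Maps x through an increasing piecewise-linear function."""
    deltas = np.log1p(np.exp(raw_deltas))      # softplus keeps increments >= 0
    output_keypoints = np.concatenate([[0.0], np.cumsum(deltas)])
    return np.interp(x, input_keypoints, output_keypoints)


keypoints = np.linspace(0.0, 1.0, 5)       # 5 input keypoints on [0, 1]
raw = np.array([-1.0, 0.5, 2.0, 0.0])      # unconstrained parameters
x = np.array([0.1, 0.4, 0.9])
print(monotonic_pwl_calibrator(x, keypoints, raw))  # non-decreasing in x
```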

General Classification, Regression

Fast and Flexible Monotonic Functions with Ensembles of Lattices

no code implementations NeurIPS 2016 Mahdi Milani Fard, Kevin Canini, Andrew Cotter, Jan Pfeifer, Maya Gupta

For many machine learning problems, there are some inputs that are known to be positively (or negatively) related to the output, and in such cases training the model to respect that monotonic relationship can provide regularization and make the model more interpretable.
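A toy sketch of a single lattice, the primitive that the proposed ensembles are built from: multilinear interpolation over vertex parameters, which is monotone in an input whenever the parameters are non-decreasing along that axis. The parameter values below are illustrative.

```python
# Sketch of a single 2x2 lattice: multilinear interpolation over vertex
# parameters. The output is monotone in the first input because each
# parameter does not decrease when that coordinate goes from 0 to 1.
import numpy as np


def lattice_2x2(x, params):
    """Multilinear interpolation of params[i, j] at x = (x0, x1) in [0, 1]^2."""
    x0, x1 = x
    w00 = (1 - x0) * (1 - x1)
    w01 = (1 - x0) * x1
    w10 = x0 * (1 - x1)
    w11 = x0 * x1
    return (w00 * params[0, 0] + w01 * params[0, 1]
            + w10 * params[1, 0] + w11 * params[1, 1])


# params[1, j] >= params[0, j], so the output is non-decreasing in x0.
params = np.array([[0.0, 0.3],
                   [0.5, 1.0]])
print(lattice_2x2((0.2, 0.7), params))
print(lattice_2x2((0.8, 0.7), params))  # larger, as expected
```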

A Light Touch for Heavily Constrained SGD

no code implementations 15 Dec 2015 Andrew Cotter, Maya Gupta, Jan Pfeifer

Minimizing empirical risk subject to a set of constraints can be a useful strategy for learning restricted classes of functions, such as monotonic functions, submodular functions, classifiers that guarantee a certain class label for some subset of examples, etc.
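For contrast, a generic projected-SGD baseline for constrained empirical risk minimization (not the paper's lighter-touch algorithm): after each gradient step, the weights are projected back onto the constraint set. The non-negativity constraint and the synthetic data below are purely illustrative.

```python
# Generic projected-SGD baseline for constrained ERM (NOT the paper's method):
# take an SGD step, then project onto the constraint set. Here the constraint
# is that all weights are non-negative, which makes the linear model
# monotonically non-decreasing in every input.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, 2.0, 0.5]) + 0.1 * rng.normal(size=200)

w = np.zeros(3)
lr = 0.01
for step in range(1000):
    i = rng.integers(len(X))                 # sample one example
    grad = (X[i] @ w - y[i]) * X[i]          # squared-loss gradient
    w -= lr * grad                           # SGD step
    w = np.maximum(w, 0.0)                   # projection onto the constraints

print(w)  # close to the non-negative true weights [1.0, 2.0, 0.5]
```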
