Search Results for author: Kevin Canini

Found 7 papers, 0 papers with code

Deep Lattice Networks and Partial Monotonic Functions

no code implementations • NeurIPS 2017 • Seungil You, David Ding, Kevin Canini, Jan Pfeifer, Maya Gupta

We propose learning deep models that are monotonic with respect to a user-specified set of inputs by alternating layers of linear embeddings, ensembles of lattices, and calibrators (piecewise linear functions), with appropriate constraints for monotonicity, and jointly training the resulting network.

General Classification, Regression
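As a rough illustration of the stacked calibrator-and-lattice architecture the abstract describes, the NumPy sketch below composes a monotonic piecewise-linear calibrator with a tiny 2x2 lattice whose vertex values are ordered so the output stays monotonic. All keypoints and parameter values are invented for illustration; this is not the authors' implementation.

# Minimal NumPy sketch of one calibrator -> lattice stage of a deep lattice
# network. Keypoints and lattice parameters below are illustrative placeholders.
import numpy as np

def pwl_calibrate(x, keypoints_in, keypoints_out):
    """Piecewise-linear calibrator; monotonic when keypoints_out is non-decreasing."""
    return np.interp(x, keypoints_in, keypoints_out)

def lattice_2x2(x1, x2, params):
    """Bilinear interpolation over a 2x2 lattice with vertex values `params`.

    The output is monotonic in x1 when params[1, :] >= params[0, :]
    elementwise, and monotonic in x2 when params[:, 1] >= params[:, 0].
    """
    p = params
    return ((1 - x1) * (1 - x2) * p[0, 0] + x1 * (1 - x2) * p[1, 0]
            + (1 - x1) * x2 * p[0, 1] + x1 * x2 * p[1, 1])

# Calibrate two raw features into [0, 1], then fuse them with a monotonic lattice.
kp_in = np.array([0.0, 1.0, 5.0, 10.0])
kp_out = np.array([0.0, 0.3, 0.8, 1.0])              # non-decreasing -> monotonic
x1 = pwl_calibrate(3.0, kp_in, kp_out)
x2 = pwl_calibrate(7.0, kp_in, kp_out)
lattice_params = np.array([[0.0, 0.4], [0.5, 1.0]])  # monotonic in both inputs
print(lattice_2x2(x1, x2, lattice_params))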

Diminishing Returns Shape Constraints for Interpretability and Regularization

no code implementations • NeurIPS 2018 • Maya Gupta, Dara Bahri, Andrew Cotter, Kevin Canini

We investigate machine learning models that can provide diminishing returns and accelerating returns guarantees to capture prior knowledge or policies about how outputs should depend on inputs.

BIG-bench Machine Learning
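One simple way to realize a diminishing-returns guarantee on a single input is to make the function increasing and concave, e.g. a piecewise-linear function whose segment slopes are positive and non-increasing (accelerating returns would instead keep slopes non-decreasing). The NumPy sketch below encodes that shape; the parameterization is illustrative, not the paper's method.

# Illustrative sketch of a "diminishing returns" shape on one input: a
# piecewise-linear function whose segment slopes are positive and
# non-increasing, i.e. increasing and concave. Names and values are hypothetical.
import numpy as np

def diminishing_returns_pwl(raw_slope_logits, keypoints_in, bias=0.0):
    """Return keypoint outputs for an increasing, concave PWL function.

    softplus makes the slopes positive; a cumulative minimum makes them
    non-increasing, which is one crude way to encode diminishing returns.
    """
    slopes = np.log1p(np.exp(raw_slope_logits))   # positive (softplus)
    slopes = np.minimum.accumulate(slopes)        # non-increasing
    deltas = np.diff(keypoints_in) * slopes
    return bias + np.concatenate([[0.0], np.cumsum(deltas)])

keypoints_in = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
outputs = diminishing_returns_pwl(np.array([1.0, 0.5, 0.2, -0.5]), keypoints_in)
print(np.interp(2.5, keypoints_in, outputs))      # evaluate the shaped function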

Fast and Flexible Monotonic Functions with Ensembles of Lattices

no code implementations • NeurIPS 2016 • Mahdi Milani Fard, Kevin Canini, Andrew Cotter, Jan Pfeifer, Maya Gupta

For many machine learning problems, there are some inputs that are known to be positively (or negatively) related to the output, and in such cases training the model to respect that monotonic relationship can provide regularization and make the model more interpretable.
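A hedged sketch of the lattice-ensemble idea: each base model is a small lattice over a random subset of (calibrated) features, and the ensemble averages the interpolated outputs. The feature subsets and vertex values below are random placeholders, not a trained model.

# Ensemble of small lattices: each base model bilinearly interpolates over a
# random pair of features in [0, 1]; the ensemble averages the base outputs.
import numpy as np

rng = np.random.default_rng(0)

def bilinear_lattice(x, params):
    """Interpolate a 2-feature input x in [0,1]^2 over 2x2 vertex values."""
    a, b = x
    return ((1 - a) * (1 - b) * params[0, 0] + a * (1 - b) * params[1, 0]
            + (1 - a) * b * params[0, 1] + a * b * params[1, 1])

n_features, n_lattices = 6, 4
subsets = [rng.choice(n_features, size=2, replace=False) for _ in range(n_lattices)]
# Sorting a 2x2 table along both axes makes it non-decreasing in each input,
# a crude way to get monotonic vertex values for this toy example.
vertex_values = [np.sort(np.sort(rng.random((2, 2)), axis=0), axis=1)
                 for _ in range(n_lattices)]

x = rng.random(n_features)                  # pretend these are calibrated features
ensemble_output = np.mean([bilinear_lattice(x[s], p)
                           for s, p in zip(subsets, vertex_values)])
print(ensemble_output)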

Launch and Iterate: Reducing Prediction Churn

no code implementations • NeurIPS 2016 • Mahdi Milani Fard, Quentin Cormier, Kevin Canini, Maya Gupta

Practical applications of machine learning often involve successive training iterations with changes to features and training examples.
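The quantity being reduced here, prediction churn, can be read as the fraction of examples whose predicted label changes between successive model versions. The short sketch below computes that fraction for two placeholder prediction arrays; the paper's exact definition may differ in details.

# Churn between two model versions: fraction of examples whose predicted
# label changes. Prediction arrays here are stand-ins.
import numpy as np

def churn(preds_old, preds_new):
    """Fraction of examples on which the two models disagree."""
    preds_old, preds_new = np.asarray(preds_old), np.asarray(preds_new)
    return float(np.mean(preds_old != preds_new))

old_model_labels = np.array([1, 0, 1, 1, 0, 1])
new_model_labels = np.array([1, 1, 1, 0, 0, 1])
print(churn(old_model_labels, new_model_labels))   # 2 of 6 flipped -> ~0.33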

Regularization Strategies for Quantile Regression

no code implementations • 9 Feb 2021 • Taman Narayan, Serena Wang, Kevin Canini, Maya Gupta

We show that minimizing an expected pinball loss over a continuous distribution of quantiles is a good regularizer even when only predicting a specific quantile.

Fairness, Regression
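The regularizer described above can be approximated by Monte Carlo: sample quantile levels uniformly from (0, 1) and average the standard pinball loss at each sampled level. The sketch below does this with a stand-in predictor; the model and data are hypothetical.

# Expected pinball loss over tau ~ Uniform(0, 1), estimated by sampling.
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    """Standard pinball (quantile) loss at quantile level tau."""
    diff = y_true - y_pred
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

def expected_pinball_loss(y_true, predict_fn, n_samples=64, rng=None):
    """Monte Carlo estimate of the pinball loss averaged over sampled quantiles."""
    if rng is None:
        rng = np.random.default_rng(0)
    taus = rng.uniform(0.0, 1.0, size=n_samples)
    return np.mean([pinball_loss(y_true, predict_fn(t), t) for t in taus])

# Toy example: the "model" predicts the tau-quantile of an observed sample.
rng = np.random.default_rng(1)
y = rng.normal(size=1000)
quantile_model = lambda tau: np.quantile(y, tau)   # hypothetical predictor
print(expected_pinball_loss(y, quantile_model))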

Fast Linear Interpolation for Piecewise-Linear Functions, GAMs, and Deep Lattice Networks

no code implementations • 25 Sep 2019 • Nathan Zhang, Kevin Canini, Sean Silva, Maya R. Gupta

We present fast implementations of linear interpolation operators for both piecewise linear functions and multi-dimensional look-up tables.
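For reference, the two operators in question are 1-D piecewise-linear evaluation and multilinear interpolation over a multi-dimensional look-up table. The NumPy sketch below is a plain baseline implementation of each, not the paper's optimized version.

# Baseline (unoptimized) versions of the two interpolation operators.
import numpy as np
from itertools import product

def pwl_interp(x, keypoints_in, keypoints_out):
    """1-D piecewise-linear function evaluation."""
    return np.interp(x, keypoints_in, keypoints_out)

def multilinear_lookup(x, table):
    """Multilinear interpolation of point x (in cell coordinates) in `table`."""
    lower = np.clip(np.floor(x).astype(int), 0, np.array(table.shape) - 2)
    frac = x - lower
    result = 0.0
    for corner in product([0, 1], repeat=len(x)):   # 2^D corners of the cell
        weight = np.prod(np.where(corner, frac, 1.0 - frac))
        result += weight * table[tuple(lower + np.array(corner))]
    return result

table = np.arange(24, dtype=float).reshape(2, 3, 4)   # toy 3-D look-up table
print(multilinear_lookup(np.array([0.5, 1.25, 2.75]), table))
print(pwl_interp(2.2, np.array([0.0, 1.0, 3.0]), np.array([0.0, 2.0, 4.0])))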
