Search Results for author: Gil I. Shamir

Found 8 papers, 2 papers with code

Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations

2 code implementations • 14 Feb 2022 • Gil I. Shamir, Dong Lin

We describe a novel family of smooth activations, Smooth ReLU (SmeLU), designed to improve reproducibility while keeping the mathematical expression simple and the implementation potentially cheaper.

Click-Through Rate Prediction • Recommendation Systems
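SmeLU's mathematical simplicity is easy to see in code. A minimal NumPy sketch, assuming the piecewise-quadratic form of SmeLU (zero below -β, identity above β, and a quadratic joining them so the function and its first derivative are continuous); `beta` is the half-width of the smooth region:

```python
import numpy as np

def smelu(x, beta=1.0):
    """Smooth ReLU (SmeLU) sketch:
       0                         for x <= -beta
       (x + beta)^2 / (4 * beta) for -beta < x < beta
       x                         for x >= beta
    The quadratic piece meets both linear pieces with matching value
    and slope, making SmeLU continuously differentiable."""
    x = np.asarray(x, dtype=float)
    return np.where(
        x <= -beta, 0.0,
        np.where(x >= beta, x, (x + beta) ** 2 / (4.0 * beta))
    )

# 0 below -beta, quadratic in between, identity above
print(smelu([-2.0, 0.0, 2.0]))
```

Unlike a hard ReLU, small perturbations of pre-activations near zero produce small, smooth changes in output, which is the mechanism the papers credit for improved prediction reproducibility.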

Synthesizing Irreproducibility in Deep Networks

no code implementations • 21 Feb 2021 • Robert R. Snapp, Gil I. Shamir

We show that even with a single nonlinearity and for very simple data and models, irreproducibility occurs.
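The claim can be illustrated with a hedged sketch (not the paper's experiment): two tiny one-hidden-layer ReLU networks, trained on the same data with the same optimizer but different random initializations, end up computing different functions.

```python
import numpy as np

def train_relu_net(seed, X, y, hidden=8, lr=0.02, steps=2000):
    """Train a tiny one-hidden-layer ReLU net with full-batch
    gradient descent on squared error. Only the seed differs
    between runs; data and hyperparameters are identical."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 1.0, (1, hidden))
    W2 = rng.normal(0.0, 1.0, (hidden, 1))
    n = len(X)
    for _ in range(steps):
        h = np.maximum(X @ W1, 0.0)      # ReLU hidden layer
        err = h @ W2 - y                 # prediction error
        gW2 = h.T @ err / n              # backprop through output
        gh = (err @ W2.T) * (h > 0)      # backprop through ReLU
        gW1 = X.T @ gh / n
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2

def predict(W1, W2, X):
    return np.maximum(X @ W1, 0.0) @ W2

X = np.linspace(-1.0, 1.0, 20).reshape(-1, 1)
y = np.abs(X)  # very simple target

p0 = predict(*train_relu_net(seed=0, X=X, y=y), X)
p1 = predict(*train_relu_net(seed=1, X=X, y=y), X)

# Same data, same training procedure, different init:
# the two learned functions are not identical.
print(float(np.max(np.abs(p0 - p1))))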

Low Complexity Approximate Bayesian Logistic Regression for Sparse Online Learning

no code implementations • 28 Jan 2021 • Gil I. Shamir, Wojciech Szpankowski

For huge sparse feature sets, various approximations must be used, and these diminish the theoretical advantages.

Regression • Variational Inference
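For context on why sparsity matters here, a sketch of the plain (non-Bayesian) online logistic regression baseline, in which each update touches only the weights of the features present in the example; the class and feature names are illustrative, not from the paper:

```python
import math
from collections import defaultdict

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class SparseOnlineLogisticRegression:
    """Online (SGD) logistic regression over sparse features.
    Weights are materialized lazily and each update touches only
    the features active in the example, which is what keeps the
    method practical for huge sparse feature sets."""

    def __init__(self, lr=0.1):
        self.lr = lr
        self.w = defaultdict(float)

    def predict_proba(self, x):
        # x: dict of feature id -> value (only nonzero entries)
        z = sum(self.w[f] * v for f, v in x.items())
        return sigmoid(z)

    def update(self, x, y):
        # y in {0, 1}; log-loss gradient is zero for inactive features
        g = self.predict_proba(x) - y
        for f, v in x.items():
            self.w[f] -= self.lr * g * v

model = SparseOnlineLogisticRegression()
for x, y in [({"a": 1.0}, 1), ({"b": 1.0}, 0)] * 50:
    model.update(x, y)
print(model.predict_proba({"a": 1.0}))  # learned to be > 0.5
```

The paper's approximate Bayesian approach maintains uncertainty over such weights rather than point estimates; this sketch shows only the sparse-update structure the approximations must preserve.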

Smooth activations and reproducibility in deep networks

1 code implementation • 20 Oct 2020 • Gil I. Shamir, Dong Lin, Lorenzo Coviello

We propose a new family of activations, Smooth ReLU (SmeLU), designed to give better such tradeoffs while keeping the mathematical expression simple, and thus the implementation cheap.

Anti-Distillation: Improving reproducibility of deep networks

no code implementations • 19 Oct 2020 • Gil I. Shamir, Lorenzo Coviello

Deep networks have been revolutionary in improving performance of machine learning and artificial intelligence systems.

Logistic Regression Regret: What's the Catch?

no code implementations • 7 Feb 2020 • Gil I. Shamir

We derive lower bounds with logarithmic regret under $L_1$, $L_2$, and $L_\infty$ constraints on the parameter values.

