no code implementations • 19 Feb 2023 • Oren Yuval, Saharon Rosset
The main idea is to design different mechanisms for integrating the unlabeled data, and include in each of them a mixing parameter $\alpha$, controlling the weight given to the unlabeled data.
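The described construction can be illustrated schematically: a supervised loss is combined with an $\alpha$-weighted term over the unlabeled points, so that $\alpha = 0$ recovers purely supervised fitting. This is a minimal sketch, not the paper's exact mechanisms; the pseudo-label term and the function name are illustrative assumptions.

```python
import numpy as np

def ssl_objective(beta, X_lab, y_lab, X_unlab, y_pseudo, alpha):
    """Schematic semi-supervised least-squares objective.

    The labeled squared error is combined with an alpha-weighted term on
    the unlabeled points (here using hypothetical pseudo-labels), so
    alpha = 0 recovers the purely supervised objective.
    """
    sup = np.mean((y_lab - X_lab @ beta) ** 2)
    unsup = np.mean((y_pseudo - X_unlab @ beta) ** 2)
    return sup + alpha * unsup
```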
1 code implementation • 7 Jun 2022 • Giora Simchoni, Saharon Rosset
We propose to use the mixed models framework to handle correlated data in DNNs.
1 code implementation • NeurIPS 2021 • Giora Simchoni, Saharon Rosset
Our results show that treating high-cardinality categorical features as random effects leads to a significant improvement in prediction performance compared to state-of-the-art alternatives.
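The random-effects idea can be sketched with the classical one-way random-intercept model, where each category's mean is shrunk toward zero by a factor depending on its sample size, so rare categories are regularized more heavily than frequent ones. This is the textbook BLUP-style estimate, shown only to illustrate the mechanism, not the paper's DNN integration; the function name and variance arguments are assumptions.

```python
import numpy as np

def shrunk_category_effects(y, cats, sigma2_e, sigma2_b):
    """BLUP-style random-effect estimates for one categorical feature.

    Each category mean is shrunk toward zero by
    n_j * sigma2_b / (n_j * sigma2_b + sigma2_e),
    so categories with few observations shrink more.
    """
    effects = {}
    for c in np.unique(cats):
        mask = cats == c
        n_j = mask.sum()
        shrink = n_j * sigma2_b / (n_j * sigma2_b + sigma2_e)
        effects[c] = shrink * y[mask].mean()
    return effects
```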
1 code implementation • 16 Feb 2021 • Assaf Rabinowicz, Saharon Rosset
This paper presents a new approach for tree-based regression methods, such as simple regression trees, random forests and gradient boosting, in settings involving correlated data.
no code implementations • 1 Sep 2020 • Oren Yuval, Saharon Rosset
The key ideas are carefully considering the null model as a competitor, and utilizing the unlabeled data to determine signal-noise combinations where SSL outperforms both supervised learning and the null model.
no code implementations • 19 Mar 2019 • Trevor Hastie, Andrea Montanari, Saharon Rosset, Ryan J. Tibshirani
Interpolators -- estimators that achieve zero training error -- have attracted growing attention in machine learning, mainly because state-of-the-art neural networks appear to be models of this type.
1 code implementation • 25 Jan 2019 • Amit Moscovich, Saharon Rosset
Cross-validation is the de facto standard for predictive model evaluation and selection.
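The standard procedure the abstract refers to can be sketched as plain K-fold cross-validation: partition the data, fit on each training split, and average the held-out squared error. The `fit` and `predict` callables are hypothetical placeholders, and this is only the baseline method, not the paper's contribution.

```python
import numpy as np

def kfold_cv_error(X, y, fit, predict, k=5, seed=0):
    """Plain K-fold cross-validation estimate of squared prediction error."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train], y[train])
        errs.append(np.mean((y[test] - predict(model, X[test])) ** 2))
    return float(np.mean(errs))
```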
1 code implementation • 10 Dec 2018 • Aviv Navon, Saharon Rosset
This setting naturally induces a group structure over the coefficient matrix, in which every explanatory variable corresponds to a set of related coefficients.
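A standard way to exploit such a group structure is a group-lasso-type penalty, where each row of the coefficient matrix (one explanatory variable's set of related coefficients) is shrunk jointly via blockwise soft-thresholding, so whole rows can be zeroed at once. This sketch shows only the generic proximal step under that assumption, not the paper's specific estimator.

```python
import numpy as np

def group_soft_threshold(B, t):
    """Blockwise soft-thresholding: the proximal operator of a
    group-lasso penalty on the rows of B. Each row's Euclidean norm
    is shrunk by t; rows with norm <= t are set exactly to zero."""
    out = np.zeros_like(B)
    for j in range(B.shape[0]):
        nrm = np.linalg.norm(B[j])
        if nrm > t:
            out[j] = (1.0 - t / nrm) * B[j]
    return out
```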
no code implementations • 26 Oct 2018 • Amichai Painsky, Saharon Rosset
In addition, we introduce a theoretically sound lossy compression scheme, which allows us to control the trade-off between the distortion and the coding rate.
no code implementations • 16 Sep 2018 • Amichai Painsky, Saharon Rosset, Meir Feder
Importantly, we show that the overhead of our suggested algorithm (compared with the lower bound) typically decreases, as the scale of the problem grows.
no code implementations • NeurIPS 2018 • Blake Woodworth, Vitaly Feldman, Saharon Rosset, Nathan Srebro
The problem of handling adaptivity in data analysis, intentional or not, permeates a variety of fields, including test-set overfitting in ML challenges and the accumulation of invalid scientific discoveries.
no code implementations • 10 Dec 2015 • Amichai Painsky, Saharon Rosset
The most important consequence of our approach is that categorical variables with many categories can be safely used in tree building and are only chosen if they contribute to predictive power.
no code implementations • 12 Nov 2013 • Shachar Kaufman, Saharon Rosset
Regularization aims to improve prediction performance of a given statistical modeling approach by moving to a second approach which achieves worse training error but is expected to have fewer degrees of freedom, i.e., better agreement between training and prediction error.
no code implementations • NeurIPS 2010 • Ronny Luss, Saharon Rosset, Moni Shahar
A new algorithm for isotonic regression is presented based on recursively partitioning the solution space.
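For context, the problem being solved can be illustrated with the classical pool-adjacent-violators algorithm (PAVA) for one-dimensional isotonic regression; the paper's recursive-partitioning algorithm addresses the more general setting, so this sketch shows only the baseline, not the proposed method.

```python
import numpy as np

def pava(y):
    """Pool-adjacent-violators algorithm for 1-D isotonic regression:
    repeatedly merge adjacent blocks whose means violate monotonicity,
    replacing them with their pooled (weighted) mean."""
    blocks = [[float(v), 1] for v in y]  # each block holds [mean, size]
    i = 0
    while i < len(blocks) - 1:
        if blocks[i][0] > blocks[i + 1][0]:
            m1, n1 = blocks[i]
            m2, n2 = blocks[i + 1]
            blocks[i] = [(m1 * n1 + m2 * n2) / (n1 + n2), n1 + n2]
            del blocks[i + 1]
            if i > 0:
                i -= 1  # a merge may create a new violation to the left
        else:
            i += 1
    return np.concatenate([[m] * n for m, n in blocks])
```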