Search Results for author: Amichai Painsky

Found 16 papers, 2 papers with code

Distribution Estimation under the Infinity Norm

no code implementations • 13 Feb 2024 • Aryeh Kontorovich, Amichai Painsky

A variety of techniques are employed and extended, including Chernoff-type inequalities and empirical Bernstein bounds.
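As an aside, the empirical Bernstein technique named above has a standard form (due to Maurer and Pontil) that is easy to state in code; a minimal sketch for i.i.d. samples in [0, 1], not the paper's actual bound:

import numpy as np

def empirical_bernstein_radius(x, delta):
    # Maurer-Pontil empirical Bernstein confidence radius for the mean of
    # i.i.d. samples in [0, 1]; the true mean lies within this radius of the
    # sample mean with probability at least 1 - delta.
    n = len(x)
    v = np.var(x, ddof=1)                     # sample variance
    log_term = np.log(2.0 / delta)
    return np.sqrt(2.0 * v * log_term / n) + 7.0 * log_term / (3.0 * (n - 1))

rng = np.random.default_rng(0)
x = rng.binomial(1, 0.3, size=1000)           # Bernoulli(0.3) samples
r = empirical_bernstein_radius(x, delta=0.05)
print(f"mean = {x.mean():.3f} +/- {r:.3f} (95% confidence)")

Unlike a Hoeffding radius, this one adapts to the sample variance, which is what makes Bernstein-type bounds attractive for low-variance distributions.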

Optimal detection of non-overlapping images via combinatorial auction

no code implementations • 21 Jan 2024 • Simon Anuk, Tamir Bendory, Amichai Painsky

This paper studies the classical problem of detecting the location of multiple image occurrences in a two-dimensional, noisy measurement.

Confidence Intervals for Unobserved Events

no code implementations • 6 Nov 2022 • Amichai Painsky

Unobserved events are alphabet symbols that do not appear in the sample.
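For context, the classical Good-Turing estimator assigns the unobserved symbols a total probability equal to the fraction of singletons in the sample; a minimal sketch of that point estimate (the paper constructs confidence intervals for this kind of quantity; the sketch is only the classical baseline):

from collections import Counter

def good_turing_missing_mass(sample):
    # Good-Turing estimate of the total probability of symbols that never
    # appear in the sample: (number of singletons) / (sample size).
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / len(sample)

print(good_turing_missing_mass("abracadabra"))  # 'c', 'd' are singletons -> 2/11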

K-sample Multiple Hypothesis Testing for Signal Detection

no code implementations • 23 Sep 2022 • Uriel Shiterburd, Tamir Bendory, Amichai Painsky

This paper studies the classical problem of estimating the locations of signal occurrences in a noisy measurement.

Detecting non-overlapping signals with dynamic programming

1 code implementation • 16 Aug 2022 • Mordechai Roth, Amichai Painsky, Tamir Bendory

This paper studies the classical problem of detecting the locations of signal occurrences in a one-dimensional noisy measurement.
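To illustrate the dynamic-programming idea, here is a sketch under simplifying assumptions (known template, correlation scores, detections kept only when the score is positive); it is an illustrative toy, not necessarily the paper's exact algorithm:

import numpy as np

def detect_nonoverlapping(y, template):
    # Select non-overlapping template occurrences in a 1-D measurement y,
    # maximizing the total correlation score with dynamic programming.
    y, template = np.asarray(y, float), np.asarray(template, float)
    d = len(template)
    n = len(y) - d + 1                       # candidate start positions
    score = np.array([y[i:i + d] @ template for i in range(n)])
    dp = np.zeros(n + 1)                     # dp[i]: best total over starts < i
    take = np.zeros(n, dtype=bool)
    for i in range(1, n + 1):
        prev = dp[max(i - d, 0)]             # last state with no overlap
        if score[i - 1] > 0 and score[i - 1] + prev > dp[i - 1]:
            dp[i] = score[i - 1] + prev
            take[i - 1] = True
        else:
            dp[i] = dp[i - 1]
    starts, i = [], n                        # backtrack the chosen starts
    while i > 0:
        if take[i - 1]:
            starts.append(i - 1)
            i -= d
        else:
            i -= 1
    return sorted(starts)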

Feature Importance in Gradient Boosting Trees with Cross-Validation Feature Selection

no code implementations • 12 Sep 2021 • Afek Ilay Adler, Amichai Painsky

The effect of this selection bias, which favors features with many potential splits, has been studied extensively over the years, mostly in terms of predictive performance.

Feature Importance Feature Selection
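For context, the bias above is often mitigated by scoring importance on held-out data; a minimal scikit-learn sketch comparing in-sample (gain) importances with held-out permutation importances on toy data (an illustration, not the authors' cross-validation procedure):

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.integers(0, 100, n).astype(float),  # uninformative, high-cardinality codes
    rng.normal(size=n),                     # truly informative feature
])
y = X[:, 1] + rng.normal(scale=0.5, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)

print("in-sample gain importances:      ", model.feature_importances_)
perm = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
print("held-out permutation importances:", perm.importances_mean)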

Neural Joint Entropy Estimation

1 code implementation • 21 Dec 2020 • Yuval Shalev, Amichai Painsky, Irad Ben-Gal

Estimating the entropy of a discrete random variable is a fundamental problem in information theory and related fields.

Data Compression Mutual Information Estimation
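For context, the classical baseline here is the maximum-likelihood ("plug-in") estimator, which neural estimators such as the one above aim to improve upon for large alphabets and joint distributions; a minimal sketch:

import numpy as np
from collections import Counter

def plugin_entropy(sample):
    # Maximum-likelihood ('plug-in') entropy estimate in bits:
    # H = -sum(p * log2(p)) over the empirical distribution.
    counts = np.array(list(Counter(sample).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

print(plugin_entropy("aabbbcc"))  # entropy of the empirical dist (2/7, 3/7, 2/7)

The plug-in estimate is negatively biased when the alphabet is large relative to the sample size, which is exactly the regime such papers target.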

Representation Compression and Generalization in Deep Neural Networks

no code implementations • ICLR 2019 • Ravid Shwartz-Ziv, Amichai Painsky, Naftali Tishby

Specifically, we show that the training of the network is characterized by a rapid increase in the mutual information (MI) between the layers and the target label, followed by a longer decrease in the MI between the layers and the input variable.

Information Plane
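For context, information-plane trajectories like these are typically traced with binned (histogram) estimates of mutual information over network activations; a minimal sketch for two scalar variables (an illustrative choice, not necessarily the paper's exact estimator):

import numpy as np

def binned_mutual_information(t, y, bins=30):
    # Histogram estimate of I(T; Y) in bits for two 1-D variables.
    joint, _, _ = np.histogram2d(t, y, bins=bins)
    p_ty = joint / joint.sum()
    p_t = p_ty.sum(axis=1, keepdims=True)     # marginal of T
    p_y = p_ty.sum(axis=0, keepdims=True)     # marginal of Y
    mask = p_ty > 0
    return float((p_ty[mask] * np.log2(p_ty[mask] / (p_t @ p_y)[mask])).sum())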

Non-linear Canonical Correlation Analysis: A Compressed Representation Approach

no code implementations • 31 Oct 2018 • Amichai Painsky, Meir Feder, Naftali Tishby

In this work we introduce an information-theoretic compressed representation framework for the non-linear CCA problem (CRCCA), which extends the classical alternating conditional expectations (ACE) approach.

Dimensionality Reduction Quantization +1
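For context, the classical linear CCA that CRCCA generalizes is available directly in scikit-learn; a minimal sketch on synthetic data with one shared latent signal:

import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
z = rng.normal(size=(500, 1))                 # shared latent signal
X = np.hstack([z + 0.1 * rng.normal(size=(500, 1)), rng.normal(size=(500, 2))])
Y = np.hstack([z + 0.1 * rng.normal(size=(500, 1)), rng.normal(size=(500, 2))])

cca = CCA(n_components=1).fit(X, Y)
u, v = cca.transform(X, Y)
print("first canonical correlation:", np.corrcoef(u[:, 0], v[:, 0])[0, 1])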

Lossless (and Lossy) Compression of Random Forests

no code implementations • 26 Oct 2018 • Amichai Painsky, Saharon Rosset

In addition, we introduce a theoretically sound lossy compression scheme, which allows us to control the trade-off between the distortion and the coding rate.

Clustering

Linear Independent Component Analysis over Finite Fields: Algorithms and Bounds

no code implementations • 16 Sep 2018 • Amichai Painsky, Saharon Rosset, Meir Feder

Importantly, we show that the overhead of our suggested algorithm (compared with the lower bound) typically decreases as the scale of the problem grows.

PhD Dissertation: Generalized Independent Components Analysis Over Finite Alphabets

no code implementations • 13 Sep 2018 • Amichai Painsky

Independent component analysis (ICA) is a statistical method for transforming an observable multi-dimensional random vector into components that are as statistically independent as possible from each other.
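For contrast with the finite-alphabet setting the dissertation studies, classical real-valued linear ICA is a one-liner in scikit-learn; a minimal sketch:

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
S = rng.laplace(size=(1000, 2))           # independent non-Gaussian sources
A = np.array([[1.0, 0.5], [0.3, 1.0]])    # mixing matrix (unknown in practice)
X = S @ A.T                               # observed mixtures

S_hat = FastICA(n_components=2, random_state=0).fit_transform(X)  # recovered sources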

MSc Dissertation: Exclusive Row Biclustering for Gene Expression Using a Combinatorial Auction Approach

no code implementations • 13 Sep 2018 • Amichai Painsky

In this paper we focus on the exclusive row biclustering problem for gene expression data sets, in which each row may belong to only a single bicluster, while columns may participate in multiple ones.

Clustering

Outperforming Good-Turing: Preliminary Report

no code implementations • 6 Jul 2018 • Amichai Painsky, Meir Feder

Estimating a large-alphabet probability distribution from a limited number of samples is a fundamental problem in machine learning and statistics.
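For context, a minimal sketch of the basic (unsmoothed) Good-Turing estimator that work in this line aims to outperform; note that practical variants smooth the frequency-of-frequency counts N_r, since the raw rule assigns probability zero whenever N_{r+1} = 0:

from collections import Counter

def good_turing_probs(sample):
    # Unsmoothed Good-Turing: a symbol observed r times gets probability
    # (r + 1) * N_{r+1} / (N_r * n), where N_r counts symbols seen exactly
    # r times and n is the sample size.
    n = len(sample)
    counts = Counter(sample)
    freq_of_freq = Counter(counts.values())
    return {
        sym: (r + 1) * freq_of_freq.get(r + 1, 0) / (freq_of_freq[r] * n)
        for sym, r in counts.items()
    }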

Gaussian Lower Bound for the Information Bottleneck Limit

no code implementations • 7 Nov 2017 • Amichai Painsky, Naftali Tishby

In this work we introduce a Gaussian lower bound to the IB curve; we find an embedding of the data that maximizes its "Gaussian part", to which we apply the Gaussian Information Bottleneck (GIB).

Cross-Validated Variable Selection in Tree-Based Methods Improves Predictive Performance

no code implementations • 10 Dec 2015 • Amichai Painsky, Saharon Rosset

The most important consequence of our approach is that categorical variables with many categories can be safely used in tree building and are only chosen if they contribute to predictive power.

Variable Selection
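To illustrate the cross-validated selection idea (a sketch of the principle, not the authors' exact algorithm): choose the split variable by held-out performance of a one-feature stump, rather than by in-sample impurity gain, which favors many-valued features:

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

def cv_select_feature(X, y, cv=5):
    # Score each feature by the cross-validated R^2 of a depth-1 tree that
    # uses it alone, then pick the best; a noise feature with many distinct
    # values scores well in-sample but poorly out-of-sample.
    scores = [
        cross_val_score(DecisionTreeRegressor(max_depth=1), X[:, [j]], y, cv=cv).mean()
        for j in range(X.shape[1])
    ]
    return int(np.argmax(scores))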
