Search Results for author: Tamás Linder

Found 6 papers, 0 papers with code

Lossless Transformations and Excess Risk Bounds in Statistical Inference

no code implementations31 Jul 2023 László Györfi, Tamás Linder, Harro Walk

After characterizing lossless transformations, i.e., transformations for which the excess risk is zero for all loss functions, we construct a partitioning test statistic for the hypothesis that a given transformation is lossless and show that for i.i.d. …
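As a hedged sketch of the objects this abstract refers to (the notation X, Y, \ell, T is standard usage and an assumption here, not quoted from the paper): with a label Y predicted from a feature X under a loss \ell, a transformation T of X is lossless when it leaves the Bayes risk unchanged,

    L^*(X) = \inf_{f} \mathbb{E}\,\ell(Y, f(X)), \qquad \Delta(T) = L^*(T(X)) - L^*(X) \ge 0,

i.e., when the excess risk \Delta(T) is zero for every loss \ell.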

On the Rényi Cross-Entropy

no code implementations28 Jun 2022 Ferenc Cole Thierrin, Fady Alajaji, Tamás Linder

The Rényi cross-entropy measure between two distributions, a generalization of the Shannon cross-entropy, was recently used as a loss function for the improved design of deep learning generative adversarial networks.

Gaussian Processes
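As a rough computational illustration of the quantity in this abstract, the sketch below evaluates one commonly used form of the order-alpha Rényi cross-entropy for discrete distributions; the formula in the comments is a common convention and an assumption here, not necessarily the paper's exact definition.

    import numpy as np

    def renyi_cross_entropy(p, q, alpha):
        # Rényi cross-entropy of order alpha between discrete distributions p, q.
        # Assumed form (a common convention, not necessarily the paper's exact one):
        #   H_alpha(P; Q) = (1 / (1 - alpha)) * log( sum_x P(x) * Q(x)**(alpha - 1) )
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        if abs(alpha - 1.0) < 1e-12:
            # The alpha -> 1 limit recovers the Shannon cross-entropy -sum_x P(x) log Q(x).
            return -np.sum(p * np.log(q))
        return np.log(np.sum(p * q ** (alpha - 1.0))) / (1.0 - alpha)

    # Sanity check: for alpha near 1 the value approaches the Shannon cross-entropy.
    p = [0.5, 0.3, 0.2]
    q = [0.4, 0.4, 0.2]
    print(renyi_cross_entropy(p, q, alpha=0.99))
    print(renyi_cross_entropy(p, q, alpha=1.0))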

An Information Bottleneck Problem with Rényi's Entropy

no code implementations29 Jan 2021 Jian-Jia Weng, Fady Alajaji, Tamás Linder

This paper considers an information bottleneck problem with the objective of obtaining a most informative representation of a hidden feature subject to a R\'enyi entropy complexity constraint.

Information Theory
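A hedged sketch of the kind of optimization the abstract describes (the symbols X, Y, Z, R and the Markov constraint are assumptions here, not quoted from the paper): with a hidden feature X observed through Y, one seeks a representation Z of Y solving

    \max_{P_{Z|Y}:\; X - Y - Z} \; I(X; Z) \quad \text{subject to} \quad H_\alpha(Z) \le R,

where H_\alpha(Z) = \frac{1}{1-\alpha} \log \sum_z P_Z(z)^\alpha is the Rényi entropy of order \alpha.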

Signaling Games for Log-Concave Distributions: Number of Bins and Properties of Equilibria

no code implementations15 Dec 2020 Ertan Kazıklı, Serkan Sarıtaş, Sinan Gezici, Tamás Linder, Serdar Yüksel

For sources with two-sided unbounded support, we prove that, for any finite number of bins, there exists a unique equilibrium.

Quantization Information Theory
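For context, a minimal sketch of the standard quadratic cheap-talk setup in which such equilibrium results are usually stated (the bias b and the cost functions are assumptions here, not quoted from the paper): the encoder observes the source realization x and sends a message, the decoder forms an estimate u, and the two players minimize the misaligned costs

    c_e(x, u) = (u - x - b)^2, \qquad c_d(x, u) = (u - x)^2, \qquad b \ne 0,

under which equilibria partition the source support into bins, i.e., act as quantizers.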

Capacity of Generalized Discrete-Memoryless Push-to-Talk Two-Way Channels

no code implementations2 Apr 2019 Jian-Jia Weng, Fady Alajaji, Tamás Linder

In this report, we generalize Shannon's push-to-talk two-way channel (PTT-TWC) by allowing reliable full-duplex transmission as well as noisy reception in the half-duplex (PTT) mode.

Information Theory

Information Extraction Under Privacy Constraints

no code implementations7 Nov 2015 Shahab Asoodeh, Mario Diaz, Fady Alajaji, Tamás Linder

To this end, the so-called rate-privacy function is introduced to quantify the maximal amount of information (measured in terms of mutual information) that can be extracted from Y under a privacy constraint between X and the extracted information, where privacy is measured using either mutual information or maximal correlation.

Quantization
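A hedged sketch of one common form of such a rate-privacy function (the exact constraint set used in the paper may differ): maximize the information extracted from Y over auxiliary variables U that satisfy the Markov chain X - Y - U and a leakage budget \varepsilon,

    g_\varepsilon(X; Y) = \sup_{P_{U|Y}:\; X - Y - U,\; I(X; U) \le \varepsilon} I(Y; U).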
