no code implementations • 31 Jul 2023 • László Györfi, Tamás Linder, Harro Walk
After characterizing lossless transformations, i.e., transformations for which the excess risk is zero for all loss functions, we construct a partitioning test statistic for the hypothesis that a given transformation is lossless and show that for i.i.d. …
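As a rough illustration of the partitioning idea (a generic sketch, not the paper's test, whose statistic targets the losslessness hypothesis directly), the basic building block is an L1 distance between empirical measures over a common finite partition; the equal-width cells, scalar samples, and example below are illustrative assumptions:

```python
import numpy as np

# Generic partitioning-based L1 statistic: the distance between the
# empirical measures of two scalar samples over a shared finite
# partition.  Equal-width cells and this interface are assumptions for
# illustration; the paper's statistic for testing losslessness is more
# involved.
def partition_l1_statistic(sample_a, sample_b, num_cells=20):
    lo = min(sample_a.min(), sample_b.min())
    hi = max(sample_a.max(), sample_b.max())
    edges = np.linspace(lo, hi, num_cells + 1)
    counts_a, _ = np.histogram(sample_a, bins=edges)
    counts_b, _ = np.histogram(sample_b, bins=edges)
    p_a = counts_a / counts_a.sum()
    p_b = counts_b / counts_b.sum()
    return np.abs(p_a - p_b).sum()

rng = np.random.default_rng(0)
same = partition_l1_statistic(rng.normal(size=2000), rng.normal(size=2000))
diff = partition_l1_statistic(rng.normal(size=2000), rng.normal(1.0, 1.0, size=2000))
print(same, diff)  # the statistic is visibly larger under a distribution shift
```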
no code implementations • 28 Jun 2022 • Ferenc Cole Thierrin, Fady Alajaji, Tamás Linder
The Rényi cross-entropy measure between two distributions, a generalization of the Shannon cross-entropy, was recently used as a loss function for the improved design of deep learning generative adversarial networks.
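For context, one standard definition of the Rényi cross-entropy of order alpha between discrete distributions p and q recovers the Shannon cross-entropy as alpha tends to 1; the paper studies particular variants, so the sketch below is illustrative rather than its exact definition:

```python
import numpy as np

# Rényi cross-entropy of order alpha between discrete distributions
# p and q (one standard definition; the paper's variant may differ):
#   H_alpha(p; q) = log( sum_x p(x) * q(x)**(alpha - 1) ) / (1 - alpha)
# As alpha -> 1 this converges to the Shannon cross-entropy
#   -sum_x p(x) * log(q(x)).
def renyi_cross_entropy(p, q, alpha):
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):  # Shannon limit
        return -np.sum(p * np.log(q))
    return np.log(np.sum(p * q ** (alpha - 1.0))) / (1.0 - alpha)

p, q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
print(renyi_cross_entropy(p, q, alpha=2.0))
print(renyi_cross_entropy(p, q, alpha=1.0))  # Shannon cross-entropy
```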
no code implementations • 29 Jan 2021 • Jian-Jia Weng, Fady Alajaji, Tamás Linder
This paper considers an information bottleneck problem with the objective of obtaining a most informative representation of a hidden feature subject to a Rényi entropy complexity constraint.
Information Theory
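The complexity measure in that constraint is the Rényi entropy of the representation; a minimal sketch of the quantity alone (the bottleneck optimization itself, which maximizes the relevant mutual information subject to the entropy budget, is not reproduced here):

```python
import numpy as np

# Rényi entropy of order alpha for a discrete distribution,
#   H_alpha(P) = log( sum_t P(t)**alpha ) / (1 - alpha),
# the complexity measure constraining the representation in the
# bottleneck problem above.
def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                    # zero-mass points contribute nothing
    if np.isclose(alpha, 1.0):      # Shannon limit
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

print(renyi_entropy([0.7, 0.2, 0.1], alpha=2.0))
print(renyi_entropy([0.7, 0.2, 0.1], alpha=1.0))  # Shannon entropy
```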
no code implementations • 15 Dec 2020 • Ertan Kazıklı, Serkan Sarıtaş, Sinan Gezici, Tamás Linder, Serdar Yüksel
For sources with two-sided unbounded support, we prove that, for any finite number of bins, there exists a unique equilibrium.
Quantization • Information Theory
no code implementations • 2 Apr 2019 • Jian-Jia Weng, Fady Alajaji, Tamás Linder
In this report, we generalize Shannon's push-to-talk two-way channel (PTT-TWC) by allowing reliable full-duplex transmission as well as noisy reception in the half-duplex (PTT) mode.
Information Theory
no code implementations • 7 Nov 2015 • Shahab Asoodeh, Mario Diaz, Fady Alajaji, Tamás Linder
To this end, the so-called rate-privacy function is introduced to quantify the maximal amount of information (measured in terms of mutual information) that can be extracted from $Y$ under a privacy constraint between $X$ and the extracted information, where privacy is measured using either mutual information or maximal correlation.
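In symbols, with privacy measured by mutual information, a rate-privacy function of this kind is the constrained maximization below (a sketch up to notational differences; the maximal-correlation variant replaces the constraint accordingly):

```latex
% Rate-privacy function (sketch): the maximal information a randomized
% mapping Z of Y can reveal about Y while leaking at most \varepsilon
% bits of information about X.  Notation is illustrative.
g_\varepsilon(X;Y) \;=\; \sup_{P_{Z\mid Y}\,:\; I(X;Z)\,\le\,\varepsilon} I(Y;Z)
```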