no code implementations • 27 Feb 2024 • Jingwei Zhang, Cheuk Ting Li, Farzan Farnia
The rapid development of generative model frameworks and architectures calls for principled methods to evaluate a model's novelty relative to a reference dataset or to baseline generative models.
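A toy illustration of one generic way to quantify novelty against a reference dataset: count the generated samples whose nearest reference neighbor is farther than a threshold. This is a hedged sketch, not the method of the paper above; the function names and the threshold `tau` are illustrative choices.

```python
# Generic nearest-neighbor novelty score (illustrative, NOT the paper's method).

def nn_distance(x, dataset):
    """Euclidean distance from point x to its nearest neighbor in dataset."""
    return min(sum((a - b) ** 2 for a, b in zip(x, y)) ** 0.5 for y in dataset)

def novelty_fraction(generated, reference, tau):
    """Fraction of generated samples farther than tau from every reference point."""
    return sum(nn_distance(x, reference) > tau for x in generated) / len(generated)

reference = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
generated = [(0.1, 0.0), (5.0, 5.0)]   # one near-copy, one genuinely novel sample
print(novelty_fraction(generated, reference, tau=1.0))  # → 0.5
```

Only the near-copy is counted as non-novel, so half the generated samples score as new.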
no code implementations • 31 Oct 2023 • Mahmoud Hegazy, Rémi Leluc, Cheuk Ting Li, Aymeric Dieuleveut
Compression schemes have been extensively used in Federated Learning (FL) to reduce the communication cost of distributed learning.
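A minimal sketch of the kind of compression step used in FL to cut communication: uniform scalar quantization of a gradient vector, where the client sends only the range and small integer indices. This is a generic illustration under assumed parameters (`levels`), not the specific scheme studied in the paper above.

```python
# Uniform scalar quantization of a gradient vector (generic FL-style sketch).

def quantize(v, levels):
    """Map each coordinate to the nearest of `levels` evenly spaced points
    between min(v) and max(v); transmit only (lo, hi, integer indices)."""
    lo, hi = min(v), max(v)
    step = (hi - lo) / (levels - 1)
    idx = [round((x - lo) / step) for x in v]
    return lo, hi, idx

def dequantize(lo, hi, idx, levels):
    """Reconstruct approximate coordinates from the transmitted indices."""
    step = (hi - lo) / (levels - 1)
    return [lo + i * step for i in idx]

grad = [0.0, 0.3, -1.0, 2.0]
lo, hi, idx = quantize(grad, levels=16)
rec = dequantize(lo, hi, idx, levels=16)

# Reconstruction error is at most half a quantization step per coordinate.
step = (hi - lo) / 15
assert all(abs(a - b) <= step / 2 + 1e-12 for a, b in zip(grad, rec))
```

Each coordinate now costs 4 bits instead of a full float, at the price of a bounded per-coordinate error.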
1 code implementation • 29 Jan 2021 • Cheuk Ting Li
We present a versatile automated theorem proving framework capable of automated discovery, simplification and proofs of inner and outer bounds in network information theory, deduction of properties of information-theoretic quantities (e.g. Wyner and Gács-Körner common information), and discovery of non-Shannon-type inequalities, under a unified framework.
Automated Theorem Proving • Information Theory
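The framework above proves such bounds symbolically; as a toy numeric illustration of the kind of Shannon-type inequality involved, the following checks subadditivity of entropy, $H(X) + H(Y) \ge H(X, Y)$ (equivalently $I(X; Y) \ge 0$), on a random joint pmf. This is only an assumed illustrative check, not part of the framework.

```python
import math
import random

# Numerically check the Shannon-type inequality H(X) + H(Y) >= H(X, Y)
# on a random 3x3 joint pmf (toy illustration only).

def H(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

random.seed(0)
joint = [[random.random() for _ in range(3)] for _ in range(3)]
s = sum(map(sum, joint))
joint = [[x / s for x in row] for row in joint]   # normalize to a pmf

px = [sum(row) for row in joint]                              # marginal of X
py = [sum(joint[i][j] for i in range(3)) for j in range(3)]   # marginal of Y
pxy = [joint[i][j] for i in range(3) for j in range(3)]       # joint as a vector

assert H(px) + H(py) >= H(pxy) - 1e-12   # subadditivity of entropy
```

The symbolic framework certifies such inequalities for all distributions at once; a numeric check like this only confirms one instance.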
1 code implementation • 13 Aug 2020 • Cheuk Ting Li
sequence of random variables $Z_{1},\ldots, Z_{n}$ that contains the same information as $X$, i.e., there exists an injective function $f$ such that $X=f(Z_{1},\ldots, Z_{n})$.
Information Theory • Probability • 94A15, 60F05
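In the special case of a dyadic pmf (all probabilities are powers of 1/2), the bits of an optimal prefix code give exactly such a decomposition: the code is injective, so $X = f(Z_1, \ldots, Z_n)$ where the $Z_i$ are the code bits, which are fair coin flips. The sketch below builds a canonical prefix code for this easy case; the general non-dyadic case treated in the paper requires a more delicate construction.

```python
import math

# Canonical prefix code for a dyadic pmf: symbol with probability p gets a
# codeword of length -log2(p).  Illustrates the dyadic special case only.

def dyadic_code(pmf):
    """Assign codewords of length -log2(p) (canonical prefix-code construction)."""
    lengths = sorted((round(-math.log2(p)), sym) for sym, p in pmf.items())
    code, c, prev = {}, 0, 0
    for l, sym in lengths:
        c <<= (l - prev)                  # extend the running codeword counter
        code[sym] = format(c, 'b').zfill(l)
        c += 1
        prev = l
    return code

pmf = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}
code = dyadic_code(pmf)
# code == {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

# Prefix-free => injective: X is recoverable from its code bits.
words = list(code.values())
assert all(not w2.startswith(w1) for w1 in words for w2 in words if w1 != w2)
```

Under this code the first bit is 0 with probability $p(a) = 1/2$, and similarly each subsequent bit is conditionally fair, so the code bits behave as i.i.d. fair coins.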
1 code implementation • 14 Jun 2020 • Cheuk Ting Li
More precisely, we construct a coupling with entropy within 2 bits of the entropy of the greatest lower bound of $p_{1},\ldots, p_{m}$ with respect to majorization.
Information Theory • Probability
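Majorization, the partial order underlying the result above, is easy to check: $p \preceq q$ iff, after sorting both in decreasing order, every prefix sum of $q$ is at least the corresponding prefix sum of $p$ (the greatest lower bound in the statement is the greatest $q$ with $q \preceq p_i$ for all $i$). A minimal checker, with assumed names:

```python
from itertools import accumulate

# Check whether p is majorized by q (p ⪯ q): sort both in decreasing order
# and compare prefix sums coordinate-wise.

def majorized(p, q, eps=1e-12):
    sp = list(accumulate(sorted(p, reverse=True)))
    sq = list(accumulate(sorted(q, reverse=True)))
    return all(b >= a - eps for a, b in zip(sp, sq))

uniform = [0.25] * 4
point = [1.0, 0.0, 0.0, 0.0]
assert majorized(uniform, point)       # uniform is the minimum of the order
assert not majorized(point, uniform)   # a point mass majorizes everything
```

The uniform distribution is the bottom of this order and a point mass is the top, which matches the intuition that majorization measures concentration.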
no code implementations • 21 Dec 2014 • Kim-Hung Li, Cheuk Ting Li
This phenomenon suggests that when the training set is large, we should either relax the assumption or apply naive Bayes (NB) to a "reduced" data set, for example by using NB as a local model.
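One way to read "NB as a local model" is to fit naive Bayes only on the training points nearest the query, so the conditional-independence assumption need only hold locally. The following is a hedged sketch under that reading, not the paper's exact procedure; the function name, `k`, and the Gaussian per-feature model are illustrative assumptions.

```python
import math

# "Local" naive Bayes sketch: fit a Gaussian NB classifier on only the k
# training points nearest the query, then predict the highest-scoring class.

def local_nb_predict(X, y, query, k=5):
    # Restrict to the k nearest training points (squared Euclidean distance).
    near = sorted(range(len(X)),
                  key=lambda i: sum((a - b) ** 2 for a, b in zip(X[i], query)))[:k]
    Xk, yk = [X[i] for i in near], [y[i] for i in near]
    best, best_score = None, -math.inf
    for c in set(yk):
        rows = [x for x, lab in zip(Xk, yk) if lab == c]
        score = math.log(len(rows) / k)          # log prior from local counts
        for j in range(len(query)):              # independent Gaussian per feature
            vals = [r[j] for r in rows]
            mu = sum(vals) / len(vals)
            var = sum((v - mu) ** 2 for v in vals) / len(vals) + 1e-6
            score += -0.5 * math.log(2 * math.pi * var) - (query[j] - mu) ** 2 / (2 * var)
        if score > best_score:
            best, best_score = c, score
    return best

X = [[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]]
y = [0, 0, 0, 1, 1, 1]
print(local_nb_predict(X, y, [5.05], k=3))  # → 1
```

Fitting on the k-nearest subset trades the global independence assumption for a weaker local one, at the cost of refitting per query.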