1 code implementation • 16 Jan 2024 • Linfeng Ye, Shayan Mohajer Hamidi, Renhao Tan, En-hui Yang
To improve this estimate for KD, in this paper we introduce the concept of conditional mutual information (CMI) into the estimation of BCPD and propose a novel estimator called the maximum CMI (MCMI) method.
no code implementations • 15 Jan 2024 • Shayan Mohajer Hamidi, Linfeng Ye
Deep neural networks (DNNs) can be deceived by adding human-imperceptible perturbations to clean samples.
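A classic instance of such a perturbation is the fast gradient sign method (FGSM). The sketch below applies it to a toy logistic-regression "network" with a hand-derived input gradient; the weights, sample, and epsilon are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_loss(w, x, y):
    # y in {-1, +1}; loss = log(1 + exp(-y * w.x))
    return np.log1p(np.exp(-y * np.dot(w, x)))

def fgsm(w, x, y, eps):
    # gradient of the loss w.r.t. the input (derived by hand for
    # this linear model): dL/dx = -y * sigmoid(-y * w.x) * w
    grad_x = -y * sigmoid(-y * np.dot(w, x)) * w
    # FGSM: step in the sign of the input gradient, bounded by eps
    return x + eps * np.sign(grad_x)

# illustrative example: a correctly classified clean sample
w = np.array([1.0, -2.0, 0.5])
x = np.array([0.3, -0.1, 0.2])   # w.x = 0.6 > 0 -> predicted +1
y = 1
x_adv = fgsm(w, x, y, eps=0.4)
# each coordinate moves by at most eps, yet w.x_adv = -0.8 < 0,
# so the predicted label flips
```

Here the per-coordinate change never exceeds 0.4, but the sign of the logit flips, which is exactly the "human-imperceptible yet deceiving" behavior the abstract refers to.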
no code implementations • 10 Jan 2024 • Shayan Mohajer Hamidi, En-hui Yang
AdaFed adaptively tunes this common direction based on the values of local gradients and loss functions.
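One simple way to realize such adaptive tuning is sketched below: normalized client gradients are weighted in proportion to their current loss values, so clients that are currently worse off pull the common direction toward themselves. This loss-proportional rule is an assumption chosen for illustration, not AdaFed's actual update.

```python
import numpy as np

def common_direction(client_grads, client_losses, eps=1e-12):
    # normalize each client's gradient so that no single client
    # dominates by gradient magnitude alone
    g = np.stack([gi / (np.linalg.norm(gi) + eps) for gi in client_grads])
    # weight clients in proportion to their current loss values
    # (hypothetical rule, for illustration only)
    w = np.asarray(client_losses, dtype=float)
    w = w / w.sum()
    return w @ g   # loss-weighted common update direction

grads = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
losses = [3.0, 1.0]
d = common_direction(grads, losses)  # -> [0.75, 0.25]
```

With orthogonal unit gradients and losses 3 and 1, the direction tilts 3:1 toward the higher-loss client while remaining a descent direction for both.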
no code implementations • 17 Sep 2023 • En-hui Yang, Shayan Mohajer Hamidi, Linfeng Ye, Renhao Tan, Beverly Yang
The concepts of conditional mutual information (CMI) and normalized conditional mutual information (NCMI) are introduced to measure the concentration and separation performance of a classification deep neural network (DNN) in its output probability distribution space. Here, CMI and the ratio between CMI and NCMI represent the intra-class concentration and inter-class separation of the DNN, respectively.
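The intra-class concentration term can be read as the average divergence between each sample's output distribution and the centroid distribution of its class. The sketch below computes that quantity; the function name, the use of KL divergence, and the toy inputs are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def intra_class_cmi(probs, labels, eps=1e-12):
    """Sketch of CMI as the mean KL divergence between each sample's
    output distribution and its class's centroid distribution.

    probs:  (n, k) array of softmax outputs
    labels: (n,) array of ground-truth class indices
    """
    total = 0.0
    for c in np.unique(labels):
        p = probs[labels == c]      # outputs of class-c samples
        q = p.mean(axis=0)          # class centroid distribution
        # KL(p_i || q) summed over the k output classes and samples
        total += np.sum(p * (np.log(p + eps) - np.log(q + eps)))
    return total / len(labels)

# perfectly concentrated classes give (near-)zero CMI
probs = np.array([[0.9, 0.1], [0.9, 0.1], [0.2, 0.8], [0.2, 0.8]])
labels = np.array([0, 0, 1, 1])
```

When every sample in a class produces the same output distribution, each sample coincides with its class centroid and the measure is zero; spreading the outputs within a class increases it, matching the "concentration" reading above.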
no code implementations • 24 Nov 2021 • Linfeng Ye, Shayan Mohajer Hamidi
At the same time, attacking a neural network is key to improving its robustness.