no code implementations • 15 Apr 2024 • Kai Yi, Nidham Gazagnadou, Peter Richtárik, Lingjuan Lyu
Interest in federated learning has surged in recent research due to its unique ability to train a global model while the training data remains private and local to each client.
1 code implementation • 20 Mar 2024 • Yizhu Wen, Kai Yi, Jing Ke, Yiqing Shen
Specifically, DiffImpute is trained on complete tabular datasets, ensuring that it can produce credible imputations for missing entries without undermining the authenticity of the existing data.
no code implementations • 14 Mar 2024 • Kai Yi, Georg Meinhardt, Laurent Condat, Peter Richtárik
Federated Learning (FL) has garnered increasing attention because it allows heterogeneous clients to process their private data locally and interact with a central server, while preserving privacy.
1 code implementation • ICCV 2023 • Wenxuan Zhang, Paul Janson, Kai Yi, Ivan Skorokhodov, Mohamed Elhoseiny
The GRW loss augments the training by continually encouraging the model to generate realistic and characterized samples to represent the unseen space.
1 code implementation • NeurIPS 2023 • Kai Yi, Bingxin Zhou, Yiqing Shen, Pietro Liò, Yu Guang Wang
In contrast, diffusion probabilistic models, as an emerging genre of generative approaches, offer the potential to generate a diverse set of sequence candidates for determined protein backbones.
1 code implementation • 22 May 2023 • Kai Yi, Laurent Condat, Peter Richtárik
Federated Learning is an evolving machine learning paradigm, in which multiple clients perform computations based on their individual private data, interspersed by communication with a remote server.
no code implementations • 13 Apr 2023 • Bingxin Zhou, Outongyi Lv, Kai Yi, Xinye Xiong, Pan Tan, Liang Hong, Yu Guang Wang
Directed evolution, a widely used engineering strategy, faces obstacles in finding desired mutants within the massive space of candidate modifications.
1 code implementation • 9 Jul 2022 • Grigory Malinovsky, Kai Yi, Peter Richtárik
We study distributed optimization methods based on the local training (LT) paradigm: achieving communication efficiency by performing richer local gradient-based training on the clients before parameter averaging.
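The local-training paradigm described above can be sketched in a few lines. The example below is a toy illustration, not the paper's method: it assumes scalar quadratic client objectives f_i(x) = 0.5 * (x - b_i)^2 and plain gradient descent, with all names and constants chosen for illustration.

```python
# Toy sketch of the local-training (LT) paradigm: each client runs several
# local gradient steps before the server averages the resulting parameters.
def local_training_round(x_global, client_targets, local_steps=5, lr=0.1):
    """One communication round: clients train locally from the shared
    global model, then the server averages the local models."""
    local_models = []
    for b in client_targets:
        x = x_global
        for _ in range(local_steps):
            x -= lr * (x - b)  # gradient of 0.5 * (x - b)^2 is (x - b)
        local_models.append(x)
    return sum(local_models) / len(local_models)  # parameter averaging

targets = [1.0, 3.0]  # each client's local optimum (hypothetical data)
x = 0.0
for _ in range(30):
    x = local_training_round(x, targets)
# x converges toward the average of the client optima, 2.0
```

Running more local steps per round reduces communication at the cost of some client drift, which is the trade-off this line of work analyzes.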
no code implementations • 17 Jun 2022 • Kai Yi, Jialin Chen, Yu Guang Wang, Bingxin Zhou, Pietro Liò, Yanan Fan, Jan Hamann
This paper develops a rotation-invariant needlet convolution for rotation group SO(3) to distill multiscale information of spherical signals.
1 code implementation • 11 Jun 2022 • Yuelin Wang, Kai Yi, Xinliang Liu, Yu Guang Wang, Shi Jin
Neural message passing is a basic feature-extraction unit for graph-structured data that incorporates neighboring node features as information propagates from one layer to the next.
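As a generic illustration of the message-passing idea (not the specific propagation scheme proposed in the paper), one layer can be sketched as each node averaging messages from its neighbors and combining them with its own feature; the combine rule and graph here are hypothetical.

```python
# Minimal sketch of one message-passing layer on a toy graph,
# assuming mean aggregation over neighbors and a simple combine rule.
def message_passing_step(features, adjacency):
    """Update each node by mixing its feature with the mean of its
    neighbors' features (one layer of propagation)."""
    new_features = {}
    for node, feat in features.items():
        neighbor_feats = [features[n] for n in adjacency[node]]
        message = sum(neighbor_feats) / len(neighbor_feats)  # aggregate
        new_features[node] = 0.5 * (feat + message)          # combine
    return new_features

# Toy path graph: 0 - 1 - 2, with scalar node features
adj = {0: [1], 1: [0, 2], 2: [1]}
feats = {0: 0.0, 1: 1.0, 2: 2.0}
feats = message_passing_step(feats, adj)
```

Stacking such layers smooths features across the graph, which is also why deep message passing can suffer from over-smoothing.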
1 code implementation • 9 May 2022 • Laurent Condat, Kai Yi, Peter Richtárik
Our general approach works with a new, larger class of compressors, which has two parameters, the bias and the variance, and includes unbiased and biased compressors as particular cases.
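Two standard members of this compressor family are sketched below, purely as background examples rather than the paper's operators: Top-k is biased (it deterministically drops small coordinates), while scaled Rand-k is unbiased (its expectation equals the input) but has variance.

```python
import random

def top_k(vec, k):
    """Biased compressor: keep the k largest-magnitude coordinates,
    zero out the rest."""
    keep = sorted(range(len(vec)), key=lambda i: abs(vec[i]), reverse=True)[:k]
    return [v if i in keep else 0.0 for i, v in enumerate(vec)]

def rand_k(vec, k, rng=random):
    """Unbiased compressor: keep k uniformly random coordinates,
    scaled by d/k so that E[rand_k(v)] = v."""
    d = len(vec)
    keep = set(rng.sample(range(d), k))
    return [v * d / k if i in keep else 0.0 for i, v in enumerate(vec)]

v = [3.0, -1.0, 0.5, 2.0]
print(top_k(v, 2))  # keeps the entries with magnitudes 3.0 and 2.0
```

A class parameterized by bias and variance covers both kinds at once, which is what lets a single analysis handle them as particular cases.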
1 code implementation • 2 Mar 2022 • Kai Yi, Xiaoqian Shen, Yunhao Gou, Mohamed Elhoseiny
The main question we address in this paper is how to scale up visual recognition of unseen classes, also known as zero-shot learning, to tens of thousands of categories as in the ImageNet-21K benchmark.
no code implementations • 24 Dec 2021 • Kai Yi, Paul Janson, Wenxuan Zhang, Mohamed Elhoseiny
Accordingly, we propose a Domain-Invariant Network (DIN) to learn factorized features for shifting domains and improved textual representation for unseen classes.
no code implementations • 27 Jun 2021 • Kai Yi, Jianye Pang, Yungeng Zhang, Xiangrui Zeng, Min Xu
Cryo-electron tomography (Cryo-ET) is a 3D imaging technique that enables the systematic study of the shape, abundance, and distribution of macromolecular structures in single cells at near-atomic resolution.
1 code implementation • 20 Apr 2021 • Divyansh Jha, Kai Yi, Ivan Skorokhodov, Mohamed Elhoseiny
By generating representations of unseen classes based on their semantic descriptions, e.g., attributes or text, generative ZSL attempts to differentiate unseen from seen categories.
1 code implementation • CVPR 2022 • Jun Chen, Han Guo, Kai Yi, Boyang Li, Mohamed Elhoseiny
To the best of our knowledge, this is the first work that improves the data efficiency of image captioning by utilizing a language model pretrained on unimodal data.
2 code implementations • 1 Jan 2021 • Mohamed Elhoseiny, Kai Yi, Mohamed Elfeki
To improve the discriminative power of ZSL, we model the visual learning process of unseen categories with inspiration from the psychology of human creativity for producing novel art.
no code implementations • 12 Aug 2020 • Jianye Pang, Kai Yi, Wanguang Yin, Min Xu
In this technical report, we analyze Legendre decomposition for non-negative tensors in both theory and application.
1 code implementation • 31 Jan 2020 • Kai Yi, Yi Guo, Yanan Fan, Jan Hamann, Yu Guang Wang
The noise of the CMB map has a significant impact on the estimation precision for cosmological parameters.
2 code implementations • 31 Jan 2020 • Nicole Hallett, Kai Yi, Josef Dick, Christopher Hodge, Gerard Sutton, Yu Guang Wang, Jingjing You
Currently, there is no cure for keratoconus (KC) other than corneal transplantation at an advanced stage, or corneal cross-linking, which can only halt KC progression.
no code implementations • 13 Mar 2018 • Kai Yi, Zhiqiang Jian, Shitao Chen, Nanning Zheng
At present, the performance of deep neural networks in general object detection is comparable to, or even surpasses, that of human beings.