no code implementations • CVPR 2023 • Jun Chen, Ming Hu, Darren J. Coker, Michael L. Berumen, Blair Costelloe, Sara Beery, Anna Rohrbach, Mohamed Elhoseiny
Monitoring animal behavior can facilitate conservation efforts by providing key insights into wildlife health, population status, and ecosystem function.
no code implementations • 18 May 2023 • Ming Hu, Zhihao Yue, Zhiwei Ling, Yihao Huang, Cheng Chen, Xian Wei, Yang Liu, Mingsong Chen
Although Federated Learning (FL) enables global model training across clients without compromising their raw data, existing Federated Averaging (FedAvg)-based methods suffer from low inference performance, especially when data are unevenly distributed among clients.
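The FedAvg aggregation step these methods build on can be sketched as a data-size-weighted average of client parameters. A minimal illustration (not the paper's code; `fedavg_aggregate` is a hypothetical helper name):

```python
import numpy as np

def fedavg_aggregate(client_weights, client_sizes):
    """FedAvg aggregation: average client parameter vectors, weighted by data share."""
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()          # each client's share of the total data
    stacked = np.stack(client_weights)    # shape: (num_clients, num_params)
    return (coeffs[:, None] * stacked).sum(axis=0)

# Two clients with unequal data: the larger client dominates the global model.
global_w = fedavg_aggregate([np.array([1.0, 0.0]), np.array([0.0, 1.0])], [3, 1])
# global_w == [0.75, 0.25]
```

With uneven data, the aggregate is pulled toward the large client's optimum, which is one intuition behind the low-performance problem the paper targets.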
no code implementations • 18 May 2023 • Di Yang, Yihao Huang, Qing Guo, Felix Juefei-Xu, Ming Hu, Yang Liu, Geguang Pu
The adversarial patch attack aims to fool image classifiers by applying arbitrary changes within a bounded, contiguous region, posing a real threat to computer vision systems (e.g., autonomous driving, content moderation, biometric authentication, medical imaging) in the physical world.
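The patch threat model can be sketched as follows: the attacker controls every pixel inside one contiguous region and nothing outside it (a minimal illustration; `apply_patch` is a hypothetical helper, not from the paper):

```python
import numpy as np

def apply_patch(image, patch, top, left):
    """Overwrite a bounded, contiguous region of `image` with `patch`.

    Pixels outside the region are untouched, matching the patch threat model.
    """
    out = image.copy()
    h, w = patch.shape[:2]
    out[top:top + h, left:left + w] = patch
    return out

# A 2x2 adversarial patch pasted into a 4x4 image at row 1, col 1.
attacked = apply_patch(np.zeros((4, 4)), np.ones((2, 2)), top=1, left=1)
```

In a full attack, the patch contents would be optimized to maximize the classifier's loss; here only the spatial constraint is shown.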
1 code implementation • 8 May 2023 • Peng Xia, Di Xu, Lie Ju, Ming Hu, Jun Chen, ZongYuan Ge
Long-tailed multi-label visual recognition (LTML) is a highly challenging task due to label co-occurrence and imbalanced data distribution.
Ranked #1 on Long-tail Learning on COCO-MLT (using extra training data)
no code implementations • 27 Feb 2023 • Anran Li, Rui Liu, Ming Hu, Luu Anh Tuan, Han Yu
Federated learning (FL) enables multiple data owners to build machine learning models collaboratively without exposing their private local data.
no code implementations • 28 Jan 2023 • Pengyu Zhang, Yingbo Zhou, Ming Hu, Xin Fu, Xian Wei, Mingsong Chen
Based on the concept of Continual Learning (CL), we prove that CyclicFL approximates existing centralized pre-training methods in terms of classification and prediction performance.
no code implementations • 5 Dec 2022 • Jun Xia, Yi Zhang, Zhihao Yue, Ming Hu, Xian Wei, Mingsong Chen
Federated learning (FL) has been recognized as a privacy-preserving distributed machine learning paradigm that enables knowledge sharing among various heterogeneous Artificial Intelligence of Things (AIoT) devices through centralized global model aggregation.
no code implementations • 22 Nov 2022 • Ming Hu, Zeke Xia, Zhihao Yue, Jun Xia, Yihao Huang, Yang Liu, Mingsong Chen
Unlike traditional FL, the cloud server of GitFL maintains a master model (i.e., the global model) together with a set of branch models representing the trained local models committed by selected devices. The master model is updated based on all pushed branch models and their version information, and only the branch models resulting from the pull operation are dispatched to devices.
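The version-aware merge of branch models into the master can be sketched as a weighted average over branches. This is a hypothetical illustration: the weighting rule below (newer versions count more) is an assumption for the sketch, not the paper's actual scheme:

```python
import numpy as np

def merge_branches(branch_models, versions):
    """Merge branch models into a master model, weighted by version number.

    Assumed (illustrative) rule: a branch with a higher version number
    contributes proportionally more to the master model.
    """
    v = np.asarray(versions, dtype=float)
    weights = v / v.sum()
    stacked = np.stack(branch_models)     # shape: (num_branches, num_params)
    return (weights[:, None] * stacked).sum(axis=0)

# A fresh branch (version 3) outweighs a stale one (version 1).
master = merge_branches([np.array([2.0, 2.0]), np.array([0.0, 0.0])], [3, 1])
```

The Git-like push/pull framing in the paper decouples when devices commit updates from when the master model incorporates them.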
no code implementations • 20 Nov 2022 • Ningyuan Chen, Ming Hu, Wenhao Li
In view of such a conflict, we provide a general analytical framework to study the augmentation of algorithmic decisions with human knowledge: the analyst uses the knowledge to set a guardrail by which the algorithmic decision is clipped whenever the algorithmic output falls out of bounds and appears unreasonable.
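The guardrail mechanism itself is simple clipping. A minimal sketch (hypothetical function and bounds, for illustration only):

```python
def apply_guardrail(algo_output, lower, upper):
    """Clip the algorithmic decision to the analyst's guardrail [lower, upper].

    Inside the bounds the algorithm's output is trusted as-is; outside,
    the human-set bound overrides it.
    """
    return min(max(algo_output, lower), upper)

# E.g., an ordering algorithm suggests 120 units, but the analyst caps
# orders at 100: the implemented decision is 100.
decision = apply_guardrail(120, lower=0, upper=100)
```

The paper's analysis concerns when such clipping improves on the raw algorithmic decision; the code shows only the mechanism.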
no code implementations • 15 Oct 2022 • Ming Hu, Peiheng Zhou, Zhihao Yue, Zhiwei Ling, Yihao Huang, Yang Liu, Mingsong Chen
Due to its remarkable performance in preserving data privacy in decentralized data scenarios, Federated Learning (FL) has been considered a promising distributed machine learning paradigm for dealing with the data silo problem.
no code implementations • 16 Aug 2022 • Ming Hu, Zhihao Yue, Zhiwei Ling, Xian Wei, Mingsong Chen
Worse still, in each round of FL training, FedAvg dispatches the same initial local models to clients, which can easily cause the search for optimal global models to become stuck in local optima.
1 code implementation • 1 Jun 2022 • Jun Chen, Ming Hu, Boyang Li, Mohamed Elhoseiny
After finetuning the pretrained LoMaR on 384$\times$384 images, it can reach 85.4% top-1 accuracy, surpassing MAE by 0.6%.
1 code implementation • 24 May 2022 • Zhiwei Ling, Zhihao Yue, Jun Xia, Ming Hu, Ting Wang, Mingsong Chen
Along with the popularity of Artificial Intelligence (AI) and the Internet of Things (IoT), Federated Learning (FL) has attracted steadily increasing attention as a promising distributed machine learning paradigm that enables the training of a central model on numerous decentralized devices without exposing their private data.
1 code implementation • 9 May 2022 • Zhihao Yue, Jun Xia, Zhiwei Ling, Ming Hu, Ting Wang, Xian Wei, Mingsong Chen
Due to the popularity of Artificial Intelligence (AI) techniques, we are witnessing an increasing number of backdoor injection attacks designed to maliciously threaten Deep Neural Networks (DNNs) by causing misclassification.
1 code implementation • 27 Mar 2022 • Yong Zhao, Edirisuriya M. Dilanga Siriwardane, Zhenyao Wu, Nihang Fu, Mohammed Al-Fahdi, Ming Hu, Jianjun Hu
Discovering new materials is a challenging task in materials science that is crucial to the progress of human society.
no code implementations • 23 Feb 2022 • Ming Hu, Tian Liu, Zhiwei Ling, Zhihao Yue, Mingsong Chen
As a promising distributed machine learning paradigm, Federated Learning (FL) enables all the involved devices to train a global model collaboratively without exposing their local data privacy.
no code implementations • 10 Nov 2021 • Nghia Nguyen, Steph-Yves Louis, Lai Wei, Kamal Choudhary, Ming Hu, Jianjun Hu
Our work demonstrates the capability of deep graph neural networks to learn to predict phonon spectrum properties of crystal structures, in addition to phonon density of states (DOS) and electronic DOS, for which the output dimension is constant.
no code implementations • 26 Jan 2021 • Yandong Sun, Yanguang Zhou, Ramya Gurunathan, Jin-Yu Zhang, Ming Hu, Wei Liu, Ben Xu, G. Jeffrey Snyder
Strain engineering is critical to the performance enhancement of electronic and thermoelectric devices because of its influence on the material thermal conductivity.
Materials Science
no code implementations • 18 Oct 2020 • Yandong Sun, Yanguang Zhou, Ming Hu, G. Jeffrey Snyder, Ben Xu, Wei Liu
In this study, the 1D McKelvey-Shockley phonon BTE method was extended to model inhomogeneous materials, where the effect of defects on the phonon MFPs is explicitly obtained.
Materials Science • Computational Physics • 80A05 • I.6.0
no code implementations • 17 Mar 2020 • Yong Zhao, Kunpeng Yuan, Yinqiao Liu, Steph-Yves Louis, Ming Hu, Jianjun Hu
Extensive benchmark experiments over 2,170 Fm-3m face-centered-cubic (FCC) materials show that our ECD-based CNNs can achieve good performance for elasticity prediction.
no code implementations • 26 Feb 2020 • Yuqi Song, Joseph Lindsay, Yong Zhao, Alireza Nasiri, Steph-Yves Louis, Jie Ling, Ming Hu, Jianjun Hu
Noncentrosymmetric materials play a critical role in many important applications such as laser technology, communication systems, quantum computing, and cybersecurity.
no code implementations • 12 Nov 2019 • Yabo Dan, Yong Zhao, Xiang Li, Shaobo Li, Ming Hu, Jianjun Hu
The percentage of chemically valid (charge-neutral and electronegativity-balanced) samples out of all generated ones reaches 84.5% for our GAN when trained with materials from the ICSD, even though no such chemical rules are explicitly enforced in our GAN model, indicating its capability to learn implicit chemical composition rules.
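The charge-neutrality half of this validity check can be sketched directly. A minimal illustration with an assumed single oxidation state per element (real validators, e.g. those built on pymatgen, try combinations of common oxidation states; `is_charge_neutral` is a hypothetical helper):

```python
def is_charge_neutral(composition, oxidation_states):
    """Check charge neutrality for a composition given as {element: count}.

    `oxidation_states` maps each element to one assumed oxidation state;
    the composition is valid (for this check) if the charges sum to zero.
    """
    total = sum(count * oxidation_states[el] for el, count in composition.items())
    return total == 0

# NaCl balances (+1 and -1); Na2Cl does not.
is_charge_neutral({"Na": 1, "Cl": 1}, {"Na": +1, "Cl": -1})   # True
is_charge_neutral({"Na": 2, "Cl": 1}, {"Na": +1, "Cl": -1})   # False
```

The electronegativity-balance criterion is a separate check (cations less electronegative than anions) and is omitted here.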
no code implementations • WS 2018 • Dana Abu Ali, Muaz Ahmad, Hayat Al Hassan, Paula Dozsa, Ming Hu, Jose Varias, Nizar Habash
This demonstration paper presents a bilingual (Arabic-English) interactive human avatar dialogue system.
no code implementations • 6 Apr 2017 • Zhe Sun, Ting Wang, Ke Deng, Xiao-Feng Wang, Robert Lafyatis, Ying Ding, Ming Hu, Wei Chen
More importantly, as a model-based approach, DIMM-SC is able to quantify the clustering uncertainty for each single cell, facilitating rigorous statistical inference and biological interpretations, which are typically unavailable from existing clustering methods.