Search Results for author: Ming Hu

Found 33 papers, 7 papers with code

Personalized Federated Instruction Tuning via Neural Architecture Search

no code implementations26 Feb 2024 Pengyu Zhang, Yingbo Zhou, Ming Hu, Junxian Feng, Jiawen Weng, Mingsong Chen

Federated Instruction Tuning (FIT) has shown the ability to achieve collaborative model instruction tuning among massive data owners without sharing private data.

Neural Architecture Search

MIP: CLIP-based Image Reconstruction from PEFT Gradients

no code implementations26 Feb 2024 Peiheng Zhou, Ming Hu, Xiaofei Xie, Yihao Huang, Kangjie Chen, Mingsong Chen

Contrastive Language-Image Pre-training (CLIP) model, as an effective pre-trained multimodal neural network, has been widely used in distributed machine learning tasks, especially Federated Learning (FL).

Federated Learning Image Reconstruction +1

Structure-based out-of-distribution (OOD) materials property prediction: a benchmark study

1 code implementation16 Jan 2024 Sadman Sadeed Omee, Nihang Fu, Rongzhi Dong, Ming Hu, Jianjun Hu

In real-world material research, machine learning (ML) models are usually expected to predict and discover novel exceptional materials that deviate from the known materials.

Property Prediction

AdapterFL: Adaptive Heterogeneous Federated Learning for Resource-constrained Mobile Computing Systems

no code implementations23 Nov 2023 Ruixuan Liu, Ming Hu, Zeke Xia, Jun Xia, Pengyu Zhang, Yihao Huang, Yang Liu, Mingsong Chen

On the one hand, to achieve model training across all the diverse clients, mobile computing systems can only use small, low-performance models for collaborative learning.

Federated Learning

AdaptiveFL: Adaptive Heterogeneous Federated Learning for Resource-Constrained AIoT Systems

no code implementations22 Nov 2023 Chentao Jia, Ming Hu, Zekai Chen, Yanxin Yang, Xiaofei Xie, Yang Liu, Mingsong Chen

Although Federated Learning (FL) is promising to enable collaborative learning among Artificial Intelligence of Things (AIoT) devices, it suffers from the problem of low classification performance due to various heterogeneity factors (e.g., computing capacity, memory size) of devices and uncertain operating environments.

Federated Learning

Have Your Cake and Eat It Too: Toward Efficient and Accurate Split Federated Learning

no code implementations22 Nov 2023 Dengke Yan, Ming Hu, Zeke Xia, Yanxin Yang, Jun Xia, Xiaofei Xie, Mingsong Chen

However, due to data heterogeneity and stragglers, SFL suffers from the challenges of low inference accuracy and low efficiency.

Federated Learning

MammalNet: A Large-scale Video Benchmark for Mammal Recognition and Behavior Understanding

no code implementations CVPR 2023 Jun Chen, Ming Hu, Darren J. Coker, Michael L. Berumen, Blair Costelloe, Sara Beery, Anna Rohrbach, Mohamed Elhoseiny

Monitoring animal behavior can facilitate conservation efforts by providing key insights into wildlife health, population status, and ecosystem function.

Personalization as a Shortcut for Few-Shot Backdoor Attack against Text-to-Image Diffusion Models

no code implementations18 May 2023 Yihao Huang, Felix Juefei-Xu, Qing Guo, Jie Zhang, Yutong Wu, Ming Hu, Tianlin Li, Geguang Pu, Yang Liu

Although recent personalization methods have democratized high-resolution image synthesis by enabling swift concept acquisition with minimal examples and lightweight computation, they also present an exploitable avenue for highly accessible backdoor attacks.

Backdoor Attack Image Generation

FedMR: Federated Learning via Model Recombination

no code implementations18 May 2023 Ming Hu, Zhihao Yue, Zhiwei Ling, Yihao Huang, Cheng Chen, Xian Wei, Yang Liu, Mingsong Chen

Although Federated Learning (FL) enables global model training across clients without compromising their raw data, existing Federated Averaging (FedAvg)-based methods suffer from the problem of low inference performance, especially for unevenly distributed data among clients.

Federated Learning

Architecture-agnostic Iterative Black-box Certified Defense against Adversarial Patches

no code implementations18 May 2023 Di Yang, Yihao Huang, Qing Guo, Felix Juefei-Xu, Ming Hu, Yang Liu, Geguang Pu

The adversarial patch attack aims to fool image classifiers within a bounded, contiguous region of arbitrary changes, posing a real threat to computer vision systems (e.g., autonomous driving, content moderation, biometric authentication, medical imaging) in the physical world.

Autonomous Driving

LMPT: Prompt Tuning with Class-Specific Embedding Loss for Long-tailed Multi-Label Visual Recognition

1 code implementation8 May 2023 Peng Xia, Di Xu, Lie Ju, Ming Hu, Jun Chen, ZongYuan Ge

Long-tailed multi-label visual recognition (LTML) is a highly challenging task due to label co-occurrence and imbalanced data distribution.

Ranked #1 on Long-tail Learning on COCO-MLT (using extra training data)

Long-tail Learning

Towards Interpretable Federated Learning

no code implementations27 Feb 2023 Anran Li, Rui Liu, Ming Hu, Luu Anh Tuan, Han Yu

Federated learning (FL) enables multiple data owners to build machine learning models collaboratively without exposing their private local data.

Federated Learning

CyclicFL: A Cyclic Model Pre-Training Approach to Efficient Federated Learning

no code implementations28 Jan 2023 Pengyu Zhang, Yingbo Zhou, Ming Hu, Xin Fu, Xian Wei, Mingsong Chen

Based on the concept of Continual Learning (CL), we prove that CyclicFL approximates existing centralized pre-training methods in terms of classification and prediction performance.

Continual Learning Federated Learning

HierarchyFL: Heterogeneous Federated Learning via Hierarchical Self-Distillation

no code implementations5 Dec 2022 Jun Xia, Yi Zhang, Zhihao Yue, Ming Hu, Xian Wei, Mingsong Chen

Federated learning (FL) has been recognized as a privacy-preserving distributed machine learning paradigm that enables knowledge sharing among various heterogeneous Artificial Intelligence of Things (AIoT) devices through centralized global model aggregation.

Federated Learning Privacy Preserving

GitFL: Adaptive Asynchronous Federated Learning using Version Control

no code implementations22 Nov 2022 Ming Hu, Zeke Xia, Zhihao Yue, Jun Xia, Yihao Huang, Yang Liu, Mingsong Chen

Unlike traditional FL, the cloud server of GitFL maintains a master model (i.e., the global model) together with a set of branch models that record the trained local models committed by selected devices. The master model is updated based on both all the pushed branch models and their version information, and only the branch models after the pull operation are dispatched to devices.

Federated Learning Reinforcement Learning (RL)
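The master/branch update described in the GitFL snippet above can be sketched as a version-weighted merge. This is a minimal illustration, not the paper's implementation: the function names are hypothetical, and GitFL's actual weighting also involves reinforcement-learning-guided device selection, which is omitted here.

```python
def update_master(master, branches):
    """Merge pushed branch models into the master model.

    master:   list of floats (current global model weights; unused in
              this simple sketch, kept to mirror the described workflow).
    branches: list of (weights, version) pairs, where a higher version
              count means the branch has been trained more times.

    Assumption (not from the paper): each branch contributes in
    proportion to its version number.
    """
    total = sum(version for _, version in branches)
    merged = [0.0] * len(branches[0][0])
    for weights, version in branches:
        for i, w in enumerate(weights):
            merged[i] += (version / total) * w
    return merged
```

For two branches of equal version, this reduces to plain FedAvg-style averaging; branches with more recorded training rounds pull the master model more strongly.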

Algorithmic Decision-Making Safeguarded by Human Knowledge

no code implementations20 Nov 2022 Ningyuan Chen, Ming Hu, Wenhao Li

In view of such a conflict, we provide a general analytical framework to study the augmentation of algorithmic decisions with human knowledge: the analyst uses the knowledge to set a guardrail, by which the algorithmic decision is clipped whenever the algorithmic output falls out of bounds and seems unreasonable.

Decision Making
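The guardrail mechanism described in the snippet above amounts to clipping the algorithmic output to human-set bounds. A minimal sketch, with an illustrative function name and interface that are not the paper's API:

```python
import numpy as np

def guardrail(algorithmic_output, lower, upper):
    """Clip an algorithmic decision to bounds set from human knowledge.

    If the algorithmic output is within [lower, upper], it is used as-is;
    otherwise it is clipped to the nearest bound.
    """
    return float(np.clip(algorithmic_output, lower, upper))
```

For example, with a human-set bound of [0, 100], an algorithmic output of 120 would be clipped to 100, while an in-range output passes through unchanged.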

FedCross: Towards Accurate Federated Learning via Multi-Model Cross Aggregation

no code implementations15 Oct 2022 Ming Hu, Peiheng Zhou, Zhihao Yue, Zhiwei Ling, Yihao Huang, Yang Liu, Mingsong Chen

Due to its remarkable performance in preserving data privacy for decentralized data scenarios, Federated Learning (FL) has been considered a promising distributed machine learning paradigm to deal with data silo problems.

Federated Learning

FedMR: Federated Learning via Model Recombination

no code implementations16 Aug 2022 Ming Hu, Zhihao Yue, Zhiwei Ling, Xian Wei, Mingsong Chen

Worse still, in each round of FL training, FedAvg dispatches the same initial local models to clients, which can easily cause the search for an optimal global model to get stuck locally.

Federated Learning Privacy Preserving

Efficient Self-supervised Vision Pretraining with Local Masked Reconstruction

1 code implementation1 Jun 2022 Jun Chen, Ming Hu, Boyang Li, Mohamed Elhoseiny

After finetuning the pretrained LoMaR on 384×384 images, it can reach 85.4% top-1 accuracy, surpassing MAE by 0.6%.

Image Classification Instance Segmentation +3

FedEntropy: Efficient Device Grouping for Federated Learning Using Maximum Entropy Judgment

1 code implementation24 May 2022 Zhiwei Ling, Zhihao Yue, Jun Xia, Ming Hu, Ting Wang, Mingsong Chen

Along with the popularity of Artificial Intelligence (AI) and the Internet of Things (IoT), Federated Learning (FL) has attracted steadily increasing attention as a promising distributed machine learning paradigm that enables the training of a central model across numerous decentralized devices without exposing their private data.

Federated Learning

Model-Contrastive Learning for Backdoor Defense

1 code implementation9 May 2022 Zhihao Yue, Jun Xia, Zhiwei Ling, Ming Hu, Ting Wang, Xian Wei, Mingsong Chen

Due to the popularity of Artificial Intelligence (AI) techniques, we are witnessing an increasing number of backdoor injection attacks that are designed to maliciously threaten Deep Neural Networks (DNNs) by causing misclassification.

Backdoor Attack backdoor defense +1

FedCAT: Towards Accurate Federated Learning via Device Concatenation

no code implementations23 Feb 2022 Ming Hu, Tian Liu, Zhiwei Ling, Zhihao Yue, Mingsong Chen

As a promising distributed machine learning paradigm, Federated Learning (FL) enables all the involved devices to train a global model collaboratively without exposing their local data privacy.

Federated Learning

Predicting Lattice Phonon Vibrational Frequencies Using Deep Graph Neural Networks

no code implementations10 Nov 2021 Nghia Nguyen, Steph-Yves Louis, Lai Wei, Kamal Choudhary, Ming Hu, Jianjun Hu

Our work demonstrates the capability of deep graph neural networks to learn to predict phonon spectrum properties of crystal structures in addition to phonon density of states (DOS) and electronic DOS in which the output dimension is constant.

Materials Screening

Phonon Scattering in the Complex Strain Field of a Dislocation

no code implementations26 Jan 2021 Yandong Sun, Yanguang Zhou, Ramya Gurunathan, Jin-Yu Zhang, Ming Hu, Wei Liu, Ben Xu, G. Jeffrey Snyder

Strain engineering is critical to the performance enhancement of electronic and thermoelectric devices because of its influence on the material thermal conductivity.

Materials Science

Probing the Phonon Mean Free Paths in Dislocation Core by Molecular Dynamics Simulation

no code implementations18 Oct 2020 Yandong Sun, Yanguang Zhou, Ming Hu, G. Jeffrey Snyder, Ben Xu, Wei Liu

In this study, the 1D McKelvey-Shockley phonon BTE method was extended to model inhomogeneous materials, where the effect of defects on the phonon MFPs is explicitly obtained.

Materials Science Computational Physics 80A05 I.6.0

Predicting Elastic Properties of Materials from Electronic Charge Density Using 3D Deep Convolutional Neural Networks

no code implementations17 Mar 2020 Yong Zhao, Kunpeng Yuan, Yinqiao Liu, Steph-Yves Louis, Ming Hu, Jianjun Hu

Extensive benchmark experiments over 2,170 Fm-3m face-centered-cubic (FCC) materials show that our ECD-based CNNs can achieve good performance for elasticity prediction.

Property Prediction

Machine Learning based prediction of noncentrosymmetric crystal materials

no code implementations26 Feb 2020 Yuqi Song, Joseph Lindsay, Yong Zhao, Alireza Nasiri, Steph-Yves Louis, Jie Ling, Ming Hu, Jianjun Hu

Noncentrosymmetric materials play a critical role in many important applications such as laser technology, communication systems, quantum computing, and cybersecurity.

BIG-bench Machine Learning

Generative adversarial networks (GAN) based efficient sampling of chemical space for inverse design of inorganic materials

no code implementations12 Nov 2019 Yabo Dan, Yong Zhao, Xiang Li, Shaobo Li, Ming Hu, Jianjun Hu

The percentage of chemically valid (charge neutral and electronegativity balanced) samples out of all generated ones reaches 84.5% by our GAN when trained with materials from ICSD, even though no such chemical rules are explicitly enforced in our GAN model, indicating its capability to learn implicit chemical composition rules.

Generative Adversarial Network valid
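The charge-neutrality half of the validity criterion mentioned in the snippet above can be sketched as a search over common oxidation states. This is an illustrative check, not the paper's code: the function name, interface, and any oxidation-state table passed in are assumptions, and the electronegativity-balance half of the criterion is omitted.

```python
from itertools import product

def is_charge_neutral(composition, oxidation_states):
    """Return True if some combination of allowed oxidation states
    makes the composition charge neutral.

    composition:      dict mapping element symbol -> atom count,
                      e.g. {"Na": 1, "Cl": 1}.
    oxidation_states: dict mapping element symbol -> list of its
                      allowed oxidation states (caller-supplied).
    """
    elements = list(composition)
    # Try every combination of one oxidation state per element.
    for states in product(*(oxidation_states[e] for e in elements)):
        total = sum(state * composition[e] for e, state in zip(elements, states))
        if total == 0:
            return True
    return False
```

For instance, NaCl is neutral under Na(+1)/Cl(-1), while a hypothetical NaO composition cannot balance +1 against -2 and would be rejected.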

DIMM-SC: A Dirichlet mixture model for clustering droplet-based single cell transcriptomic data

no code implementations6 Apr 2017 Zhe Sun, Ting Wang, Ke Deng, Xiao-Feng Wang, Robert Lafyatis, Ying Ding, Ming Hu, Wei Chen

More importantly, as a model-based approach, DIMM-SC is able to quantify the clustering uncertainty for each single cell, facilitating rigorous statistical inference and biological interpretations, which are typically unavailable from existing clustering methods.

Clustering
