Search Results for author: Mung Chiang

Found 24 papers, 5 papers with code

Asynchronous Multi-Model Dynamic Federated Learning over Wireless Networks: Theory, Modeling, and Optimization

no code implementations • 22 May 2023 • Zhan-Lun Chang, Seyyedali Hosseinalipour, Mung Chiang, Christopher G. Brinton

Our convergence analysis sheds light on the impact of resource allocation, device scheduling, and individual model states on the performance of ML models.

Federated Learning Scheduling

Towards Cooperative Federated Learning over Heterogeneous Edge/Fog Networks

no code implementations • 15 Mar 2023 • Su Wang, Seyyedali Hosseinalipour, Vaneet Aggarwal, Christopher G. Brinton, David J. Love, Weifeng Su, Mung Chiang

Federated learning (FL) has been promoted as a popular technique for training machine learning (ML) models over edge/fog networks.

Federated Learning

Interference Cancellation GAN Framework for Dynamic Channels

no code implementations • 17 Aug 2022 • Hung T. Nguyen, Steven Bottone, Kwang Taik Kim, Mung Chiang, H. Vincent Poor

Symbol detection is a fundamental and challenging problem in modern communication systems, e.g., in multiuser multiple-input multiple-output (MIMO) settings.

Multi-Edge Server-Assisted Dynamic Federated Learning with an Optimized Floating Aggregation Point

no code implementations • 26 Mar 2022 • Bhargav Ganguly, Seyyedali Hosseinalipour, Kwang Taik Kim, Christopher G. Brinton, Vaneet Aggarwal, David J. Love, Mung Chiang

CE-FL also introduces a floating aggregation point, where the local models generated at the devices and the servers are aggregated at an edge server that varies from one model training round to another to cope with network evolution in terms of data distribution and user mobility.

Distributed Optimization Federated Learning
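The abstract describes re-selecting the aggregation point each round as the network evolves. The paper's actual optimization is not given in this snippet; a minimal hypothetical sketch (the function name, latency structure, and selection rule are illustrative assumptions, not CE-FL's method) might re-pick the edge server minimizing total device-to-server cost per round:

```python
def choose_aggregator(latency):
    """Pick the edge server with the lowest total device-to-server
    latency for this training round; called anew each round so the
    aggregation point "floats" as the network state changes.

    latency[server][device] = current uplink latency estimate.
    (Illustrative criterion only; CE-FL's true objective also accounts
    for data distribution and mobility.)
    """
    totals = {server: sum(devs.values()) for server, devs in latency.items()}
    return min(totals, key=totals.get)
```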

Contextual Model Aggregation for Fast and Robust Federated Learning in Edge Computing

no code implementations • 23 Mar 2022 • Hung T. Nguyen, H. Vincent Poor, Mung Chiang

However, existing algorithms face issues with slow convergence and/or robustness of performance due to the considerable heterogeneity of data distribution, computation and communication capability at the edge.

Edge-computing Federated Learning

Parallel Successive Learning for Dynamic Distributed Model Training over Heterogeneous Wireless Networks

no code implementations • 7 Feb 2022 • Seyyedali Hosseinalipour, Su Wang, Nicolo Michelusi, Vaneet Aggarwal, Christopher G. Brinton, David J. Love, Mung Chiang

PSL considers the realistic scenario where global aggregations are conducted with idle times in between them to improve resource efficiency, and incorporates data dispersion and model dispersion with local model condensation into FedL.

Federated Learning

On-the-fly Resource-Aware Model Aggregation for Federated Learning in Heterogeneous Edge

no code implementations • 21 Dec 2021 • Hung T. Nguyen, Roberto Morabito, Kwang Taik Kim, Mung Chiang

Edge computing has revolutionized the world of mobile and wireless networks thanks to its flexible, secure, and high-performance characteristics.

BIG-bench Machine Learning Edge-computing +1

Adversarial Neural Networks for Error Correcting Codes

no code implementations • 21 Dec 2021 • Hung T. Nguyen, Steven Bottone, Kwang Taik Kim, Mung Chiang, H. Vincent Poor

To demonstrate the performance of our framework, we combine it with the very recent neural decoders and show improved performance compared to the original models and traditional decoding algorithms on various codes.

UAV-assisted Online Machine Learning over Multi-Tiered Networks: A Hierarchical Nested Personalized Federated Learning Approach

no code implementations • 29 Jun 2021 • Su Wang, Seyyedali Hosseinalipour, Maria Gorlatova, Christopher G. Brinton, Mung Chiang

The presence of time-varying data heterogeneity and computational resource inadequacy among device clusters motivates four key parts of our methodology: (i) stratified UAV swarms of leader, worker, and coordinator UAVs; (ii) hierarchical nested personalized federated learning (HN-PFL), a distributed ML framework for personalized model training across the worker-leader-core network hierarchy; (iii) cooperative UAV resource pooling to address the computational inadequacy of devices by conducting model training among the UAV swarms; and (iv) model/concept drift to model time-varying data distributions.

Decision Making Personalized Federated Learning

Device Sampling for Heterogeneous Federated Learning: Theory, Algorithms, and Implementation

no code implementations • 4 Jan 2021 • Su Wang, Mengyuan Lee, Seyyedali Hosseinalipour, Roberto Morabito, Mung Chiang, Christopher G. Brinton

The conventional federated learning (FedL) architecture distributes machine learning (ML) across worker devices by having them train local models that are periodically aggregated by a server.

Federated Learning Learning Theory
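The abstract above summarizes the conventional FedL loop: devices train local models that a server periodically aggregates. As a reference point for the entries on this page, here is a minimal FedAvg-style sketch of that loop (an illustrative baseline under assumed linear models and squared loss, not the algorithm of any listed paper; all names are hypothetical):

```python
import numpy as np

def local_update(global_model, data, lr=0.1, epochs=1):
    """One round of local gradient descent on a linear model (squared loss)."""
    X, y = data
    w = global_model.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

def federated_round(global_model, client_data):
    """Server-side aggregation: average client models weighted by
    local dataset size, as in conventional FedAvg."""
    updates, sizes = [], []
    for data in client_data:
        updates.append(local_update(global_model, data))
        sizes.append(len(data[1]))
    weights = np.array(sizes) / sum(sizes)
    return sum(wgt * upd for wgt, upd in zip(weights, updates))
```

The heterogeneity-focused papers in this listing (device sampling, cooperative FL, fog learning) can be read as modifications to which clients participate in, and how models flow through, this basic round.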

RobustBench: a standardized adversarial robustness benchmark

1 code implementation • 19 Oct 2020 • Francesco Croce, Maksym Andriushchenko, Vikash Sehwag, Edoardo Debenedetti, Nicolas Flammarion, Mung Chiang, Prateek Mittal, Matthias Hein

As a research community, we still lack a systematic understanding of the progress on adversarial robustness, which often makes it hard to identify the most promising ideas for training robust models.

Adversarial Robustness Benchmarking +3

Fast-Convergent Federated Learning

no code implementations • 26 Jul 2020 • Hung T. Nguyen, Vikash Sehwag, Seyyedali Hosseinalipour, Christopher G. Brinton, Mung Chiang, H. Vincent Poor

In this paper, we propose a fast-convergent federated learning algorithm, called FOLB, which performs intelligent sampling of devices in each round of model training to optimize the expected convergence speed.

BIG-bench Machine Learning Federated Learning
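FOLB's "intelligent sampling of devices in each round" is not specified in this snippet; one common way to realize such a scheme is importance sampling, where devices whose recent updates are larger are more likely to be selected. A hypothetical sketch under that assumption (the criterion and names are illustrative, not FOLB's actual rule):

```python
import numpy as np

def sample_devices(grad_norms, k, rng=None):
    """Select k devices without replacement, with probability
    proportional to each device's last reported gradient norm --
    devices with larger updates are treated as more informative
    for convergence speed."""
    if rng is None:
        rng = np.random.default_rng()
    probs = np.asarray(grad_norms, dtype=float)
    probs = probs / probs.sum()
    return rng.choice(len(probs), size=k, replace=False, p=probs)
```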

Time for a Background Check! Uncovering the impact of Background Features on Deep Neural Networks

no code implementations • 24 Jun 2020 • Vikash Sehwag, Rajvardhan Oak, Mung Chiang, Prateek Mittal

With increasing expressive power, deep neural networks have significantly improved the state-of-the-art on image classification datasets, such as ImageNet.

Image Classification

From Federated to Fog Learning: Distributed Machine Learning over Heterogeneous Wireless Networks

no code implementations • 7 Jun 2020 • Seyyedali Hosseinalipour, Christopher G. Brinton, Vaneet Aggarwal, Huaiyu Dai, Mung Chiang

There are several challenges with employing conventional federated learning in contemporary networks, due to the significant heterogeneity in compute and communication capabilities that exist across devices.

BIG-bench Machine Learning Federated Learning +1

AppStreamer: Reducing Storage Requirements of Mobile Games through Predictive Streaming

no code implementations • 16 Dec 2019 • Nawanol Theera-Ampornpunt, Shikhar Suryavansh, Sameer Manchanda, Rajesh Panta, Kaustubh Joshi, Mostafa Ammar, Mung Chiang, Saurabh Bagchi

AppStreamer can, therefore, keep only a small part of the files on the device, akin to a "cache", and download the remainder from a cloud storage server or a nearby edge server when it predicts that the app will need them in the near future.

Better the Devil you Know: An Analysis of Evasion Attacks using Out-of-Distribution Adversarial Examples

no code implementations • 5 May 2019 • Vikash Sehwag, Arjun Nitin Bhagoji, Liwei Song, Chawin Sitawarin, Daniel Cullina, Mung Chiang, Prateek Mittal

A large body of recent work has investigated the phenomenon of evasion attacks using adversarial examples for deep learning systems, where the addition of norm-bounded perturbations to the test inputs leads to incorrect output classification.

Autonomous Driving General Classification

An Estimation and Analysis Framework for the Rasch Model

no code implementations • ICML 2018 • Andrew S. Lan, Mung Chiang, Christoph Studer

The Rasch model is widely used for item response analysis in applications ranging from recommender systems to psychology, education, and finance.

Collaborative Filtering Recommendation Systems

DARTS: Deceiving Autonomous Cars with Toxic Signs

1 code implementation • 18 Feb 2018 • Chawin Sitawarin, Arjun Nitin Bhagoji, Arsalan Mosenia, Mung Chiang, Prateek Mittal

In this paper, we propose and examine security attacks against sign recognition systems for Deceiving Autonomous caRs with Toxic Signs (we call the proposed attacks DARTS).

Traffic Sign Recognition

Linearized Binary Regression

no code implementations • 1 Feb 2018 • Andrew S. Lan, Mung Chiang, Christoph Studer

We showcase the efficacy of our methods and results for a number of synthetic and real-world datasets, which demonstrates that linearized binary regression finds potential use in a variety of inference, estimation, signal processing, and machine learning applications that deal with binary-valued observations or measurements.


Rogue Signs: Deceiving Traffic Sign Recognition with Malicious Ads and Logos

1 code implementation • 9 Jan 2018 • Chawin Sitawarin, Arjun Nitin Bhagoji, Arsalan Mosenia, Prateek Mittal, Mung Chiang

Our attack pipeline generates adversarial samples which are robust to the environmental conditions and noisy image transformations present in the physical world.

Traffic Sign Recognition

Stock Market Prediction from WSJ: Text Mining via Sparse Matrix Factorization

no code implementations • 27 Jun 2014 • Felix Ming Fai Wong, Zhenming Liu, Mung Chiang

We revisit the problem of predicting directional movements of stock prices based on news articles: here our algorithm uses daily articles from The Wall Street Journal to predict the closing stock prices on the same day.

Stock Market Prediction
