Search Results for author: Seyyedali Hosseinalipour

Found 40 papers, 8 papers with code

Multi-Stage Hybrid Federated Learning over Large-Scale D2D-Enabled Fog Networks

1 code implementation18 Jul 2020 Seyyedali Hosseinalipour, Sheikh Shams Azam, Christopher G. Brinton, Nicolo Michelusi, Vaneet Aggarwal, David J. Love, Huaiyu Dai

We derive an upper bound on the convergence of MH-FL with respect to parameters of the network topology (e.g., the spectral radius) and the learning algorithm (e.g., the number of D2D rounds in different clusters).
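The spectral radius referenced in the bound measures how quickly D2D consensus averaging mixes within a cluster. The snippet below is a minimal illustrative sketch, not the paper's derivation: it assumes a hypothetical 5-node ring cluster with Metropolis-style mixing weights and computes the spectral radius that governs consensus error decay.

```python
import numpy as np

# Minimal sketch: spectral radius governing D2D consensus error decay in one
# cluster (illustrative only; the paper's bound uses this quantity per cluster).
def consensus_spectral_radius(W: np.ndarray) -> float:
    """Spectral radius of (W - (1/n) * 11^T) for a doubly stochastic mixing matrix W."""
    n = W.shape[0]
    J = np.ones((n, n)) / n
    eigvals = np.linalg.eigvals(W - J)
    return float(np.max(np.abs(eigvals)))

# Hypothetical cluster: 5-node ring with uniform Metropolis-style weights.
n = 5
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 1 / 3
    W[i, (i - 1) % n] = 1 / 3
    W[i, (i + 1) % n] = 1 / 3

rho = consensus_spectral_radius(W)
print(f"spectral radius: {rho:.3f}")  # smaller rho => fewer D2D rounds needed per cluster
```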

Federated Learning

Semi-Decentralized Federated Learning with Cooperative D2D Local Model Aggregations

1 code implementation18 Mar 2021 Frank Po-Chen Lin, Seyyedali Hosseinalipour, Sheikh Shams Azam, Christopher G. Brinton, Nicolo Michelusi

Federated learning has emerged as a popular technique for distributing machine learning (ML) model training across the wireless edge.

Federated Learning

Can we Generalize and Distribute Private Representation Learning?

1 code implementation5 Oct 2020 Sheikh Shams Azam, Taejin Kim, Seyyedali Hosseinalipour, Carlee Joe-Wong, Saurabh Bagchi, Christopher Brinton

We study the problem of learning representations that are private yet informative, i.e., provide information about intended "ally" targets while hiding sensitive "adversary" attributes.

Federated Learning Generative Adversarial Network +2

Decentralized Event-Triggered Federated Learning with Heterogeneous Communication Thresholds

1 code implementation7 Apr 2022 Shahryar Zehtabi, Seyyedali Hosseinalipour, Christopher G. Brinton

Through theoretical analysis, we demonstrate that our methodology achieves asymptotic convergence to the globally optimal learning model under standard assumptions in distributed learning and graph consensus literature, and without restrictive connectivity requirements on the underlying topology.
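The core communication rule is event-triggered: a device shares its model with neighbors only when it has changed enough since the last exchange, with a per-device threshold. The sketch below is a schematic illustration under that assumption; the function name, threshold `delta_i`, and the norm-based trigger are hypothetical stand-ins, not the authors' exact rule.

```python
import numpy as np

# Schematic event-triggered broadcast rule: node i transmits its model only
# when it has drifted from the last broadcast copy by more than its own
# threshold delta_i, so communication frequency adapts per device.
def should_broadcast(theta_current: np.ndarray, theta_last_sent: np.ndarray, delta_i: float) -> bool:
    return float(np.linalg.norm(theta_current - theta_last_sent)) > delta_i

theta_last = np.zeros(10)                         # last model this node shared with neighbors
theta_now = theta_last + 0.05 * np.random.randn(10)
print(should_broadcast(theta_now, theta_last, delta_i=0.1))
```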

Federated Learning

Decentralized Sporadic Federated Learning: A Unified Methodology with Generalized Convergence Guarantees

1 code implementation5 Feb 2024 Shahryar Zehtabi, Dong-Jun Han, Rohit Parasnis, Seyyedali Hosseinalipour, Christopher G. Brinton

Decentralized Federated Learning (DFL) has received significant recent research attention, capturing settings where both model updates and model aggregations -- the two key FL processes -- are conducted by the clients.

Federated Learning

From Federated to Fog Learning: Distributed Machine Learning over Heterogeneous Wireless Networks

no code implementations7 Jun 2020 Seyyedali Hosseinalipour, Christopher G. Brinton, Vaneet Aggarwal, Huaiyu Dai, Mung Chiang

There are several challenges with employing conventional federated learning in contemporary networks, due to the significant heterogeneity in compute and communication capabilities that exist across devices.

BIG-bench Machine Learning Federated Learning +1

Fast-Convergent Federated Learning

no code implementations26 Jul 2020 Hung T. Nguyen, Vikash Sehwag, Seyyedali Hosseinalipour, Christopher G. Brinton, Mung Chiang, H. Vincent Poor

In this paper, we propose a fast-convergent federated learning algorithm, called FOLB, which performs intelligent sampling of devices in each round of model training to optimize the expected convergence speed.
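To make the sampling idea concrete, here is a schematic sketch of convergence-aware device selection. The exact FOLB weighting is derived in the paper; in this stand-in, devices are sampled in proportion to the norm of their last local gradient as a hypothetical importance score.

```python
import numpy as np

# Schematic importance-based device sampling (not the exact FOLB criterion):
# devices with larger recent gradient norms are more likely to be selected.
def sample_devices(grad_norms: np.ndarray, k: int, rng=np.random.default_rng(0)) -> np.ndarray:
    probs = grad_norms / grad_norms.sum()
    return rng.choice(len(grad_norms), size=k, replace=False, p=probs)

grad_norms = np.array([0.2, 1.5, 0.7, 3.1, 0.05, 1.1])  # hypothetical per-device values
print(sample_devices(grad_norms, k=3))
```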

BIG-bench Machine Learning Federated Learning

A Fast Graph Neural Network-Based Method for Winner Determination in Multi-Unit Combinatorial Auctions

no code implementations29 Sep 2020 Mengyuan Lee, Seyyedali Hosseinalipour, Christopher G. Brinton, Guanding Yu, Huaiyu Dai

However, the problem of allocating items among the bidders to maximize the auctioneer's revenue, i.e., the winner determination problem (WDP), is NP-complete and inapproximable.

Cloud Computing

Lifetime Maximization for UAV-assisted Data Gathering Networks in the Presence of Jamming

no code implementations10 May 2020 Ali Rahmati, Seyyedali Hosseinalipour, Ismail Guvenc, Huaiyu Dai, Arupjyoti Bhuyan

Deployment of unmanned aerial vehicles (UAVs) has recently been attracting significant attention due to a variety of practical use cases, such as surveillance, data gathering, and commodity delivery.

Device Sampling for Heterogeneous Federated Learning: Theory, Algorithms, and Implementation

no code implementations4 Jan 2021 Su Wang, Mengyuan Lee, Seyyedali Hosseinalipour, Roberto Morabito, Mung Chiang, Christopher G. Brinton

The conventional federated learning (FedL) architecture distributes machine learning (ML) across worker devices by having them train local models that are periodically aggregated by a server.
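For readers unfamiliar with this architecture, the following is a minimal sketch of the server-side step it describes: a standard FedAvg-style aggregation in which local models are averaged, weighted by each worker's dataset size. It illustrates conventional FedL, not this paper's sampling methodology.

```python
import numpy as np

# Standard FedAvg-style aggregation step: the server averages the workers'
# local models, weighted by the number of samples each worker trained on.
def server_aggregate(local_models, num_samples):
    weights = np.array(num_samples, dtype=float)
    weights /= weights.sum()
    return sum(w * m for w, m in zip(weights, local_models))

local_models = [np.random.randn(4) for _ in range(3)]   # models from 3 worker devices
print(server_aggregate(local_models, num_samples=[100, 300, 50]))
```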

Federated Learning Learning Theory

Channel Estimation via Successive Denoising in MIMO OFDM Systems: A Reinforcement Learning Approach

no code implementations25 Jan 2021 Myeung Suk Oh, Seyyedali Hosseinalipour, Taejoon Kim, Christopher G. Brinton, David J. Love

Our methodology includes a new successive channel denoising process based on channel curvature computation, for which we obtain a channel curvature magnitude threshold to identify unreliable channel estimates.
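As a rough illustration of curvature-thresholded denoising, the sketch below estimates "curvature" of a channel estimate across subcarriers via second differences and flags entries whose curvature magnitude exceeds a threshold as unreliable. The paper's exact curvature definition and threshold derivation differ; the function, threshold value, and toy estimate here are assumptions for illustration.

```python
import numpy as np

# Schematic curvature-based reliability check for a per-subcarrier channel estimate:
# compute second differences as a curvature proxy and flag high-curvature entries.
def flag_unreliable(h_est: np.ndarray, threshold: float) -> np.ndarray:
    curvature = h_est[:-2] - 2 * h_est[1:-1] + h_est[2:]   # second difference across subcarriers
    flags = np.zeros(len(h_est), dtype=bool)
    flags[1:-1] = np.abs(curvature) > threshold            # interior entries only
    return flags

h_est = np.random.randn(16) + 1j * np.random.randn(16)     # hypothetical noisy LS estimate
print(flag_unreliable(h_est, threshold=2.0))
```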

Denoising Q-Learning +2

UAV-assisted Online Machine Learning over Multi-Tiered Networks: A Hierarchical Nested Personalized Federated Learning Approach

no code implementations29 Jun 2021 Su Wang, Seyyedali Hosseinalipour, Maria Gorlatova, Christopher G. Brinton, Mung Chiang

The presence of time-varying data heterogeneity and computational resource inadequacy among device clusters motivates four key parts of our methodology: (i) stratified UAV swarms of leader, worker, and coordinator UAVs, (ii) hierarchical nested personalized federated learning (HN-PFL), a distributed ML framework for personalized model training across the worker-leader-core network hierarchy, (iii) cooperative UAV resource pooling to address computational inadequacy of devices by conducting model training among the UAV swarms, and (iv) model/concept drift to model time-varying data distributions.

Decision Making Personalized Federated Learning

Learning-Based Adaptive IRS Control with Limited Feedback Codebooks

no code implementations3 Dec 2021 JungHoon Kim, Seyyedali Hosseinalipour, Andrew C. Marcum, Taejoon Kim, David J. Love, Christopher G. Brinton

We consider a practical setting where (i) the IRS reflection coefficients are achieved by adjusting tunable elements embedded in the meta-atoms, (ii) the IRS reflection coefficients are affected by the incident angles of the incoming signals, (iii) the IRS is deployed in multi-path, time-varying channels, and (iv) the feedback link from the base station to the IRS has a low data rate.

Parallel Successive Learning for Dynamic Distributed Model Training over Heterogeneous Wireless Networks

no code implementations7 Feb 2022 Seyyedali Hosseinalipour, Su Wang, Nicolo Michelusi, Vaneet Aggarwal, Christopher G. Brinton, David J. Love, Mung Chiang

PSL considers the realistic scenario where global aggregations are conducted with idle times in-between them for resource efficiency improvements, and incorporates data dispersion and model dispersion with local model condensation into FedL.

Federated Learning

Latency Optimization for Blockchain-Empowered Federated Learning in Multi-Server Edge Computing

no code implementations18 Mar 2022 Dinh C. Nguyen, Seyyedali Hosseinalipour, David J. Love, Pubudu N. Pathirana, Christopher G. Brinton

To assist the ML model training for resource-constrained MDs, we develop an offloading strategy that enables MDs to transmit their data to one of the associated ESs.

Edge-computing Federated Learning +1

Multi-Edge Server-Assisted Dynamic Federated Learning with an Optimized Floating Aggregation Point

no code implementations26 Mar 2022 Bhargav Ganguly, Seyyedali Hosseinalipour, Kwang Taik Kim, Christopher G. Brinton, Vaneet Aggarwal, David J. Love, Mung Chiang

CE-FL also introduces a floating aggregation point, where the local models generated at the devices and the servers are aggregated at an edge server, which varies from one model training round to another to cope with network evolution in terms of data distribution and users' mobility.

Distributed Optimization Federated Learning

Deep Reinforcement Learning-Based Adaptive IRS Control with Limited Feedback Codebooks

no code implementations7 May 2022 JungHoon Kim, Seyyedali Hosseinalipour, Andrew C. Marcum, Taejoon Kim, David J. Love, Christopher G. Brinton

Intelligent reflecting surfaces (IRS) consist of configurable meta-atoms, which can alter the wireless propagation environment through design of their reflection coefficients.

reinforcement-learning Reinforcement Learning (RL)

Mitigating Biases in Student Performance Prediction via Attention-Based Personalized Federated Learning

no code implementations2 Aug 2022 Yun-Wei Chu, Seyyedali Hosseinalipour, Elizabeth Tenorio, Laura Cruz, Kerrie Douglas, Andrew Lan, Christopher Brinton

To learn better representations of student activity, we augment our approach with a self-supervised behavioral pretraining methodology that leverages multiple modalities of student behavior (e.g., visits to lecture videos and participation on forums), and include a neural network attention mechanism in the model aggregation stage.
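To illustrate what attention in the aggregation stage can look like, here is a simplified sketch, not the authors' exact architecture: each client update is weighted by a softmax over its cosine similarity to the current global model, so dissimilar (potentially biased or noisy) updates are down-weighted.

```python
import numpy as np

# Illustrative attention-weighted aggregation: clients whose updates align more
# closely with the global model receive larger aggregation weights.
def attention_aggregate(global_model, client_models, temperature=1.0):
    scores = np.array([
        float(global_model @ m) / (np.linalg.norm(global_model) * np.linalg.norm(m) + 1e-12)
        for m in client_models
    ])
    weights = np.exp(scores / temperature)
    weights /= weights.sum()
    return sum(w * m for w, m in zip(weights, client_models))

g = np.random.randn(8)
clients = [g + 0.1 * np.random.randn(8) for _ in range(4)]  # hypothetical client models
print(attention_aggregate(g, clients))
```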

Personalized Federated Learning

Event-Triggered Decentralized Federated Learning over Resource-Constrained Edge Devices

no code implementations23 Nov 2022 Shahryar Zehtabi, Seyyedali Hosseinalipour, Christopher G. Brinton

We theoretically demonstrate that our methodology converges to the globally optimal learning model at a $O\left(\frac{\ln k}{\sqrt{k}}\right)$ rate under standard assumptions in distributed learning and consensus literature.

Federated Learning

Multi-Layer Personalized Federated Learning for Mitigating Biases in Student Predictive Analytics

no code implementations5 Dec 2022 Yun-Wei Chu, Seyyedali Hosseinalipour, Elizabeth Tenorio, Laura Cruz, Kerrie Douglas, Andrew Lan, Christopher Brinton

Traditional learning-based approaches to student modeling (e.g., predicting grades based on measured activities) generalize poorly to underrepresented/minority student groups due to biases in data availability.

Knowledge Tracing Personalized Federated Learning

A Decentralized Pilot Assignment Algorithm for Scalable O-RAN Cell-Free Massive MIMO

no code implementations12 Jan 2023 Myeung Suk Oh, Anindya Bijoy Das, Seyyedali Hosseinalipour, Taejoon Kim, David J. Love, Christopher G. Brinton

Radio access networks (RANs) in monolithic architectures have limited adaptability to supporting different network scenarios.

Towards Cooperative Federated Learning over Heterogeneous Edge/Fog Networks

no code implementations15 Mar 2023 Su Wang, Seyyedali Hosseinalipour, Vaneet Aggarwal, Christopher G. Brinton, David J. Love, Weifeng Su, Mung Chiang

Federated learning (FL) has been promoted as a popular technique for training machine learning (ML) models over edge/fog networks.

Federated Learning

Delay-Aware Hierarchical Federated Learning

no code implementations22 Mar 2023 Frank Po-Chen Lin, Seyyedali Hosseinalipour, Nicolò Michelusi, Christopher Brinton

The paper introduces delay-aware hierarchical federated learning (DFL) to improve the efficiency of distributed machine learning (ML) model training by accounting for communication delays between edge and cloud.

Federated Learning

Multi-Source to Multi-Target Decentralized Federated Domain Adaptation

no code implementations24 Apr 2023 Su Wang, Seyyedali Hosseinalipour, Christopher G. Brinton

Our methodology, Source-Target Determination and Link Formation (ST-LF), optimizes both (i) classification of devices into sources and targets and (ii) source-target link formation, in a manner that considers the trade-off between ML model accuracy and communication energy efficiency.

Domain Adaptation Federated Learning

Dynamic and Robust Sensor Selection Strategies for Wireless Positioning with TOA/RSS Measurement

no code implementations30 Apr 2023 Myeung Suk Oh, Seyyedali Hosseinalipour, Taejoon Kim, David J. Love, James V. Krogmeier, Christopher G. Brinton

For dynamic sensor selection, two greedy selection strategies are proposed, each of which exploits properties revealed in the derived CRLB expressions.
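The general shape of such a greedy strategy is sketched below. The paper's strategies exploit properties of the derived CRLB expressions, which are not reproduced here; `bound_proxy` is a hypothetical stand-in for whatever score the strategy minimizes when adding a sensor.

```python
import numpy as np

# Generic greedy selection template: repeatedly add the sensor that most
# improves a positioning-error score for the currently selected set.
def greedy_select(num_sensors: int, k: int, bound_proxy) -> list:
    selected = []
    for _ in range(k):
        remaining = [s for s in range(num_sensors) if s not in selected]
        best = min(remaining, key=lambda s: bound_proxy(selected + [s]))
        selected.append(best)
    return selected

# Toy proxy for illustration: prefer sensors with smaller (hypothetical) noise variances.
noise_var = np.array([0.5, 0.1, 0.9, 0.3, 0.2])
print(greedy_select(5, k=3, bound_proxy=lambda S: float(np.mean(noise_var[S]))))
```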

Asynchronous Multi-Model Dynamic Federated Learning over Wireless Networks: Theory, Modeling, and Optimization

no code implementations22 May 2023 Zhan-Lun Chang, Seyyedali Hosseinalipour, Mung Chiang, Christopher G. Brinton

Our analysis sheds light on the joint impact of device training variables (e.g., number of local gradient descent steps), asynchronous scheduling decisions (i.e., when a device trains a task), and dynamic data drifts on the performance of ML training for different tasks.

Federated Learning Scheduling

Device Sampling and Resource Optimization for Federated Learning in Cooperative Edge Networks

no code implementations7 Nov 2023 Su Wang, Roberto Morabito, Seyyedali Hosseinalipour, Mung Chiang, Christopher G. Brinton

Our optimization methodology aims to select the best combination of sampled nodes and data offloading configuration to maximize FedL training accuracy while minimizing data processing and D2D communication resource consumption subject to realistic constraints on the network topology and device capabilities.

Federated Learning

Cooperative Federated Learning over Ground-to-Satellite Integrated Networks: Joint Local Computation and Data Offloading

no code implementations23 Dec 2023 Dong-Jun Han, Seyyedali Hosseinalipour, David J. Love, Mung Chiang, Christopher G. Brinton

While network coverage maps continue to expand, many devices located in remote areas remain unconnected to terrestrial communication infrastructure, preventing them from accessing the associated data-driven services.

Federated Learning Management

Coding for Gaussian Two-Way Channels: Linear and Learning-Based Approaches

no code implementations31 Dec 2023 JungHoon Kim, Taejoon Kim, Anindya Bijoy Das, Seyyedali Hosseinalipour, David J. Love, Christopher G. Brinton

In this work, we aim to enhance and balance the communication reliability in GTWCs by minimizing the sum of error probabilities via joint design of encoders and decoders at the users.

Multi-Modal Federated Learning for Cancer Staging over Non-IID Datasets with Unbalanced Modalities

1 code implementation7 Jan 2024 Kasra Borazjani, Naji Khosravan, Leslie Ying, Seyyedali Hosseinalipour

Given the frequent presence of diverse data modalities within patient records, leveraging FL in a multi-modal learning framework holds considerable promise for cancer staging.

Federated Learning

Dynamic D2D-Assisted Federated Learning over O-RAN: Performance Analysis, MAC Scheduler, and Asymmetric User Selection

no code implementations9 Apr 2024 Payam Abdisarabshali, Kwang Taik Kim, Michael Langberg, Weifeng Su, Seyyedali Hosseinalipour

In this paper, we incorporate multi-granular system dynamics (MSDs) into FL, including (M1) dynamic wireless channel capacity, captured by a set of discrete-time events, called $\mathscr{D}$-Events, and (M2) dynamic datasets of users.

Federated Learning

Unsupervised Federated Optimization at the Edge: D2D-Enabled Learning without Labels

no code implementations15 Apr 2024 Satyavrat Wagle, Seyyedali Hosseinalipour, Naji Khosravan, Christopher G. Brinton

Specifically, we introduce a "smart information push-pull" methodology for data/embedding exchange tailored to FL settings with either soft or strict data privacy restrictions.

Contrastive Learning Federated Learning
