Search Results for author: Vinay Chakravarthi Gogineni

Found 9 papers, 0 papers with code

Analyzing the Impact of Partial Sharing on the Resilience of Online Federated Learning Against Model Poisoning Attacks

no code implementations • 19 Mar 2024 • Ehsan Lari, Vinay Chakravarthi Gogineni, Reza Arablouei, Stefan Werner

PSO-Fed reduces the communication load by enabling clients to exchange only a fraction of their model estimates with the server at each update round.
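
For intuition, below is a minimal, generic sketch of the partial-sharing exchange described above. It is not the authors' implementation; the fraction, the index-selection rule, and names such as `share_fraction` and `server_merge` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def client_partial_update(local_model, grad, share_fraction=0.25, step=0.1):
    """Take a local gradient step, then pick only a fraction of the
    model entries to exchange with the server this round."""
    local_model = local_model - step * grad
    num_shared = max(1, int(share_fraction * local_model.size))
    shared_idx = rng.choice(local_model.size, size=num_shared, replace=False)
    return local_model, shared_idx, local_model[shared_idx]

def server_merge(global_model, shared_idx, shared_values):
    """Overwrite only the received coordinates; the rest stay unchanged."""
    merged = global_model.copy()
    merged[shared_idx] = shared_values
    return merged

# Toy round: one client, a 10-dimensional linear model.
global_model = np.zeros(10)
grad = rng.standard_normal(10)
local_model, idx, vals = client_partial_update(global_model, grad)
global_model = server_merge(global_model, idx, vals)
```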

Federated Learning • Model Poisoning

Efficient Knowledge Deletion from Trained Models through Layer-wise Partial Machine Unlearning

no code implementations • 12 Mar 2024 • Vinay Chakravarthi Gogineni, Esmaeil S. Nadimi

In this method, updates made to the model during training are pruned and stored, and subsequently used to forget specific data from the trained model.
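
A rough sketch of the stated idea under strong assumptions (top-k magnitude pruning of per-batch, per-layer updates, later subtracted to forget a batch); the paper's actual pruning and deletion rules may differ, and all names here are hypothetical.

```python
import torch

def topk_prune(update, keep_ratio=0.1):
    """Keep only the largest-magnitude entries of a layer update."""
    flat = update.flatten()
    k = max(1, int(keep_ratio * flat.numel()))
    _, idx = torch.topk(flat.abs(), k)
    pruned = torch.zeros_like(flat)
    pruned[idx] = flat[idx]
    return pruned.view_as(update)

stored_updates = {}  # (batch_id, layer_name) -> pruned update

def record_update(batch_id, model, prev_params, keep_ratio=0.1):
    """After a training step on `batch_id`, prune and store what changed per layer."""
    for name, param in model.named_parameters():
        delta = param.detach() - prev_params[name]
        stored_updates[(batch_id, name)] = topk_prune(delta, keep_ratio)

def forget(batch_ids, model):
    """Approximately remove the influence of the given batches by
    subtracting their stored (pruned) updates, layer by layer."""
    with torch.no_grad():
        for name, param in model.named_parameters():
            for bid in batch_ids:
                if (bid, name) in stored_updates:
                    param -= stored_updates[(bid, name)]
```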

Machine Unlearning

Smoothing ADMM for Sparse-Penalized Quantile Regression with Non-Convex Penalties

no code implementations • 4 Sep 2023 • Reza Mirzaeifard, Naveen K. D. Venkategowda, Vinay Chakravarthi Gogineni, Stefan Werner

This paper investigates quantile regression in the presence of non-convex and non-smooth sparse penalties, such as the minimax concave penalty (MCP) and smoothly clipped absolute deviation (SCAD).
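
For reference, the standard check (pinball) loss and the MCP penalty mentioned above can be written as follows in common notation; the paper's exact formulation may differ in details such as weighting or the SCAD variant.

```latex
% Quantile (check/pinball) loss at level \tau:
\rho_\tau(u) = u\,\bigl(\tau - \mathbb{1}\{u < 0\}\bigr)

% Minimax concave penalty (MCP) with parameters \lambda > 0, \gamma > 1:
p_{\lambda,\gamma}(t) =
\begin{cases}
  \lambda |t| - \dfrac{t^2}{2\gamma}, & |t| \le \gamma\lambda,\\[4pt]
  \dfrac{\gamma\lambda^2}{2}, & |t| > \gamma\lambda.
\end{cases}

% Sparse-penalized quantile regression objective:
\min_{\beta}\; \sum_{i=1}^{n} \rho_\tau\!\bigl(y_i - \mathbf{x}_i^{\top}\beta\bigr)
  + \sum_{j=1}^{p} p_{\lambda,\gamma}(\beta_j)
```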

regression

Distributed Filtering Design with Enhanced Resilience to Coordinated Byzantine Attacks

no code implementations • 26 Jul 2023 • Ashkan Moradi, Vinay Chakravarthi Gogineni, Naveen K. D. Venkategowda, Stefan Werner

Numerical results demonstrate the accuracy of the proposed BR-CDF and its robustness against Byzantine attacks.

Personalized Graph Federated Learning with Differential Privacy

no code implementations • 10 Jun 2023 • Francois Gauthier, Vinay Chakravarthi Gogineni, Stefan Werner, Yih-Fang Huang, Anthony Kuh

Further, our analysis shows that the algorithm ensures local differential privacy for all clients in terms of zero-concentrated differential privacy.
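
As background, the standard definition of zero-concentrated differential privacy (zCDP) and the zCDP guarantee of the Gaussian mechanism are given below; these are general facts, not claims about this paper's specific analysis.

```latex
% \rho-zero-concentrated differential privacy (Bun & Steinke, 2016):
% a mechanism M is \rho-zCDP if, for all neighboring datasets D, D'
% and all \alpha \in (1, \infty),
D_\alpha\bigl(M(D)\,\|\,M(D')\bigr) \le \rho\,\alpha

% The Gaussian mechanism with \ell_2-sensitivity \Delta and noise
% standard deviation \sigma satisfies \rho-zCDP with
\rho = \frac{\Delta^2}{2\sigma^2}
```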

Federated Learning • Privacy Preserving

Asynchronous Online Federated Learning with Reduced Communication Requirements

no code implementations • 27 Mar 2023 • Francois Gauthier, Vinay Chakravarthi Gogineni, Stefan Werner, Yih-Fang Huang, Anthony Kuh

The simulations reveal that in asynchronous settings, the proposed PAO-Fed achieves the same convergence properties as the online federated stochastic gradient method while reducing the communication overhead by 98 percent.

Federated Learning

Resource-Aware Asynchronous Online Federated Learning for Nonlinear Regression

no code implementations • 27 Nov 2021 • Francois Gauthier, Vinay Chakravarthi Gogineni, Stefan Werner, Yih-Fang Huang, Anthony Kuh

In this manner, we reduce the communication load of the participants and, therefore, render participation in the learning task more accessible.

Federated Learning • regression

Communication-Efficient Online Federated Learning Framework for Nonlinear Regression

no code implementations • 13 Oct 2021 • Vinay Chakravarthi Gogineni, Stefan Werner, Yih-Fang Huang, Anthony Kuh

As a solution, this paper presents a partial-sharing-based online federated learning framework (PSO-Fed) that enables clients to update their local models using continuous streaming data and share only portions of those updated models with the server.
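
A minimal sketch of the streaming side of this idea, assuming an LMS-style local update on random-feature inputs and a rotating block of shared coordinates; both choices are illustrative assumptions, not the paper's scheme.

```python
import numpy as np

def random_features(x, W, b):
    """Random Fourier features for nonlinear regression (illustrative)."""
    return np.sqrt(2.0 / W.shape[0]) * np.cos(W @ x + b)

def online_partial_sharing(stream, w, W, b, mu=0.05, block=16):
    """Process streaming (x, y) pairs with an LMS update and, each round,
    share only one rotating block of model coordinates with the server."""
    start = 0
    for x, y in stream:
        z = random_features(x, W, b)
        err = y - w @ z
        w = w + mu * err * z                       # local online update
        idx = np.arange(start, start + block) % w.size
        yield idx, w[idx]                          # only this block is sent
        start = (start + block) % w.size
```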

Federated Learning • regression

Diffusion Adaptation Over Clustered Multitask Networks Based on the Affine Projection Algorithm

no code implementations • 29 Jul 2015 • Vinay Chakravarthi Gogineni, Mrityunjoy Chakraborty

The performance of the proposed multitask diffusion APA algorithm is analyzed in the mean and mean-square sense.
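
For context, a standard adapt-then-combine diffusion recursion with an affine-projection adaptation step is shown below; this generic form does not capture the clustered multitask aspects of the paper.

```latex
% Adapt (APA step at node k, time i), with regressor matrix U_{k,i},
% desired vector d_{k,i}, step size \mu_k and regularization \epsilon:
\psi_{k,i} = w_{k,i-1}
  + \mu_k\, U_{k,i}\bigl(U_{k,i}^{\top} U_{k,i} + \epsilon I\bigr)^{-1}
    \bigl(d_{k,i} - U_{k,i}^{\top} w_{k,i-1}\bigr)

% Combine (diffusion step over neighborhood \mathcal{N}_k with weights a_{\ell k}):
w_{k,i} = \sum_{\ell \in \mathcal{N}_k} a_{\ell k}\, \psi_{\ell,i}
```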
