Search Results for author: Nitin H. Vaidya

Found 10 papers, 0 papers with code

Impact of Redundancy on Resilience in Distributed Optimization and Learning

no code implementations • 16 Nov 2022 • Shuo Liu, Nirupam Gupta, Nitin H. Vaidya

In particular, we introduce the notion of $(f, r; \epsilon)$-resilience to characterize how well the true solution is approximated in the presence of up to $f$ Byzantine faulty agents, and up to $r$ slow agents (or stragglers) -- smaller $\epsilon$ represents a better approximation.
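Read loosely, the notion can be sketched as follows (our paraphrase of the snippet above; $n$, the local costs $Q_i$, and the set $H$ are notation assumed here for illustration, and the paper's exact quantifiers and norm may differ):

    % (f, r; eps)-resilience, sketched: for every set H of honest,
    % non-straggling agents with |H| >= n - f - r, the output x_hat
    % approximately minimizes the aggregate cost over H.
    \[
      \Bigl\| \hat{x} - \arg\min_{x} \sum_{i \in H} Q_i(x) \Bigr\| \le \epsilon .
    \]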

Distributed Optimization

Byzantine Fault-Tolerance in Peer-to-Peer Distributed Gradient-Descent

no code implementations • 28 Jan 2021 • Nirupam Gupta, Nitin H. Vaidya

We consider the problem of Byzantine fault-tolerance in the peer-to-peer (P2P) distributed gradient-descent method -- a prominent algorithm for distributed optimization in a P2P system.
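As a rough illustration of the setting (not the paper's algorithm; the trimmed-mean aggregation and all names below are placeholders we assume for the sketch), each agent fuses its neighbors' estimates through a fault-tolerant aggregation rule and then takes a step along its local gradient:

    import numpy as np

    def trimmed_mean(vectors, f):
        """Coordinate-wise trimmed mean: drop the f largest and f smallest
        entries in each coordinate, then average what remains."""
        arr = np.sort(np.stack(vectors), axis=0)
        return arr[f:arr.shape[0] - f].mean(axis=0)

    def p2p_gd_step(x_i, neighbor_estimates, local_grad, f, step_size=0.01):
        """One peer-to-peer iteration at agent i: robustly aggregate the
        estimates received from neighbors (up to f of which may be
        Byzantine), then descend along the local gradient."""
        z_i = trimmed_mean([x_i] + list(neighbor_estimates), f)
        return z_i - step_size * np.asarray(local_grad)

The point of the sketch is only the structure (aggregate, then descend); the paper's contribution concerns which aggregation rules provably tolerate Byzantine peers.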

Distributed Optimization • Distributed, Parallel, and Cluster Computing

Byzantine Fault-Tolerant Distributed Machine Learning Using Stochastic Gradient Descent (SGD) and Norm-Based Comparative Gradient Elimination (CGE)

no code implementations • 11 Aug 2020 • Nirupam Gupta, Shuo Liu, Nitin H. Vaidya

We show that the CGE gradient-filter guarantees fault-tolerance against a bounded fraction of Byzantine agents under standard stochastic assumptions, and is computationally simpler than many existing gradient-filters such as multi-KRUM, geometric median-of-means, and the spectral filters.
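From the description above, the CGE filter admits a very short sketch (illustrative only; variable names are ours): sort the received gradients by Euclidean norm, discard the f largest-norm ones, and average the rest.

    import numpy as np

    def cge_filter(gradients, f):
        """Comparative Gradient Elimination (sketch): eliminate the f
        gradients with the largest Euclidean norms, average the rest."""
        norms = np.array([np.linalg.norm(g) for g in gradients])
        keep = np.argsort(norms)[: len(gradients) - f]  # smallest-norm gradients
        return np.mean([gradients[i] for i in keep], axis=0)

    # Typical use in an SGD step (eta and the received gradients are assumed inputs):
    # x = x - eta * cge_filter(received_gradients, f)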

Randomized Reactive Redundancy for Byzantine Fault-Tolerance in Parallelized Learning

no code implementations • 19 Dec 2019 • Nirupam Gupta, Nitin H. Vaidya

The coding schemes use the concept of reactive redundancy for isolating Byzantine workers that eventually send faulty information.

Byzantine Fault Tolerant Distributed Linear Regression

no code implementations • 20 Mar 2019 • Nirupam Gupta, Nitin H. Vaidya

This paper considers the problem of Byzantine fault tolerance in distributed linear regression in a multi-agent system.

Distributed Optimization • regression

Private Learning on Networks: Part II

no code implementations • 27 Mar 2017 • Shripad Gade, Nitin H. Vaidya

This paper considers a distributed multi-agent optimization problem, with the global objective consisting of the sum of local objective functions of the agents.
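Concretely, with $n$ agents and local objectives $f_1, \dots, f_n$ (notation assumed here for illustration), the problem is

    \[
      \min_{x \in \mathbb{R}^d} \; F(x) = \sum_{i=1}^{n} f_i(x),
    \]

where each agent $i$ has access only to its own $f_i$ and must cooperate with the others to minimize $F$.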

Distributed Optimization

Private Learning on Networks

no code implementations • 15 Dec 2016 • Shripad Gade, Nitin H. Vaidya

In a distributed machine learning scenario, the dataset is stored across several machines, which solve a distributed optimization problem to collectively learn the underlying model.

BIG-bench Machine Learning • Distributed Optimization +1

Distributed Optimization of Convex Sum of Non-Convex Functions

no code implementations • 18 Aug 2016 • Shripad Gade, Nitin H. Vaidya

We present a distributed solution for optimizing a convex function that is a sum of several non-convex functions.
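That is, the aggregate $F(x) = \sum_i f_i(x)$ is assumed convex even though the individual $f_i$ need not be. A toy example of such a decomposition (ours, not the paper's):

    \[
      f_1(x) = x^2 + x^3, \qquad f_2(x) = x^2 - x^3, \qquad
      F(x) = f_1(x) + f_2(x) = 2x^2 .
    \]

Each $f_i$ is non-convex on $\mathbb{R}$, yet their sum $F$ is convex.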

Distributed Optimization

Defending Non-Bayesian Learning against Adversarial Attacks

no code implementations • 28 Jun 2016 • Lili Su, Nitin H. Vaidya

This paper addresses the problem of non-Bayesian learning over multi-agent networks, where agents repeatedly collect partially informative observations about an unknown state of the world, and try to collaboratively learn the true state.
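For context, a standard fault-free update rule in the non-Bayesian learning literature (stated here only as background; it is not necessarily the exact rule or defense analyzed in the paper) has agent $i$ combine a local Bayesian update with a geometric average of its neighbors' beliefs:

    \[
      \mu_{i,t+1}(\theta) \;\propto\;
      \ell_i\bigl(s_{i,t+1} \mid \theta\bigr)
      \prod_{j \in N_i} \mu_{j,t}(\theta)^{w_{ij}},
    \]

where $\mu_{i,t}$ is agent $i$'s belief over candidate states $\theta$, $s_{i,t+1}$ its new observation, $\ell_i$ its local likelihood model, and $w_{ij}$ the weights it places on neighbors $N_i$; defending against adversarial agents amounts to making the aggregation over neighbors robust.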
