Search Results for author: Arpita Gang

Found 4 papers, 1 paper with code

FAST-PCA: A Fast and Exact Algorithm for Distributed Principal Component Analysis

no code implementations · 27 Aug 2021 · Arpita Gang, Waheed U. Bajwa

While PCA is often thought of as a dimensionality reduction method, the purpose of PCA is actually two-fold: dimension reduction and uncorrelated feature learning.

Dimensionality Reduction
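
As a quick illustration of the two-fold purpose described in the abstract above, the sketch below runs plain (centralized) PCA on synthetic data and checks that the projected features come out uncorrelated. This is a generic illustration, not the FAST-PCA algorithm; the data, dimensions, and variable names are made up.

```python
import numpy as np

# Synthetic data: 1000 samples in 10 dimensions (illustrative only).
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 10)) @ rng.standard_normal((10, 10))
X -= X.mean(axis=0)                    # center the data

# Principal directions = eigenvectors of the sample covariance.
C = X.T @ X / (X.shape[0] - 1)
eigvals, eigvecs = np.linalg.eigh(C)   # eigenvalues in ascending order
U = eigvecs[:, ::-1][:, :3]            # top-3 principal directions

# Dimension reduction: 10-D samples become 3-D scores.
Z = X @ U

# Uncorrelated feature learning: the score covariance is (nearly) diagonal.
print(np.round(np.cov(Z, rowvar=False), 3))
```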

Distributed Principal Subspace Analysis for Partitioned Big Data: Algorithms, Analysis, and Implementation

no code implementations · 11 Mar 2021 · Arpita Gang, Bingqing Xiang, Waheed U. Bajwa

This has led to the study of distributed PSA/PCA solutions, in which the data are partitioned across multiple machines and an estimate of the principal subspace is obtained through collaboration among the machines.

Dimensionality Reduction
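
To make the partitioned setup above concrete, here is a toy sketch of one generic sample-partitioned approach: each machine forms a local covariance, the machines average these (simulated in a single process here, standing in for consensus/averaging over a network), and the principal subspace is estimated from the aggregate. This is not the algorithm analyzed in the paper; all names and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
d, k, n_per_machine, n_machines = 8, 2, 500, 4

# Sample-wise partitioned data: each machine holds its own block of samples.
local_data = [rng.standard_normal((n_per_machine, d)) for _ in range(n_machines)]

# Each machine computes a local (uncentered, for brevity) covariance estimate.
local_covs = [X.T @ X / X.shape[0] for X in local_data]

# Collaboration step: average the local covariances (a stand-in for
# consensus averaging over a network, simulated centrally here).
C_avg = sum(local_covs) / n_machines

# Estimate of the k-dimensional principal subspace from the aggregate.
eigvals, eigvecs = np.linalg.eigh(C_avg)
subspace_estimate = eigvecs[:, ::-1][:, :k]   # d-by-k orthonormal basis
print(subspace_estimate.shape)
```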

A Linearly Convergent Algorithm for Distributed Principal Component Analysis

1 code implementation · 5 Jan 2021 · Arpita Gang, Waheed U. Bajwa

This paper focuses on the dual objective of PCA, namely, dimensionality reduction and decorrelation of features, but in a distributed setting.

Dimensionality Reduction
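
The difference between recovering a subspace and recovering decorrelated features can be seen in a small sketch: a generic distributed orthogonal iteration (again a stand-in, not the paper's linearly convergent method) finds a basis of the dominant subspace, and an extra rotation by the eigenvectors of the small projected covariance is what makes the projected features uncorrelated. Dimensions and variable names are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
d, k, n_machines, n_iters = 8, 2, 4, 50

# Plant a common k-dimensional signal so the covariances share a dominant subspace.
signal_basis = np.linalg.qr(rng.standard_normal((d, k)))[0]

# Each machine holds a local covariance built from its own samples.
local_covs = []
for _ in range(n_machines):
    X = rng.standard_normal((500, k)) @ (3.0 * signal_basis).T \
        + rng.standard_normal((500, d))
    local_covs.append(X.T @ X / X.shape[0])

# Generic distributed orthogonal iteration (illustrative only): multiply by the
# local covariances, average the results (a stand-in for network consensus,
# simulated centrally here), then re-orthonormalize.
Q = np.linalg.qr(rng.standard_normal((d, k)))[0]
for _ in range(n_iters):
    Q = np.linalg.qr(sum(C @ Q for C in local_covs) / n_machines)[0]

# Q spans the dominant subspace; rotating it by the eigenvectors of the small
# projected covariance aligns it with individual principal directions,
# which is what decorrelates the projected features.
C_avg = sum(local_covs) / n_machines
_, R = np.linalg.eigh(Q.T @ C_avg @ Q)
Q_pca = Q @ R[:, ::-1]
print(np.round(Q_pca.T @ C_avg @ Q_pca, 3))   # approximately diagonal
```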

Adversary-resilient Distributed and Decentralized Statistical Inference and Machine Learning: An Overview of Recent Advances Under the Byzantine Threat Model

no code implementations · 23 Aug 2019 · Zhixiong Yang, Arpita Gang, Waheed U. Bajwa

While the last few decades have witnessed a huge body of work devoted to inference and learning in distributed and decentralized setups, much of this work assumes a non-adversarial setting in which individual nodes, apart from occasional statistical failures, operate as intended within the algorithmic framework.

Decision Making
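
To see why the Byzantine threat model matters, the toy sketch below compares plain averaging with a coordinate-wise median when a single node reports arbitrary values; it illustrates the general idea surveyed in the paper rather than any specific method from it, and all numbers are made up.

```python
import numpy as np

rng = np.random.default_rng(3)

# Honest nodes report noisy estimates of a true parameter vector.
true_param = np.array([1.0, -2.0, 0.5])
honest = true_param + 0.1 * rng.standard_normal((9, 3))

# One Byzantine node reports an arbitrary, adversarial value.
byzantine = np.array([[1e6, 1e6, 1e6]])
reports = np.vstack([honest, byzantine])

# Plain averaging is ruined by a single Byzantine node...
print("mean:  ", np.round(reports.mean(axis=0), 2))
# ...while the coordinate-wise median stays close to the truth.
print("median:", np.round(np.median(reports, axis=0), 2))
```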
