Search Results for author: Nageen Himayat

Found 9 papers, 0 papers with code

Investigating the Adversarial Robustness of Density Estimation Using the Probability Flow ODE

no code implementations · 10 Oct 2023 · Marius Arvinte, Cory Cornelius, Jason Martin, Nageen Himayat

Beyond their impressive sampling capabilities, score-based diffusion models offer a powerful analysis tool in the form of unbiased density estimation of a query sample under the training data distribution.

Adversarial Robustness · Density Estimation

Resource-Efficient Federated Hyperdimensional Computing

no code implementations · 2 Jun 2023 · Nikita Zeulin, Olga Galinina, Nageen Himayat, Sergey Andreev

In conventional federated hyperdimensional computing (HDC), training larger models usually results in higher predictive performance but also requires more computational, communication, and energy resources.
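The dimensionality/resource trade-off the snippet describes can be seen in a minimal (non-federated, purely illustrative) HDC classifier: samples are encoded by bundling random bipolar hypervectors, class prototypes are bundled encodings, and prediction is a similarity lookup. The feature layout and parameters below are hypothetical, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000       # hypervector dimensionality: larger D separates classes
n_features = 32  # better, but costs more compute, memory, and communication

# One random bipolar "item" hypervector per input feature (toy encoder).
item = rng.choice([-1, 1], size=(n_features, D))

def encode(x):
    # Bundle (sum) the hypervectors of the active features, then binarize.
    h = item[x.astype(bool)].sum(axis=0)
    return np.sign(h + 0.5)  # break ties toward +1

# Two toy classes with disjoint typical feature patterns plus noise.
def sample(cls):
    x = np.zeros(n_features)
    for i in (range(0, 16) if cls == 0 else range(16, 32)):
        if rng.random() < 0.8:
            x[i] = 1.0
    return x

# Class prototypes: bundle the encodings of training samples.
protos = [np.sign(sum(encode(sample(c)) for _ in range(50))) for c in (0, 1)]

def classify(x):
    h = encode(x)
    return int(np.argmax([h @ p for p in protos]))

acc = np.mean([classify(sample(c)) == c for c in (0, 1) for _ in range(100)])
print(acc)  # high accuracy at D = 10_000; shrinking D degrades it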

Multi-Task Model Personalization for Federated Supervised SVM in Heterogeneous Networks

no code implementations · 17 Mar 2023 · Aleksei Ponomarenko-Timofeev, Olga Galinina, Ravikumar Balakrishnan, Nageen Himayat, Sergey Andreev, Yevgeni Koucheryavy

The proposed method utilizes efficient computations and model exchange in a network of heterogeneous nodes and allows personalization of the learning model in the presence of non-i.i.d. data.

Multi-Task Learning · regression

Streaming Encoding Algorithms for Scalable Hyperdimensional Computing

no code implementations · 20 Sep 2022 · Anthony Thomas, Behnam Khaleghi, Gopi Krishna Jha, Sanjoy Dasgupta, Nageen Himayat, Ravi Iyer, Nilesh Jain, Tajana Rosing

Hyperdimensional computing (HDC) is a paradigm for data representation and learning originating in computational neuroscience.

Dynamic Network-Assisted D2D-Aided Coded Distributed Learning

no code implementations · 26 Nov 2021 · Nikita Zeulin, Olga Galinina, Nageen Himayat, Sergey Andreev, Robert W. Heath Jr.

Today, various machine learning (ML) applications offer continuous data processing and real-time data analytics at the edge of a wireless network.

Federated Learning

Diverse Client Selection for Federated Learning via Submodular Maximization

no code implementations · ICLR 2022 · Ravikumar Balakrishnan, Tian Li, Tianyi Zhou, Nageen Himayat, Virginia Smith, Jeff Bilmes

In every communication round of federated learning, a random subset of clients communicates its model updates back to the server, which then aggregates them all.

Fairness · Federated Learning
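The paper's title indicates that random client selection is replaced by maximizing a submodular diversity objective. A common instance is the facility-location function G(S) = Σᵢ maxⱼ∈S sim(i, j) over pairwise similarities of client updates, optimized greedily. The sketch below is a generic illustration of that technique under assumed random "update" vectors, not the paper's actual algorithm or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for per-client update vectors (hypothetical data).
n_clients, dim = 20, 8
updates = rng.normal(size=(n_clients, dim))

# Pairwise similarities between client updates. (These can be negative here;
# practical facility-location formulations typically use a nonnegative kernel.)
sim = updates @ updates.T

def greedy_facility_location(sim, k):
    """Greedily maximize G(S) = sum_i max_{j in S} sim[i, j], a monotone
    submodular objective for nonnegative sim: each round, add the candidate
    with the largest marginal gain (classic 1 - 1/e greedy guarantee)."""
    n = sim.shape[0]
    selected = []
    best = np.full(n, -np.inf)  # best similarity of each client to the set so far
    for _ in range(k):
        # Objective value if candidate j were added (constant offset dropped).
        totals = np.maximum(sim, best[:, None]).sum(axis=0)
        totals[selected] = -np.inf  # no repeats
        j = int(np.argmax(totals))
        selected.append(j)
        best = np.maximum(best, sim[:, j])
    return selected

chosen = greedy_facility_location(sim, k=5)
print(chosen)  # a diverse subset of clients to poll this round
```

Intuitively, each newly chosen client must "cover" clients not yet well represented by the current subset, which is what makes the selected set diverse rather than redundant.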

Coded Computing for Low-Latency Federated Learning over Wireless Edge Networks

no code implementations · 12 Nov 2020 · Saurav Prakash, Sagar Dhakal, Mustafa Akdeniz, Yair Yona, Shilpa Talwar, Salman Avestimehr, Nageen Himayat

To minimize the epoch deadline time at the MEC server, we provide a tractable approach for choosing the amount of coding redundancy and the number of local data points each client processes during training, by exploiting the statistical properties of compute and communication delays.

Edge-computing · Federated Learning
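The core trade-off behind coded computing, as the snippet describes it, is that adding coding redundancy lets the server stop waiting for stragglers at the cost of extra per-client work. A Monte Carlo sketch of that trade-off under an assumed shifted-exponential delay model (all parameters illustrative, not the paper's optimized values):

```python
import numpy as np

rng = np.random.default_rng(1)

# n clients; per-unit-work delay is shifted-exponential, a common model for
# combined compute and communication time.
n, k, trials = 20, 15, 5_000
shift, rate = 1.0, 1.0

def delays(work):
    # Time for each client to finish `work` units of processing in one epoch.
    return work * (shift + rng.exponential(1.0 / rate, size=(trials, n)))

# Uncoded: each client processes 1 unit; the server must wait for ALL n.
t_uncoded = delays(1.0).max(axis=1).mean()

# Coded: data is MDS-style encoded so any k of n partial results suffice,
# at the price of n/k redundant work per client; the server only waits for
# the k-th fastest response.
t_coded = np.sort(delays(n / k), axis=1)[:, k - 1].mean()

print(t_uncoded, t_coded)  # coding shortens the wait on the slowest stragglers
```

Choosing the redundancy (here, the ratio n/k) to minimize the expected epoch time, given the delay statistics, is exactly the kind of optimization the paper makes tractable.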

Coded Computing for Federated Learning at the Edge

no code implementations · 7 Jul 2020 · Saurav Prakash, Sagar Dhakal, Mustafa Akdeniz, A. Salman Avestimehr, Nageen Himayat

Federated Learning (FL) is an exciting new paradigm that enables training a global model from data generated locally at the client nodes, without moving client data to a centralized server.

Edge-computing · Federated Learning +1

Coded Federated Learning

no code implementations · 21 Feb 2020 · Sagar Dhakal, Saurav Prakash, Yair Yona, Shilpa Talwar, Nageen Himayat

Here, model parameters are computed locally by each client device and exchanged with a central server, which aggregates the local models for a global view, without requiring sharing of training data.

Federated Learning
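The local-compute / exchange / aggregate loop described in the snippet above is the standard federated averaging pattern: each client updates the model on its own data, and the server forms the global model as a data-size-weighted average, so raw training data never leaves the clients. A minimal sketch with linear-regression clients as a stand-in model (the setup is illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Local training: a few gradient-descent steps on the client's own data.
def local_update(w, X, y, lr=0.1, steps=10):
    for _ in range(steps):
        w = w - lr * X.T @ (X @ w - y) / len(y)
    return w

# Synthetic clients, each holding private (X, y) drawn from a shared model.
dim = 3
w_true = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(5):
    X = rng.normal(size=(40, dim))
    clients.append((X, X @ w_true + 0.01 * rng.normal(size=40)))

# Federated averaging rounds: local training, then weighted aggregation.
w_global = np.zeros(dim)
for _round in range(20):
    local_models = [local_update(w_global, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    w_global = np.average(local_models, axis=0, weights=sizes)  # aggregation

print(w_global)  # approaches w_true without any client sharing its data
```

The coded variant in the paper additionally shares coded (privacy-preserving, redundancy-bearing) data with the server so it can compensate for stragglers; that machinery is beyond this sketch.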
