no code implementations • 28 Oct 2022 • Seungeun Oh, Jihong Park, Sihun Baek, Hyelin Nam, Praneeth Vepakomma, Ramesh Raskar, Mehdi Bennis, Seong-Lyun Kim
Split learning (SL) detours this by communicating smashed data at a cut layer, yet it suffers from data privacy leakage and large communication costs caused by the high similarity between the ViT's smashed data and its input data.
no code implementations • 8 Aug 2022 • Saiteja Utpala, Praneeth Vepakomma, Nina Miolane
In that spirit, the only geometric statistical query for which a differential privacy mechanism has been developed so far is the release of the sample Fréchet mean: the Riemannian Laplace mechanism was recently proposed to privatize the Fréchet mean on complete Riemannian manifolds.
no code implementations • 8 Jul 2022 • Praneeth Vepakomma, Mohammad Mohammadi Amiri, Clément L. Canonne, Ramesh Raskar, Alex Pentland
We introduce $\pi$-test, a privacy-preserving algorithm for testing statistical independence between data distributed across multiple parties.
no code implementations • 1 Jul 2022 • Sihun Baek, Jihong Park, Praneeth Vepakomma, Ramesh Raskar, Mehdi Bennis, Seong-Lyun Kim
Leveraging this, we develop a novel SL framework for ViT, coined CutMixSL, communicating CutSmashed data.
no code implementations • 17 Mar 2022 • Abhishek Singh, Ethan Garza, Ayush Chopra, Praneeth Vepakomma, Vivek Sharma, Ramesh Raskar
While released datasets continue to make a big impact across applications of computer vision, that impact is mostly realized when data sharing is not inhibited by privacy concerns.
no code implementations • 11 Dec 2021 • Shraman Pal, Mansi Uniyal, Jihong Park, Praneeth Vepakomma, Ramesh Raskar, Mehdi Bennis, Moongu Jeon, Jinho Choi
In recent years, there have been great advances in the field of decentralized learning with private data.
no code implementations • 2 Dec 2021 • Ayush Chopra, Surya Kant Sahu, Abhishek Singh, Abhinav Java, Praneeth Vepakomma, Vivek Sharma, Ramesh Raskar
In this work, we introduce AdaSplit which enables efficiently scaling SL to low resource scenarios by reducing bandwidth consumption and improving performance across heterogeneous clients.
no code implementations • 19 Oct 2021 • Praneeth Vepakomma, Subha Nawer Pushpita, Ramesh Raskar
We introduce a differentially private method to measure nonlinear correlations between sensitive data hosted across two entities.
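The generic building block behind this kind of private release is the Laplace mechanism. A minimal sketch of that mechanism — not the paper's actual two-party protocol; the function name and the bounded-statistic sensitivity in the example are illustrative assumptions — could look like:

```python
import numpy as np

def laplace_release(stat, sensitivity, epsilon, rng=None):
    """Release a statistic with epsilon-differential privacy.

    The Laplace mechanism adds noise with scale sensitivity / epsilon,
    so a lower epsilon (stronger privacy) yields a noisier output.
    """
    rng = np.random.default_rng() if rng is None else rng
    return stat + rng.laplace(scale=sensitivity / epsilon)

# Example: privatize a correlation-like statistic bounded in [0, 1].
# If changing one of n records moves it by at most 2/n, sensitivity = 2/n.
n = 1000
noisy_corr = laplace_release(stat=0.42, sensitivity=2 / n, epsilon=1.0)
```

The per-record sensitivity bound is what a concrete method like the paper's would need to derive for its specific correlation statistic.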
no code implementations • 29 Sep 2021 • Abhishek Singh, Ethan Garza, Ayush Chopra, Praneeth Vepakomma, Vivek Sharma, Ramesh Raskar
This is done in a two-step process: first, we develop a method that encodes unstructured image-like modality into a structured representation bifurcated by sensitive and non-sensitive representation.
no code implementations • 19 Aug 2021 • Praneeth Vepakomma, Yulia Kempner, Ramesh Raskar
We provide a parallel algorithm with a time complexity over $n$ processors of $\mathcal{O}(n^2 g) + \mathcal{O}(\log\log n)$, where $n$ is the cardinality of the ground set and $g$ is the complexity of computing the monotone linkage function that induces a corresponding quasi-concave set function via a duality.
no code implementations • 2 May 2021 • Yusuke Koda, Jihong Park, Mehdi Bennis, Praneeth Vepakomma, Ramesh Raskar
In AirMixML, multiple workers transmit analog-modulated signals of their private data samples to an edge server, which trains an ML model using the received noisy and superpositioned samples.
no code implementations • 22 Feb 2021 • Praneeth Vepakomma, Julia Balla, Ramesh Raskar
1) We present a novel differentially private method, PrivateMail, for supervised manifold learning, the first of its kind to our knowledge.
no code implementations • CVPR 2021 • Abhishek Singh, Ayush Chopra, Vivek Sharma, Ethan Garza, Emily Zhang, Praneeth Vepakomma, Ramesh Raskar
Recent deep learning models have shown remarkable performance in image classification.
1 code implementation • 20 Aug 2020 • Praneeth Vepakomma, Abhishek Singh, Otkrist Gupta, Ramesh Raskar
For distributed machine learning with sensitive data, we demonstrate how minimizing distance correlation between raw data and intermediary representations reduces leakage of sensitive raw data patterns across client communications while maintaining model accuracy.
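The quantity being minimized here is distance correlation. A minimal NumPy sketch of the sample (V-statistic) version of that measure — an illustration, not the paper's training code — is:

```python
import numpy as np

def _centered_dists(x):
    # Pairwise Euclidean distance matrix, double-centered
    # (subtract row and column means, add back the grand mean).
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return d - d.mean(axis=0) - d.mean(axis=1, keepdims=True) + d.mean()

def distance_correlation(x, y):
    # Sample distance correlation in [0, 1]: near 0 for independent
    # samples (asymptotically), exactly 1 for affine relationships.
    a, b = _centered_dists(x), _centered_dists(y)
    dcov2 = (a * b).mean()
    denom = np.sqrt((a * a).mean() * (b * b).mean())
    return 0.0 if denom == 0 else float(np.sqrt(dcov2 / denom))
```

In a NoPeek-style setup, this statistic between raw inputs and intermediate activations would be added to the task loss as a decorrelation penalty, trading a little accuracy for reduced leakage.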
no code implementations • 7 Aug 2020 • Iker Ceballos, Vivek Sharma, Eduardo Mugica, Abhishek Singh, Alberto Roman, Praneeth Vepakomma, Ramesh Raskar
In this work, we introduce SplitNN-driven Vertical Partitioning, a configuration of a distributed deep learning method called SplitNN to facilitate learning from vertically distributed features.
4 code implementations • 27 Jul 2020 • Chaoyang He, Songze Li, Jinhyun So, Xiao Zeng, Mi Zhang, Hongyi Wang, Xiaoyang Wang, Praneeth Vepakomma, Abhishek Singh, Hang Qiu, Xinghua Zhu, Jianzong Wang, Li Shen, Peilin Zhao, Yan Kang, Yang Liu, Ramesh Raskar, Qiang Yang, Murali Annavaram, Salman Avestimehr
Federated learning (FL) is a rapidly growing research field in machine learning.
no code implementations • 6 Jul 2020 • Praneeth Vepakomma, Julia Balla, Ramesh Raskar
Performing computations while maintaining privacy is an important problem in today's distributed machine learning solutions.
no code implementations • 25 Apr 2020 • Fatemehsadat Mireshghallah, Mohammadkazem Taram, Praneeth Vepakomma, Abhishek Singh, Ramesh Raskar, Hadi Esmaeilzadeh
In this survey, we review the privacy concerns brought by deep learning, and the mitigating techniques introduced to tackle these issues.
1 code implementation • 19 Mar 2020 • Ramesh Raskar, Isabel Schunemann, Rachel Barbar, Kristen Vilcans, Jim Gray, Praneeth Vepakomma, Suraj Kapa, Andrea Nuzzo, Rajiv Gupta, Alex Berke, Dazza Greenwood, Christian Keegan, Shriank Kanaparti, Robson Beaudry, David Stansbury, Beatriz Botero Arcila, Rishank Kanaparti, Francesco M Benedetti, Alina Clough, Riddhiman Das, Kaushal Jain, Khahlil Louisy, Greg Nadeau, Vitor Pamplona, Steve Penrod, Yasaman Rajaee, Abhishek Singh, Greg Storm, John Werner
Containment, the key strategy in quickly halting an epidemic, requires rapid identification and quarantine of the infected individuals, determination of whom they have had close contact with in the previous days and weeks, and decontamination of locations the infected individual has visited.
Cryptography and Security • Computers and Society • Distributed, Parallel, and Cluster Computing
no code implementations • 27 Dec 2019 • Maarten G. Poirot, Praneeth Vepakomma, Ken Chang, Jayashree Kalpathy-Cramer, Rajiv Gupta, Ramesh Raskar
Shortage of labeled data has been holding back the surge of deep learning in healthcare, as sample sizes are often small, patient information cannot be shared openly, and multi-center collaborative studies are a burden to set up.
7 code implementations • 10 Dec 2019 • Peter Kairouz, H. Brendan McMahan, Brendan Avent, Aurélien Bellet, Mehdi Bennis, Arjun Nitin Bhagoji, Kallista Bonawitz, Zachary Charles, Graham Cormode, Rachel Cummings, Rafael G. L. D'Oliveira, Hubert Eichner, Salim El Rouayheb, David Evans, Josh Gardner, Zachary Garrett, Adrià Gascón, Badih Ghazi, Phillip B. Gibbons, Marco Gruteser, Zaid Harchaoui, Chaoyang He, Lie He, Zhouyuan Huo, Ben Hutchinson, Justin Hsu, Martin Jaggi, Tara Javidi, Gauri Joshi, Mikhail Khodak, Jakub Konečný, Aleksandra Korolova, Farinaz Koushanfar, Sanmi Koyejo, Tancrède Lepoint, Yang Liu, Prateek Mittal, Mehryar Mohri, Richard Nock, Ayfer Özgür, Rasmus Pagh, Mariana Raykova, Hang Qi, Daniel Ramage, Ramesh Raskar, Dawn Song, Weikang Song, Sebastian U. Stich, Ziteng Sun, Ananda Theertha Suresh, Florian Tramèr, Praneeth Vepakomma, Jianyu Wang, Li Xiong, Zheng Xu, Qiang Yang, Felix X. Yu, Han Yu, Sen Zhao
FL embodies the principles of focused data collection and minimization, and can mitigate many of the systemic privacy risks and costs resulting from traditional, centralized machine learning and data science approaches.
no code implementations • 9 Oct 2019 • Vivek Sharma, Praneeth Vepakomma, Tristan Swedish, Ken Chang, Jayashree Kalpathy-Cramer, Ramesh Raskar
Recently, Split Learning has been developed: a framework for distributed computation where model components are split between the client and server (Vepakomma et al., 2018b).
no code implementations • 5 Oct 2019 • Vivek Sharma, Praneeth Vepakomma, Tristan Swedish, Ken Chang, Jayashree Kalpathy-Cramer, Ramesh Raskar
In this work we introduce ExpertMatcher, a method for automating deep learning model selection using autoencoders.
no code implementations • 27 Sep 2019 • Indu Ilanchezian, Praneeth Vepakomma, Abhishek Singh, Otkrist Gupta, G. N. Srinivasa Prasanna, Ramesh Raskar
In this paper we investigate the use of adversarial perturbations for privacy against both human perception and model (machine) based detection.
no code implementations • 18 Sep 2019 • Abhishek Singh, Praneeth Vepakomma, Otkrist Gupta, Ramesh Raskar
We compare the communication efficiencies of two compelling distributed machine learning approaches: split learning and federated learning.
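The trade-off such a comparison studies can be sketched with a back-of-envelope cost model. The parameter names and the 4-byte-per-value constant below are my assumptions for illustration, not the paper's exact accounting:

```python
# Per-client communication cost, in bytes (illustrative model only).

def fl_comm_per_client(model_params, rounds, bytes_per_param=4):
    # Federated learning: each round a client downloads and uploads
    # the full model, so cost scales with model size x rounds.
    return 2 * model_params * rounds * bytes_per_param

def sl_comm_per_client(samples, activation_size, epochs, bytes_per_val=4):
    # Split learning: per sample per epoch a client sends cut-layer
    # activations forward and receives their gradients back, so cost
    # scales with data volume x cut-layer width.
    return 2 * samples * activation_size * epochs * bytes_per_val
```

Under this model, the cheaper method depends on the regime: federated learning's cost is insensitive to local data volume, while split learning's is insensitive to total model size.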
no code implementations • 14 May 2019 • Ramesh Raskar, Praneeth Vepakomma, Tristan Swedish, Aalekh Sharan
We discuss a data market technique based on intrinsic (relevance and uniqueness) as well as extrinsic value (influenced by supply and demand) of data.
no code implementations • 8 Dec 2018 • Praneeth Vepakomma, Tristan Swedish, Ramesh Raskar, Otkrist Gupta, Abhimanyu Dubey
We survey distributed deep learning models for training or inference without accessing raw data from clients.
1 code implementation • 6 Dec 2018 • Sai Sri Sathya, Praneeth Vepakomma, Ramesh Raskar, Ranjan Ramachandra, Santanu Bhattacharya
In this paper we provide a survey of various libraries for homomorphic encryption.
Cryptography and Security
1 code implementation • 3 Dec 2018 • Praneeth Vepakomma, Otkrist Gupta, Tristan Swedish, Ramesh Raskar
Can health entities collaboratively train deep learning models without sharing sensitive raw data?
no code implementations • 28 Dec 2016 • Susovan Pal, Praneeth Vepakomma
We provide a way to infer the existence of topological circularity in high-dimensional data sets in $\mathbb{R}^d$ from their projections in $\mathbb{R}^2$, obtained through a fast manifold learning map as a function of the high-dimensional dataset $\mathbb{X}$ and a particular choice of a positive real $\sigma$ known as the bandwidth parameter.
no code implementations • 3 Jan 2016 • Praneeth Vepakomma, Chetan Tonde, Ahmed Elgammal
In our work, we propose a novel formulation for supervised dimensionality reduction based on a nonlinear dependency criterion called Statistical Distance Correlation (Székely et al.).
no code implementations • 11 Jun 2013 • Praneeth Vepakomma, Ahmed Elgammal
Our setting is different from subset-selection algorithms where the problem is to choose the best subset of features for regression.