Search Results for author: Parijat Dube

Found 12 papers, 1 paper with code

Using sequential drift detection to test the API economy

no code implementations · 9 Nov 2021 · Samuel Ackerman, Parijat Dube, Eitan Farchi

It is thus desirable to monitor usage patterns and identify when the system is being used in ways it has not been used before.

Machine Learning Model Drift Detection Via Weak Data Slices

no code implementations · 11 Aug 2021 · Samuel Ackerman, Parijat Dube, Eitan Farchi, Orna Raz, Marcel Zalmanovici

Detecting drift in performance of Machine Learning (ML) models is an acknowledged challenge.

Adversarial training in communication constrained federated learning

no code implementations · 1 Mar 2021 · Devansh Shah, Parijat Dube, Supriyo Chakraborty, Ashish Verma

We observe a significant drop in both natural and adversarial accuracies when AT is used in the federated setting as opposed to centralized training.

Federated Learning

Detection of data drift and outliers affecting machine learning model performance over time

no code implementations · 16 Dec 2020 · Samuel Ackerman, Eitan Farchi, Orna Raz, Marcel Zalmanovici, Parijat Dube

Drift is a distribution change between the training and deployment data, which is concerning if it affects model performance.

Sequential Drift Detection in Deep Learning Classifiers

no code implementations · 31 Jul 2020 · Samuel Ackerman, Parijat Dube, Eitan Farchi

We utilize neural network embeddings to detect data drift by formulating the drift detection within an appropriate sequential decision framework.

Change Detection
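The sequential decision framing described above can be illustrated with a classical one-sided CUSUM change detector; this is a simplified stand-in for the paper's method, run on a toy one-dimensional stream (in the paper the monitored statistic is derived from neural network embeddings, and all parameters below are illustrative assumptions):

```python
import numpy as np

def cusum_drift_detector(stream, mean_ref, drift_magnitude=1.0, threshold=8.0):
    """One-sided CUSUM sequential test: return the first index at which the
    accumulated evidence of an upward mean shift exceeds `threshold`,
    or -1 if no drift is flagged."""
    s = 0.0
    for t, x in enumerate(stream):
        # accumulate deviations above the reference, less half the target shift
        s = max(0.0, s + (x - mean_ref) - drift_magnitude / 2.0)
        if s > threshold:
            return t
    return -1

# Toy 1-D stream standing in for an embedding statistic:
# in-distribution samples, followed by a mean shift (drift).
rng = np.random.default_rng(0)
stream = np.concatenate([rng.normal(0.0, 1.0, 200),   # pre-drift
                         rng.normal(1.5, 1.0, 200)])  # post-drift
alarm = cusum_drift_detector(stream, mean_ref=0.0)
print("drift flagged at index:", alarm)
```

The sequential formulation trades off detection delay against false-alarm rate through `threshold`, rather than testing each batch in isolation.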

Improving the affordability of robustness training for DNNs

no code implementations · 11 Feb 2020 · Sidharth Gupta, Parijat Dube, Ashish Verma

Projected Gradient Descent (PGD) based adversarial training has become one of the most prominent methods for building robust deep neural network models.
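The PGD attack underlying this kind of adversarial training can be sketched on a linear logistic model, where the input gradient has a closed form; this is a minimal illustration of the general technique, not the paper's cost-reduction method, and the model, data, and hyperparameters are assumptions:

```python
import numpy as np

def pgd_attack(x, y, w, eps=0.3, alpha=0.1, steps=10):
    """PGD attack on a linear logistic model score = w . x, label y in {-1, +1}.

    Repeatedly steps the input in the loss-increasing direction, then
    projects back into the L-infinity ball of radius eps around x."""
    x_adv = x.copy()
    for _ in range(steps):
        margin = y * np.dot(w, x_adv)
        grad_x = -y * w / (1.0 + np.exp(margin))  # d(log-loss)/dx
        x_adv = x_adv + alpha * np.sign(grad_x)   # gradient-sign ascent step
        x_adv = np.clip(x_adv, x - eps, x + eps)  # project onto the eps-ball
    return x_adv

# Adversarial training then fits the model on attacked inputs instead of
# clean ones: w is updated with the loss gradient evaluated at x_adv.
w = np.array([1.0, -2.0])
x, y = np.array([0.5, 0.5]), 1
x_adv = pgd_attack(x, y, w)
margin = y * np.dot(w, x_adv)
w = w - 0.1 * (-y * x_adv / (1.0 + np.exp(margin)))  # one training step
```

Running the inner attack at every training step is what makes PGD-based adversarial training expensive, which is the cost this paper targets.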

P2L: Predicting Transfer Learning for Images and Semantic Relations

no code implementations · 20 Aug 2019 · Bishwaranjan Bhattacharjee, John R. Kender, Matthew Hill, Parijat Dube, Siyu Huo, Michael R. Glass, Brian Belgodere, Sharath Pankanti, Noel Codella, Patrick Watson

We use this measure, which we call "Predict To Learn" ("P2L"), in the two very different domains of images and semantic relations, where it predicts, from a set of "source" models, the one model most likely to produce effective transfer for training a given "target" model.

Transfer Learning

Automatic Labeling of Data for Transfer Learning

no code implementations · 24 Mar 2019 · Parijat Dube, Bishwaranjan Bhattacharjee, Siyu Huo, Patrick Watson, John Kender, Brian Belgodere

Transfer learning uses trained weights from a source model as the initial weights for the training of a target dataset.

Transfer Learning
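The warm-start idea described above (source weights as the target model's initialization) can be sketched on a toy linear model; the data, learning rate, and step budgets below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def train_linear(X, y, w_init, lr=0.1, steps=100):
    """Gradient descent on mean squared error for a linear model y ~ X @ w."""
    w = w_init.copy()
    for _ in range(steps):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])

# "Source" task: plenty of data, trained to (near) convergence.
X_src = rng.normal(size=(100, 2))
w_source = train_linear(X_src, X_src @ w_true, np.zeros(2))

# Related "target" task: little data and a short training budget.
X_tgt = rng.normal(size=(20, 2))
y_tgt = X_tgt @ w_true
w_transfer = train_linear(X_tgt, y_tgt, w_source, steps=10)    # warm start
w_scratch = train_linear(X_tgt, y_tgt, np.zeros(2), steps=10)  # cold start
```

Because the source and target tasks are related, the warm-started run lands much closer to the target solution within the same small step budget.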

Improving Transferability of Deep Neural Networks

no code implementations · 30 Jul 2018 · Parijat Dube, Bishwaranjan Bhattacharjee, Elisabeth Petit-Bois, Matthew Hill

This is currently addressed by Transfer Learning, where one learns the small dataset as a transfer task from a larger source dataset.

Small Data Image Classification · Transfer Learning

Slow and Stale Gradients Can Win the Race: Error-Runtime Trade-offs in Distributed SGD

no code implementations · 3 Mar 2018 · Sanghamitra Dutta, Gauri Joshi, Soumyadip Ghosh, Parijat Dube, Priya Nagpurkar

Distributed Stochastic Gradient Descent (SGD), when run in a synchronous manner, suffers from delays in waiting for the slowest learners (stragglers).
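A small simulation makes the straggler effect concrete: with random per-learner compute times, a fully synchronous round costs the maximum over all learners, while a K-sync variant (one point in the error-runtime trade-off this line of work studies) waits only for the fastest K. The exponential timing model and parameters are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
P = 8          # number of learners
rounds = 1000
# Per-round, per-learner compute times; heavy right tail gives stragglers.
times = rng.exponential(scale=1.0, size=(rounds, P))

# Fully synchronous SGD: every round waits for the slowest learner.
sync_time = times.max(axis=1).mean()

# K-sync variant: each round proceeds once the fastest K learners finish,
# trading extra gradient variance/staleness for wall-clock speed.
K = 4
ksync_time = np.sort(times, axis=1)[:, K - 1].mean()

print(f"avg round time, full sync: {sync_time:.2f}")
print(f"avg round time, {K}-sync : {ksync_time:.2f}")
```

The gap between the two averages is the runtime cost of insisting on fully synchronized gradients, which the paper weighs against the error introduced by slow or stale updates.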

IBM Deep Learning Service

2 code implementations · 18 Sep 2017 · Bishwaranjan Bhattacharjee, Scott Boag, Chandani Doshi, Parijat Dube, Ben Herta, Vatche Ishakian, K. R. Jayaram, Rania Khalaf, Avesh Krishna, Yu Bo Li, Vinod Muthusamy, Ruchir Puri, Yufei Ren, Florian Rosenberg, Seetharami R. Seelam, Yandong Wang, Jian Ming Zhang, Li Zhang

Deep learning driven by large neural network models is overtaking traditional machine learning methods for understanding unstructured and perceptual data domains such as speech, text, and vision.

Distributed, Parallel, and Cluster Computing
