Search Results for author: Yogish Sabharwal

Found 8 papers, 2 papers with code

Efficient Inferencing of Compressed Deep Neural Networks

no code implementations · 1 Nov 2017 · Dharma Teja Vooturi, Saurabh Goyal, Anamitra R. Choudhury, Yogish Sabharwal, Ashish Verma

The large number of weights in deep neural networks makes the models difficult to deploy in low-memory environments such as mobile phones, IoT edge devices, and "inferencing as a service" environments on the cloud.

Quantization
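
The task tag points to weight quantization as the compression mechanism. As a rough illustration only (not the paper's method), here is a minimal sketch of uniform symmetric 8-bit quantization of a layer's weights in NumPy; the function names and scaling scheme are assumptions:

```python
import numpy as np

def quantize_uniform(w: np.ndarray, bits: int = 8):
    """Uniformly quantize a float weight tensor to signed integers.

    Returns the integer codes plus the scale needed to dequantize.
    """
    qmax = 2 ** (bits - 1) - 1            # e.g. 127 for 8 bits
    scale = np.abs(w).max() / qmax        # map the largest |weight| to qmax
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

# Example: a layer's weights shrink 4x (float32 -> int8) at a small error cost.
w = np.random.randn(256, 256).astype(np.float32)
q, s = quantize_uniform(w)
err = np.abs(w - dequantize(q, s)).max()
print(f"max reconstruction error: {err:.4f}")
```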

Effective Elastic Scaling of Deep Learning Workloads

no code implementations · 24 Jun 2020 · Vaibhav Saxena, K. R. Jayaram, Saurav Basu, Yogish Sabharwal, Ashish Verma

We design a fast dynamic-programming-based optimizer that solves this problem in real time to determine which jobs can be scaled up or down, and use it in an autoscaler to dynamically change the allocated resources and batch sizes of individual DL jobs.
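
As a rough illustration of the dynamic-programming idea (not the paper's actual optimizer or utility model), the sketch below solves a multiple-choice knapsack: each job offers a few candidate GPU counts with estimated throughputs, and the DP picks one option per job to maximize total throughput under a cluster GPU budget. All names and utility numbers are illustrative assumptions:

```python
def allocate_gpus(jobs, budget):
    """Multiple-choice knapsack DP: pick one GPU count per job to
    maximize total estimated throughput under a cluster GPU budget.

    jobs: list of {gpu_count: estimated_throughput} option maps.
    Assumes at least one feasible assignment exists (e.g. include a
    0-GPU option for jobs that may be scaled down to zero).
    Returns (total_utility, [gpus allocated per job]).
    """
    NEG = float("-inf")
    dp = [0.0] + [NEG] * budget          # dp[g]: best utility using exactly g GPUs
    choice = [[None] * (budget + 1) for _ in jobs]
    for j, options in enumerate(jobs):
        new = [NEG] * (budget + 1)
        for used in range(budget + 1):
            if dp[used] == NEG:
                continue
            for gpus, util in options.items():
                t = used + gpus
                if t <= budget and dp[used] + util > new[t]:
                    new[t] = dp[used] + util
                    choice[j][t] = (used, gpus)   # remember the transition
        dp = new
    best_g = max(range(budget + 1), key=dp.__getitem__)
    total, g = dp[best_g], best_g
    alloc = [0] * len(jobs)
    for j in reversed(range(len(jobs))):          # backtrack the choices
        g, alloc[j] = choice[j][g]
    return total, alloc

jobs = [
    {1: 100.0, 2: 180.0, 4: 300.0},   # job A: diminishing returns per GPU
    {0: 0.0, 1: 90.0, 2: 140.0},      # job B: may be scaled down to zero
]
print(allocate_gpus(jobs, budget=4))  # -> (320.0, [2, 2])
```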

Scheduling Resources for Executing a Partial Set of Jobs

no code implementations · 10 Oct 2012 · Venkatesan Chakaravarthy, Arindam Pal, Sambuddha Roy, Yogish Sabharwal

In this paper, we consider the problem of choosing a minimum cost set of resources for executing a specified set of jobs.

Data Structures and Algorithms
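
The paper studies a richer model with job intervals and resource capacities; as a simplified illustration of the min-cost selection flavor of the problem, the sketch below applies the classic greedy set-cover heuristic, where each resource has a cost and a set of jobs it can execute. The names and data are illustrative, not from the paper:

```python
def min_cost_cover(jobs, resources):
    """Greedy set-cover heuristic: repeatedly pick the resource with the
    lowest cost per newly covered job until every job is covered.

    jobs: set of job ids; resources: list of (cost, set_of_job_ids).
    Returns (total_cost, chosen resource indices); ~ln(n)-approximate.
    """
    uncovered = set(jobs)
    chosen, total = [], 0.0
    while uncovered:
        best = min(
            (i for i, (c, s) in enumerate(resources) if s & uncovered),
            key=lambda i: resources[i][0] / len(resources[i][1] & uncovered),
            default=None,
        )
        if best is None:
            raise ValueError("some jobs cannot be executed by any resource")
        cost, covers = resources[best]
        chosen.append(best)
        total += cost
        uncovered -= covers
    return total, chosen

jobs = {1, 2, 3, 4}
resources = [(3.0, {1, 2}), (2.0, {2, 3}), (4.0, {1, 3, 4}), (1.5, {4})]
print(min_cost_cover(jobs, resources))  # greedy may pay a small factor over optimal
```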

Efficient Scaling of Dynamic Graph Neural Networks

no code implementations · 16 Sep 2021 · Venkatesan T. Chakaravarthy, Shivmaran S. Pandian, Saurabh Raje, Yogish Sabharwal, Toyotaro Suzumura, Shashanka Ubaru

We present distributed algorithms for training dynamic Graph Neural Networks (GNNs) on large-scale graphs, spanning multi-node, multi-GPU systems.
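
One basic ingredient of distributing dynamic-GNN training is deciding which time snapshots of the evolving graph each rank processes. The sketch below shows a simple block partition of snapshots across ranks; this is an illustrative assumption, not the paper's actual partitioning or communication scheme:

```python
def partition_snapshots(num_snapshots, num_ranks):
    """Block-partition a dynamic graph's time snapshots across ranks so
    each rank trains its GNN replica on a contiguous time window.

    Returns rank -> (start, end) snapshot indices (end exclusive).
    """
    base, extra = divmod(num_snapshots, num_ranks)
    parts, start = {}, 0
    for r in range(num_ranks):
        size = base + (1 if r < extra else 0)   # spread the remainder
        parts[r] = (start, start + size)
        start += size
    return parts

# e.g. 10 snapshots over a 2-node x 2-GPU system (4 ranks total)
print(partition_snapshots(10, 4))  # {0: (0, 3), 1: (3, 6), 2: (6, 8), 3: (8, 10)}
```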

Compression of Deep Neural Networks by combining pruning and low rank decomposition

no code implementations · 20 Oct 2018 · Saurabh Goyal, Anamitra R Choudhury, Vivek Sharma, Yogish Sabharwal, Ashish Verma

The large number of weights in deep neural networks makes the models difficult to deploy in low-memory environments such as mobile phones, IoT edge devices, and "inferencing as a service" environments on the cloud.

Model Compression
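
As a rough illustration of combining the two techniques the title names (the ordering, hyperparameters, and function names are assumptions, not the paper's recipe), the sketch below magnitude-prunes a dense weight matrix and then stores a truncated-SVD factorization of the result:

```python
import numpy as np

def prune_and_factor(w, keep_frac=0.5, rank=32):
    """Compress a dense layer two ways: magnitude pruning (zero the
    smallest weights), then a truncated SVD so the layer is stored as
    two thin factors. Hyperparameters are illustrative."""
    # 1) magnitude pruning: keep only the largest |w| entries
    thresh = np.quantile(np.abs(w), 1.0 - keep_frac)
    pruned = np.where(np.abs(w) >= thresh, w, 0.0)
    # 2) low-rank decomposition: W ~= U @ V with the given rank
    u, s, vt = np.linalg.svd(pruned, full_matrices=False)
    U = u[:, :rank] * s[:rank]          # (m, r)
    V = vt[:rank, :]                    # (r, n)
    return U, V

w = np.random.randn(512, 512).astype(np.float32)
U, V = prune_and_factor(w)
orig = w.size                      # 512*512 parameters
comp = U.size + V.size             # 2*512*32 parameters
print(f"compression: {orig / comp:.1f}x, "
      f"relative error: {np.linalg.norm(w - U @ V) / np.linalg.norm(w):.3f}")
```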
