Search Results for author: Yury Pisarchyk

Found 2 papers, 1 paper with code

Efficient Memory Management for Deep Neural Net Inference

2 code implementations • 10 Jan 2020 • Yury Pisarchyk, Juhyun Lee

While deep neural net inference was long considered a task for servers only, recent advances in technology allow inference to be moved to mobile and embedded devices, which is desirable for various reasons ranging from latency to privacy.
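The paper studies how to reduce an inference engine's memory footprint by letting intermediate tensors with non-overlapping lifetimes share one arena. A minimal, simplified sketch of a greedy-by-size offset assignment is below; the `Tensor` record and `greedy_by_size` function are illustrative assumptions, not the paper's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Tensor:
    name: str
    size: int        # bytes needed for this intermediate tensor
    first_use: int   # index of the first op that produces/reads it
    last_use: int    # index of the last op that uses it

def greedy_by_size(tensors):
    """Assign byte offsets in one shared arena so that tensors whose
    lifetimes overlap never overlap in memory (largest tensors first)."""
    placed = []   # list of (offset, tensor) already assigned
    offsets = {}
    for t in sorted(tensors, key=lambda x: x.size, reverse=True):
        # Occupied intervals of tensors alive at the same time as t.
        busy = sorted(
            (off, off + p.size) for off, p in placed
            if not (p.last_use < t.first_use or t.last_use < p.first_use)
        )
        # Slide t past busy intervals until it fits in a gap.
        offset = 0
        for lo, hi in busy:
            if offset + t.size <= lo:
                break              # fits in the gap before this interval
            offset = max(offset, hi)
        offsets[t.name] = offset
        placed.append((offset, t))
    arena = max((off + t.size for off, t in placed), default=0)
    return offsets, arena
```

For example, tensors A (100 B, live ops 0–1), B (50 B, ops 1–2), and C (50 B, ops 2–3) fit in a 150-byte arena (A and C reuse the same offset) instead of the 200 bytes a naive per-tensor allocation would need.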


On-Device Neural Net Inference with Mobile GPUs

no code implementations • 3 Jul 2019 • Juhyun Lee, Nikolay Chirkov, Ekaterina Ignasheva, Yury Pisarchyk, Mogan Shieh, Fabio Riccardi, Raman Sarokin, Andrei Kulik, Matthias Grundmann

On-device inference of machine learning models for mobile phones is desirable due to its lower latency and increased privacy.
