Search Results for author: Joo Seong Jeong

Found 5 papers, 1 paper with code

Accelerating Multi-Model Inference by Merging DNNs of Different Weights

no code implementations28 Sep 2020 Joo Seong Jeong, Soojeong Kim, Gyeong-In Yu, Yunseong Lee, Byung-Gon Chun

Standardized DNN models that have been proven to perform well on machine learning tasks are widely used and often adopted as-is to solve downstream tasks, forming the transfer learning paradigm.

Transfer Learning

Hippo: Taming Hyper-parameter Optimization of Deep Learning with Stage Trees

no code implementations22 Jun 2020 Ahnjae Shin, Do Yoon Kim, Joo Seong Jeong, Byung-Gon Chun

Hyper-parameter optimization is crucial for pushing the accuracy of a deep learning model to its limits.

JANUS: Fast and Flexible Deep Learning via Symbolic Graph Execution of Imperative Programs

no code implementations4 Dec 2018 Eunji Jeong, Sungwoo Cho, Gyeong-In Yu, Joo Seong Jeong, Dong-Jin Shin, Byung-Gon Chun

The rapid evolution of deep neural networks demands that deep learning (DL) frameworks not only execute large computations quickly, but also support straightforward programming models for quickly implementing and experimenting with complex network structures.

Improving the Expressiveness of Deep Learning Frameworks with Recursion

no code implementations4 Sep 2018 Eunji Jeong, Joo Seong Jeong, Soojeong Kim, Gyeong-In Yu, Byung-Gon Chun

Recursive neural networks have been widely used by researchers to handle applications with recursively or hierarchically structured data.

Parallax: Automatic Data-Parallel Training of Deep Neural Networks

1 code implementation8 Aug 2018 Soojeong Kim, Gyeong-In Yu, Hojin Park, Sungwoo Cho, Eunji Jeong, Hyeonmin Ha, Sanha Lee, Joo Seong Jeong, Byung-Gon Chun

The employment of high-performance servers and GPU accelerators for training deep neural network models has greatly accelerated recent advances in machine learning (ML).

Distributed, Parallel, and Cluster Computing
