Search Results for author: Sunghyun Kim

Found 4 papers, 0 papers with code

Efficient Strong Scaling Through Burst Parallel Training

no code implementations · 19 Dec 2021 · Seo Jin Park, Joshua Fried, Sunghyun Kim, Mohammad Alizadeh, Adam Belay

As emerging deep neural network (DNN) models continue to grow in size, using large GPU clusters to train DNNs is becoming essential for achieving acceptable training times.

Match prediction from group comparison data using neural networks

no code implementations · 25 Sep 2019 · Sunghyun Kim, Minje Jang, Changho Suh

Because existing state-of-the-art algorithms are tailored to specific statistical models, the best-performing algorithm differs across scenarios.

Optimal Sample Complexity of M-wise Data for Top-K Ranking

no code implementations · NeurIPS 2017 · Minje Jang, Sunghyun Kim, Changho Suh, Sewoong Oh

As our main result, we characterize the minimax optimal sample size for top-K ranking.

Top-$K$ Ranking from Pairwise Comparisons: When Spectral Ranking is Optimal

no code implementations · 14 Mar 2016 · Minje Jang, Sunghyun Kim, Changho Suh, Sewoong Oh

First, in a general comparison model where the item pairs to compare are given a priori, we derive upper and lower bounds on the sample size required for reliable recovery of the top-$K$ ranked items.
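To give a concrete sense of the spectral approach named in the title, below is a minimal sketch of a Rank Centrality-style spectral ranking from pairwise comparison counts: build a Markov chain whose transitions favor items that win more often, and rank items by its stationary distribution. This is an illustrative assumption about the general technique, not the paper's exact algorithm; the `wins` data and the function name are hypothetical.

```python
import numpy as np

def spectral_ranking(wins, iters=1000):
    """Estimate item scores from pairwise comparisons (Rank Centrality-style sketch).

    wins[i][j] = number of times item i beat item j. Returns the stationary
    distribution of a Markov chain that moves from i to j in proportion to
    how often j beat i, so frequent winners accumulate probability mass.
    """
    wins = np.asarray(wins, dtype=float)
    n = wins.shape[0]
    totals = wins + wins.T                              # comparisons per pair
    d = max(1, int((totals > 0).sum(axis=1).max()))     # degree normalizer
    P = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j and totals[i, j] > 0:
                # transition i -> j proportional to j's empirical win fraction
                P[i, j] = wins[j, i] / totals[i, j] / d
        P[i, i] = 1.0 - P[i].sum()                      # make row stochastic
    pi = np.full(n, 1.0 / n)
    for _ in range(iters):                              # power iteration
        pi = pi @ P
    return pi

# toy data: item 0 wins most of its comparisons
wins = [[0, 8, 9],
        [2, 0, 7],
        [1, 3, 0]]
scores = spectral_ranking(wins)
top_k = np.argsort(scores)[::-1][:2]    # indices of the top-2 items
```

The top-$K$ set is then read off as the $K$ items with the largest stationary scores.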
