Search Results for author: Minseok Kim

Found 24 papers, 12 papers with code

NR-Surface: NextG-ready μW-reconfigurable mmWave Metasurface

no code implementations · 15 Mar 2024 · Minseok Kim, Namjo Ahn, Song Min Kim

NR-Surface incorporates (i) a new extremely low-power (14 kHz sampling) reconfiguration interface, the NarrowBand Packet Unit (NBPU), for synchronization and real-time reconfiguration, and (ii) a highly responsive, low-leakage metasurface designed for low-duty-cycle operation, by carefully leveraging the structure and periodicity of the beam management procedure in the NR standard.

CARBD-Ko: A Contextually Annotated Review Benchmark Dataset for Aspect-Level Sentiment Classification in Korean

no code implementations · 23 Feb 2024 · Dongjun Jang, Jean Seo, Sungjoo Byun, Taekyoung Kim, Minseok Kim, Hyopil Shin

To tackle these challenges, we introduce CARBD-Ko (a Contextually Annotated Review Benchmark Dataset for Aspect-Based Sentiment Classification in Korean), a benchmark dataset that incorporates aspects and dual-tagged polarities to distinguish between aspect-specific and aspect-agnostic sentiment classification.

Classification · Hallucination · +2
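
To make "dual-tagged polarities" concrete, here is a minimal hypothetical record in the spirit of the abstract; the field names, values, and example sentence are illustrative assumptions, not the dataset's actual schema.

```python
# Hypothetical CARBD-Ko-style record (schema is illustrative only).
# Each aspect carries two polarity tags: one for sentiment toward that
# specific aspect, and one for the aspect-agnostic (overall) sentiment.
review = {
    "text": "배터리는 오래가는데 화면이 어둡다.",  # "The battery lasts long, but the screen is dim."
    "aspects": [
        {"aspect": "battery", "aspect_polarity": "positive", "agnostic_polarity": "negative"},
        {"aspect": "screen",  "aspect_polarity": "negative", "agnostic_polarity": "negative"},
    ],
}
```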

DaG LLM ver 1.0: Pioneering Instruction-Tuned Language Modeling for Korean NLP

no code implementations · 23 Nov 2023 · Dongjun Jang, Sangah Lee, Sungjoo Byun, Jinwoong Kim, Jean Seo, Minseok Kim, Soyeon Kim, Chaeyoung Oh, Jaeyoon Kim, Hyemi Jo, Hyopil Shin

This paper presents the DaG LLM (David and Goliath Large Language Model), a language model specialized for Korean and fine-tuned through Instruction Tuning across 41 tasks within 13 distinct categories.

Language Modelling · Large Language Model

Sound Demixing Challenge 2023 Music Demixing Track Technical Report: TFC-TDF-UNet v3

1 code implementation · 15 Jun 2023 · Minseok Kim, Jun Hyung Lee, Soonyoung Jung

In this report, we present our award-winning solutions for the Music Demixing Track of the Sound Demixing Challenge 2023.

Music Source Separation

Debiasing Neighbor Aggregation for Graph Neural Network in Recommender Systems

no code implementations · 18 Aug 2022 · Minseok Kim, Jinoh Oh, Jaeyoung Do, Sungjin Lee

Graph neural networks (GNNs) have achieved remarkable success in recommender systems by representing users and items based on their historical interactions.

Recommendation Systems
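
For context, the neighbor aggregation this line of work analyzes looks roughly like the LightGCN-style update sketched below. This is a generic sketch of GNN aggregation in recommenders, not the paper's debiasing method.

```python
import torch

def aggregate(user_emb, item_emb, interactions, deg_u, deg_i):
    """One round of symmetric-normalized neighbor aggregation
    (LightGCN-style) over a list of (user, item) interactions."""
    new_user = torch.zeros_like(user_emb)
    new_item = torch.zeros_like(item_emb)
    for u, i in interactions:
        norm = 1.0 / ((deg_u[u] * deg_i[i]) ** 0.5)  # symmetric degree normalization
        new_user[u] += norm * item_emb[i]            # user pulls from interacted items
        new_item[i] += norm * user_emb[u]            # item pulls from interacting users
    return new_user, new_item
```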

Meta-Learning for Online Update of Recommender Systems

1 code implementation · 19 Mar 2022 · Minseok Kim, Hwanjun Song, Yooju Shin, Dongmin Park, Kijung Shin, Jae-Gil Lee

It features an adaptive learning rate for each parameter-interaction pair, inducing the recommender to quickly learn users' up-to-date interests.

Meta-Learning · Recommendation Systems
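
A minimal sketch of the core idea, with the step size for each parameter produced per interaction by a meta-network; the names, the meta-network's inputs, and its output shape are assumptions, not the paper's implementation.

```python
import torch

def meta_update(param, grad, meta_net, interaction_feat):
    """Take one update step for a recommender parameter, with the step
    size produced per (parameter, interaction) pair by a meta-network.
    Assumes `meta_net` outputs one value per element of `param`."""
    inp = torch.cat([grad.flatten(), interaction_feat])
    # softplus keeps the element-wise learning rates non-negative
    lr = torch.nn.functional.softplus(meta_net(inp)).reshape(grad.shape)
    return param - lr * grad  # larger steps where fresh interest signals appear
```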

Task-Agnostic Undesirable Feature Deactivation Using Out-of-Distribution Data

1 code implementation · NeurIPS 2021 · Dongmin Park, Hwanjun Song, Minseok Kim, Jae-Gil Lee

A deep neural network (DNN) has achieved great success in many machine learning tasks by virtue of its high expressive power.

KUIELab-MDX-Net: A Two-Stream Neural Network for Music Demixing

1 code implementation · 24 Nov 2021 · Minseok Kim, Woosung Choi, Jaehwa Chung, Daewon Lee, Soonyoung Jung

This paper proposes a two-stream neural network for music demixing, called KUIELab-MDX-Net, which shows a good balance of performance and required resources.

Music Source Separation · Vocal Bursts Valence Prediction
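
One plausible reading of "two-stream" is blending a spectrogram-domain estimate with a waveform-domain estimate. The sketch below shows only that generic blending step, with the weight as an assumed tunable; it is not the paper's exact pipeline.

```python
import numpy as np

def blend(spec_stream_out: np.ndarray, wave_stream_out: np.ndarray, w: float = 0.5):
    """Blend per-source waveform estimates from two model streams.
    Both inputs: (sources, channels, samples); w is tuned on validation data."""
    assert spec_stream_out.shape == wave_stream_out.shape
    return w * spec_stream_out + (1.0 - w) * wave_stream_out
```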

Deep Learning Based Resource Assignment for Wireless Networks

no code implementations · 27 Sep 2021 · Minseok Kim, Hoon Lee, Hongju Lee, Inkyu Lee

This paper studies a deep learning approach for binary assignment problems in wireless networks, which identifies binary variables for permutation matrices.
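
A standard trick for getting permutation-matrix outputs from a network is Sinkhorn normalization of a score matrix. This is a common relaxation shown here for illustration, not necessarily the method used in the paper.

```python
import torch

def sinkhorn(scores: torch.Tensor, n_iters: int = 20, tau: float = 0.1):
    """Relax an N x N score matrix toward a doubly stochastic matrix
    (a soft permutation) by alternating row/column normalization in log space."""
    log_p = scores / tau
    for _ in range(n_iters):
        log_p = log_p - torch.logsumexp(log_p, dim=1, keepdim=True)  # rows sum to 1
        log_p = log_p - torch.logsumexp(log_p, dim=0, keepdim=True)  # cols sum to 1
    return log_p.exp()  # round (e.g., Hungarian matching) for a hard assignment at test time
```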

Music Demixing Challenge 2021

1 code implementation · 31 Aug 2021 · Yuki Mitsufuji, Giorgio Fabbro, Stefan Uhlich, Fabian-Robert Stöter, Alexandre Défossez, Minseok Kim, Woosung Choi, Chin-Yun Yu, Kin-Wai Cheuk

The main differences from past challenges are that 1) the competition is designed to make it easier for machine learning practitioners from other disciplines to participate; 2) evaluation is done on a hidden test set created by music professionals exclusively for the challenge to ensure its transparency, i.e., the test set is not accessible to anyone except the challenge organizers; and 3) the dataset provides a wider range of music genres and involved a greater number of mixing engineers.

Music Source Separation

AMSS-Net: Audio Manipulation on User-Specified Sources with Textual Queries

1 code implementation · 28 Apr 2021 · Woosung Choi, Minseok Kim, Marco A. Martínez Ramírez, Jaehwa Chung, Soonyoung Jung

This paper proposes a neural network that applies audio transformations to user-specified sources (e.g., vocals) of a given audio track according to a given description, while preserving other sources not mentioned in the description.

Robust Learning by Self-Transition for Handling Noisy Labels

no code implementations · 8 Dec 2020 · Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee

In the seeding phase, the network is updated using all the samples to collect a seed of clean samples.

MORPH
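
A common way to "collect a seed of clean samples," as the snippet describes, is small-loss selection after warming up on all samples: deep networks tend to fit clean labels before noisy ones. The sketch below illustrates that heuristic under assumptions; it is not MORPH's exact procedure.

```python
import numpy as np

def collect_clean_seed(losses: np.ndarray, noise_rate: float):
    """Pick the indices with the smallest per-sample loss as the clean seed.
    `losses` holds each sample's loss after warm-up training on all samples;
    `noise_rate` is an assumed estimate of the label-noise fraction."""
    n_keep = int(len(losses) * (1.0 - noise_rate))
    return np.argsort(losses)[:n_keep]
```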

LaSAFT: Latent Source Attentive Frequency Transformation for Conditioned Source Separation

1 code implementation · 22 Oct 2020 · Woosung Choi, Minseok Kim, Jaehwa Chung, Soonyoung Jung

Recent deep-learning approaches have shown that Frequency Transformation (FT) blocks can significantly improve spectrogram-based single-source separation models by capturing frequency patterns.

Music Source Separation
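
An FT block is, at its core, a fully connected transformation applied along the frequency axis of the spectrogram, so each output bin can draw on patterns anywhere in the spectrum (e.g., harmonics). Below is a minimal sketch of such a block; the bottleneck factor and dimensions are assumptions.

```python
import torch.nn as nn

class FreqTransform(nn.Module):
    """Frequency Transformation block: a bottlenecked fully connected
    layer applied along the frequency axis of a spectrogram tensor."""
    def __init__(self, n_freq_bins: int, bottleneck: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_freq_bins, n_freq_bins // bottleneck), nn.ReLU(),
            nn.Linear(n_freq_bins // bottleneck, n_freq_bins),
        )

    def forward(self, x):    # x: (batch, channels, time, freq)
        return self.net(x)   # nn.Linear acts on the last (frequency) axis
```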

Learning from Noisy Labels with Deep Neural Networks: A Survey

1 code implementation · 16 Jul 2020 · Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee

Deep learning has achieved remarkable success in numerous domains, aided by large amounts of data.

How does Early Stopping Help Generalization against Label Noise?

no code implementations · 19 Nov 2019 · Hwanjun Song, Minseok Kim, Dongmin Park, Jae-Gil Lee

In this paper, we claim that such overfitting can be avoided by "early stopping" training a deep neural network before the noisy labels are severely memorized.
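
The mechanism itself is simple to sketch: snapshot the network where held-out accuracy peaks and stop before memorization of noisy labels sets in. A generic PyTorch-style sketch, not the paper's exact criterion:

```python
def train_with_early_stopping(model, train_one_epoch, evaluate, max_epochs=100, patience=5):
    """Stop training when held-out accuracy stops improving, i.e.,
    before the network starts to memorize the noisy labels."""
    best_acc, best_state, wait = -1.0, None, 0
    for epoch in range(max_epochs):
        train_one_epoch(model)
        acc = evaluate(model)  # accuracy on a held-out set
        if acc > best_acc:
            # clone so later updates don't overwrite the snapshot
            best_state = {k: v.clone() for k, v in model.state_dict().items()}
            best_acc, wait = acc, 0
        else:
            wait += 1
            if wait >= patience:          # memorization has likely begun
                break
    model.load_state_dict(best_state)     # roll back to the best snapshot
    return model
```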

Carpe Diem, Seize the Samples Uncertain "At the Moment" for Adaptive Batch Selection

no code implementations · 19 Nov 2019 · Hwanjun Song, Minseok Kim, Sundong Kim, Jae-Gil Lee

Compared with existing batch selection methods, Recency Bias reduced the test error by up to 20.97% in a fixed wall-clock training time.
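
A hedged sketch of the sampling idea suggested by the title: weight each sample by how uncertain the model's recent predictions on it have been, using only a sliding window so stale history is forgotten. The details (window size, temperature) are assumptions.

```python
import numpy as np

def sampling_probs(recent_preds: list, n_classes: int, temperature: float = 1.0):
    """recent_preds[i] is a sliding window of the model's recent predicted
    labels for sample i; samples with mixed (uncertain) recent predictions
    get a higher chance of entering the next training batch."""
    scores = []
    for window in recent_preds:
        freqs = np.bincount(window, minlength=n_classes) / len(window)
        entropy = -(freqs[freqs > 0] * np.log(freqs[freqs > 0])).sum()
        scores.append(entropy / temperature)
    probs = np.exp(scores)
    return probs / probs.sum()
```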

Prestopping: How Does Early Stopping Help Generalization Against Label Noise?

no code implementations · 25 Sep 2019 · Hwanjun Song, Minseok Kim, Dongmin Park, Jae-Gil Lee

In this paper, we claim that such overfitting can be avoided by "early stopping" training a deep neural network before the noisy labels are severely memorized.

SELFIE: Refurbishing Unclean Samples for Robust Deep Learning

1 code implementation · 15 Jun 2019 · Hwanjun Song, Minseok Kim, Jae-Gil Lee

Owing to their extremely high expressive power, deep neural networks have the side effect of totally memorizing training data, even when the labels are extremely noisy.

Learning with noisy labels
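
The "refurbishing" idea can be sketched as: if a sample's recent predictions are sufficiently consistent, replace its (possibly noisy) label with that consistent prediction and reuse the sample. A simplified sketch under assumptions, not the full SELFIE algorithm:

```python
import numpy as np

def refurbish(labels: np.ndarray, history: list, consistency: float = 0.8):
    """Replace the label of any sample whose recent predicted labels agree
    at least `consistency` of the time; others keep their original labels."""
    refurbished = labels.copy()
    for i, window in enumerate(history):        # window: recent predicted labels
        vals, counts = np.unique(window, return_counts=True)
        top, share = vals[counts.argmax()], counts.max() / len(window)
        if share >= consistency:
            refurbished[i] = top                # trusted, self-corrected label
    return refurbished
```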

Training IBM Watson using Automatically Generated Question-Answer Pairs

no code implementations · 12 Nov 2016 · Jangho Lee, Gyuwan Kim, Jaeyoon Yoo, Changwoo Jung, Minseok Kim, Sungroh Yoon

Under the assumption that such an automatically generated dataset could relieve the burden of manual question-answer generation, we used it to train an instance of Watson and evaluated the training efficiency and accuracy.

Answer Generation · Question-Answer-Generation · +1
