Search Results for author: Ji-Hoon Kim

Found 19 papers, 7 papers with code

Let There Be Sound: Reconstructing High Quality Speech from Silent Videos

no code implementations · 29 Aug 2023 · Ji-Hoon Kim, Jaehun Kim, Joon Son Chung

In this paper, we propose a novel lip-to-speech system that significantly improves the generation quality by alleviating the one-to-many mapping problem from multiple perspectives.

CrossSpeech: Speaker-independent Acoustic Representation for Cross-lingual Speech Synthesis

no code implementations · 28 Feb 2023 · Ji-Hoon Kim, Hong-Sun Yang, Yoon-Cheol Ju, Il-Hwan Kim, Byeong-Yeol Kim

In this paper, we propose CrossSpeech, which improves the quality of cross-lingual speech by effectively disentangling speaker and language information at the level of the acoustic feature space.

Speech Synthesis

Relation-Aware Language-Graph Transformer for Question Answering

1 code implementation · 2 Dec 2022 · Jinyoung Park, Hyeong Kyu Choi, Juyeon Ko, Hyeonjin Park, Ji-Hoon Kim, Jisu Jeong, KyungMin Kim, Hyunwoo J. Kim

To address these issues, we propose Question Answering Transformer (QAT), which is designed to jointly reason over language and graphs with respect to entity relations in a unified manner.

Question Answering

Accelerating Large-Scale Graph-based Nearest Neighbor Search on a Computational Storage Platform

no code implementations · 12 Jul 2022 · Ji-Hoon Kim, Yeo-Reum Park, Jaeyoung Do, Soo-Young Ji, Joo-Young Kim

In this paper, we propose a computational storage platform that can accelerate a large-scale graph-based nearest neighbor search algorithm based on SmartSSD CSD.
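The paper targets the graph-based family of approximate nearest neighbor algorithms. As a minimal sketch of what such an accelerator speeds up (not the paper's SmartSSD implementation), the following builds a brute-force k-NN proximity graph and runs a greedy best-first walk over it; real systems use HNSW/NSG-style graphs and hardware offload:

```python
import numpy as np

def build_knn_graph(points, k):
    # brute-force k-nearest-neighbor graph; production systems build
    # HNSW/NSG-style graphs instead of this O(n^2) construction
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]

def greedy_search(points, graph, query, start=0):
    # walk to whichever neighbor is closest to the query; stop at a
    # local optimum (no neighbor improves on the current node)
    cur = start
    while True:
        cand = min(graph[cur], key=lambda j: np.linalg.norm(points[j] - query))
        if np.linalg.norm(points[cand] - query) < np.linalg.norm(points[cur] - query):
            cur = cand
        else:
            return cur

rng = np.random.default_rng(1)
pts = rng.normal(size=(200, 4))
g = build_knn_graph(pts, k=8)
q = rng.normal(size=4)
print(greedy_search(pts, g, q))
```

Greedy search returns a local optimum of the graph walk, which is why practical systems add beam search and better graph construction; the storage-side bottleneck this paper addresses is fetching the neighbor lists and vectors during that walk.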

Two-Step Question Retrieval for Open-Domain QA

1 code implementation · Findings (ACL) 2022 · Yeon Seonwoo, Juhee Son, Jiho Jin, Sang-Woo Lee, Ji-Hoon Kim, Jung-Woo Ha, Alice Oh

These models have shown a significant increase in inference speed, but at the cost of lower QA performance compared to the retriever-reader models.

Retrieval · Vocal Bursts Valence Prediction

Demystifying the Neural Tangent Kernel from a Practical Perspective: Can it be trusted for Neural Architecture Search without training?

1 code implementation · CVPR 2022 · Jisoo Mok, Byunggook Na, Ji-Hoon Kim, Dongyoon Han, Sungroh Yoon

To take such non-linear characteristics into account, we introduce Label-Gradient Alignment (LGA), a novel NTK-based metric whose inherent formulation allows it to capture the substantial non-linear advantage present in modern neural architectures.
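LGA belongs to the family of metrics that score an architecture by how well a gradient-derived kernel aligns with the labels. As a hedged sketch (this is standard kernel-target alignment, not the paper's exact LGA formulation), the following computes the empirical NTK of a linear model, which is simply the data Gram matrix, and its Frobenius alignment with the label Gram matrix:

```python
import numpy as np

def kernel_target_alignment(K, y):
    # Frobenius alignment <K, yy^T> / (||K||_F * ||yy^T||_F), in [-1, 1]
    Y = np.outer(y, y)
    return float((K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y)))

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 5))
y = np.sign(rng.normal(size=8))
# for a linear model f(x) = w . x, the per-example gradient is x itself,
# so the empirical NTK reduces to the Gram matrix X X^T
K = X @ X.T
print(round(kernel_target_alignment(K, y), 4))
```

A higher alignment suggests the kernel's geometry matches the label structure; training-free NAS metrics of this kind rank architectures without running full training.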

Neural Architecture Search

VoiceMixer: Adversarial Voice Style Mixup

no code implementations · NeurIPS 2021 · Sang-Hoon Lee, Ji-Hoon Kim, Hyunseung Chung, Seong-Whan Lee

This insufficiency leads to the converted speech retaining the source speaker's style or losing the source speech content.

Disentanglement · Voice Conversion

SUMNAS: Supernet with Unbiased Meta-Features for Neural Architecture Search

no code implementations · ICLR 2022 · Hyeonmin Ha, Ji-Hoon Kim, Semin Park, Byung-Gon Chun

We propose Supernet with Unbiased Meta-Features for Neural Architecture Search (SUMNAS), a supernet learning strategy based on meta-learning to tackle the knowledge forgetting issue.
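The meta-learning framing here treats each sampled sub-architecture as a task and updates the shared supernet weights so that no task's knowledge is overwritten. As a minimal, hedged illustration of that pattern (a Reptile-style meta-update on toy quadratic "tasks", not SUMNAS itself), the shared weights are adapted per task and then moved toward the average of the adapted copies:

```python
import numpy as np

def reptile_step(w, tasks, inner_lr=0.1, outer_lr=0.5, inner_steps=3):
    # one meta-update: adapt the shared weights to each task (a sampled
    # sub-architecture would play the task role in a supernet), then move
    # the shared weights toward the average of the adapted copies
    adapted = []
    for grad_fn in tasks:
        wi = w.copy()
        for _ in range(inner_steps):
            wi -= inner_lr * grad_fn(wi)
        adapted.append(wi)
    return w + outer_lr * (np.mean(adapted, axis=0) - w)

# two toy "tasks": quadratic losses with different optima
targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
tasks = [lambda w, t=t: 2 * (w - t) for t in targets]
w = np.zeros(2)
for _ in range(50):
    w = reptile_step(w, tasks)
print(np.round(w, 2))  # converges to a compromise between both optima
```

Averaging the adapted copies keeps the shared weights useful for every task rather than drifting toward whichever task was trained last, which is the forgetting failure mode the paper targets.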

Meta-Learning Neural Architecture Search

GC-TTS: Few-shot Speaker Adaptation with Geometric Constraints

no code implementations · 16 Aug 2021 · Ji-Hoon Kim, Sang-Hoon Lee, Ji-Hyun Lee, Hong-Gyu Jung, Seong-Whan Lee

While numerous attempts have been made at few-shot speaker adaptation, a gap in similarity to the target speaker remains, depending on the amount of data.

Weakly Supervised Pre-Training for Multi-Hop Retriever

1 code implementation · Findings (ACL) 2021 · Yeon Seonwoo, Sang-Woo Lee, Ji-Hoon Kim, Jung-Woo Ha, Alice Oh

In multi-hop QA, answering complex questions entails iterative document retrieval for finding the missing entity of the question.
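The iterative-retrieval loop described here can be sketched in miniature: retrieve a document, expand the query with what it contains, and retrieve again to find the entity the question alone cannot reach. This toy word-overlap retriever is an assumption-laden stand-in for the paper's trained dense retriever (document texts and the scoring function are illustrative only):

```python
def iterative_retrieval(question, docs, hops=2):
    # toy multi-hop retriever: score docs by word overlap with the query,
    # then fold the best doc's words into the query and retrieve again
    query = set(question.lower().split())
    picked = []
    for _ in range(hops):
        scores = {i: len(query & set(d.lower().split()))
                  for i, d in enumerate(docs) if i not in picked}
        best = max(scores, key=scores.get)
        picked.append(best)
        query |= set(docs[best].lower().split())
    return picked

docs = ["Alice Oh works at KAIST", "KAIST is in Daejeon", "Paris is in France"]
print(iterative_retrieval("Where does Alice Oh work?", docs))  # -> [0, 1]
```

The second hop only succeeds because the first hop injected "KAIST" into the query; that bridging entity is exactly what the paper's weakly supervised pre-training teaches the retriever to find.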


Fre-GAN: Adversarial Frequency-consistent Audio Synthesis

2 code implementations · 4 Jun 2021 · Ji-Hoon Kim, Sang-Hoon Lee, Ji-Hyun Lee, Seong-Whan Lee

Although recent works on neural vocoder have improved the quality of synthesized audio, there still exists a gap between generated and ground-truth audio in frequency space.
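The frequency-space gap the abstract mentions is commonly quantified with an STFT-based spectral distance between generated and ground-truth audio. The following is a minimal sketch of such a measure (a plain log-magnitude L1 distance; Fre-GAN's actual discriminators and losses are more involved):

```python
import numpy as np

def log_magnitude_distance(ref, gen, n_fft=256, hop=64):
    # frame both signals, take |STFT|, compare log-magnitudes with L1
    def stft_mag(x):
        frames = [x[i:i + n_fft] * np.hanning(n_fft)
                  for i in range(0, len(x) - n_fft, hop)]
        return np.abs(np.fft.rfft(np.array(frames), axis=1))
    a, b = stft_mag(ref), stft_mag(gen)
    return float(np.mean(np.abs(np.log1p(a) - np.log1p(b))))

t = np.linspace(0, 1, 8000)
clean = np.sin(2 * np.pi * 440 * t)
noisy = clean + 0.1 * np.random.default_rng(2).normal(size=t.size)
print(log_magnitude_distance(clean, clean))  # identical signals -> 0.0
print(log_magnitude_distance(clean, noisy))  # noise widens the gap
```

A vocoder that closes the gap in frequency space drives this kind of distance toward zero across all bands, including the high frequencies where artifacts are most audible.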

Faster and Smarter AutoAugment: Augmentation Policy Search Based on Dynamic Data-Clustering

no code implementations · 1 Jan 2021 · Jonghyun Bae, Ji-Hoon Kim

Data augmentation tuned to datasets and tasks has had great success in various AI applications, such as computer vision, natural language processing, autonomous driving, and bioinformatics.

Autonomous Driving · Clustering +1

Context-Aware Answer Extraction in Question Answering

1 code implementation · EMNLP 2020 · Yeon Seonwoo, Ji-Hoon Kim, Jung-Woo Ha, Alice Oh

With experiments on reading comprehension, we show that BLANC outperforms the state-of-the-art QA models, and the performance gap increases as the number of answer text occurrences increases.

Multi-Task Learning · Question Answering +1

HyperTendril: Visual Analytics for User-Driven Hyperparameter Optimization of Deep Neural Networks

no code implementations · 4 Sep 2020 · Heungseok Park, Yoonsoo Nam, Ji-Hoon Kim, Jaegul Choo

HyperTendril takes a novel approach that effectively steers hyperparameter optimization through an iterative, interactive tuning procedure, allowing users to refine the search spaces and the configuration of the AutoML method based on their own insights from given results.

Hyperparameter Optimization

VaB-AL: Incorporating Class Imbalance and Difficulty with Variational Bayes for Active Learning

no code implementations · CVPR 2021 · Jongwon Choi, Kwang Moo Yi, Ji-Hoon Kim, Jinho Choo, Byoungjip Kim, Jin-Yeop Chang, Youngjune Gwon, Hyung Jin Chang

We show that our method can be applied to classification tasks on multiple different datasets -- including one that is a real-world dataset with heavy data imbalance -- significantly outperforming the state of the art.

Active Learning

The Impact of Automatic Pre-annotation in Clinical Note Data Element Extraction - the CLEAN Tool

no code implementations · 11 Aug 2018 · Tsung-Ting Kuo, Jina Huh, Ji-Hoon Kim, Robert El-Kareh, Siddharth Singh, Stephanie Feudjio Feupe, Vincent Kuri, Gordon Lin, Michele E. Day, Lucila Ohno-Machado, Chun-Nan Hsu

Our study introduces CLEAN (CLinical note rEview and ANnotation), a pre-annotation-based cNLP annotation system to improve clinical note annotation of data elements, and comprehensively compares CLEAN with the widely-used annotation system Brat Rapid Annotation Tool (BRAT).

Open-Ended Question Answering
