Search Results for author: Kang-Min Kim

Found 8 papers, 2 papers with code

Efficient Pre-training of Masked Language Model via Concept-based Curriculum Masking

1 code implementation • 15 Dec 2022 • Mingyu Lee, Jun-Hyung Park, Junho Kim, Kang-Min Kim, SangKeun Lee

Masked language modeling (MLM) has been widely used for pre-training effective bidirectional representations, but incurs substantial training costs.

Language Modelling • Masked Language Modeling
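Since the snippet above only names masked language modeling, a minimal sketch of the standard BERT-style masking step may help for orientation. This is the generic 80/10/10 corruption scheme, not the paper's concept-based curriculum masking; the mask probability and token IDs are illustrative assumptions.

```python
import random

def mask_tokens(token_ids, vocab_size, mask_id, mask_prob=0.15):
    """Generic BERT-style MLM corruption (illustrative, not the paper's method).

    Each position is selected with probability `mask_prob`; of the selected
    positions, 80% become [MASK], 10% become a random token, and 10% are
    left unchanged. Returns the corrupted inputs and labels (-100 = ignore).
    """
    inputs, labels = list(token_ids), []
    for i, tok in enumerate(token_ids):
        if random.random() < mask_prob:
            labels.append(tok)                        # predict the original token
            r = random.random()
            if r < 0.8:
                inputs[i] = mask_id                   # replace with [MASK]
            elif r < 0.9:
                inputs[i] = random.randrange(vocab_size)  # random replacement
            # else: keep the original token unchanged
        else:
            labels.append(-100)                       # position is not predicted
    return inputs, labels
```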

Adaptive Compression of Word Embeddings

no code implementations • ACL 2020 • Yeachan Kim, Kang-Min Kim, SangKeun Lee

However, unlike prior works that assign codes of the same length to all words, we adaptively assign codes of different lengths to each word while learning downstream tasks.

Self-Driving Cars • Word Embeddings
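To make the compression idea concrete: in code-based embedding compression, a word vector is reconstructed as a sum of entries drawn from shared codebooks, so a longer code (more codebook entries) yields a more precise vector. The sketch below illustrates variable-length codes under assumed shapes; it does not implement the paper's learned length-assignment mechanism.

```python
import numpy as np

def reconstruct_embedding(code, codebooks):
    """Reconstruct a word vector from a variable-length discrete code.

    `code` is a list of (codebook_index, entry_index) pairs whose length may
    differ per word. `codebooks` has shape (num_books, entries_per_book, dim).
    Illustrative sketch only; the codebook sizes are assumptions.
    """
    vec = np.zeros(codebooks.shape[-1])
    for book, entry in code:              # sum the selected codebook entries
        vec += codebooks[book, entry]
    return vec

# Example: a rare word gets a short 2-part code, a frequent word a 4-part code.
rng = np.random.default_rng(0)
books = rng.normal(size=(8, 256, 300))    # 8 codebooks, 256 entries, 300-dim
rare_vec = reconstruct_embedding([(0, 17), (1, 4)], books)
freq_vec = reconstruct_embedding([(0, 3), (1, 200), (2, 9), (3, 42)], books)
```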

Representation Learning for Unseen Words by Bridging Subwords to Semantic Networks

no code implementations • LREC 2020 • Yeachan Kim, Kang-Min Kim, SangKeun Lee

In the first stage, we learn subword embeddings from the pre-trained word embeddings by using an additive composition function of subwords.

Representation Learning • Word Embeddings
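The "additive composition function" in the snippet above can be read as fitting subword embeddings so that their sum approximates each pre-trained word vector. Below is a minimal least-squares sketch of that first stage; the subword inventory, learning rate, and epoch count are assumptions, not the paper's settings.

```python
import numpy as np

def fit_subword_embeddings(word_vecs, word_subwords, num_subwords, dim,
                           lr=0.1, epochs=50):
    """Fit subword embeddings so their sum approximates each word vector.

    `word_vecs` maps word -> pre-trained vector; `word_subwords` maps
    word -> list of subword indices. Plain gradient descent on squared error.
    """
    E = np.zeros((num_subwords, dim))
    for _ in range(epochs):
        for w, subs in word_subwords.items():
            pred = E[subs].sum(axis=0)          # additive composition
            grad = 2 * (pred - word_vecs[w])    # gradient of squared error
            E[subs] -= lr * grad / len(subs)
    return E

# Toy usage: two words sharing a hypothetical subword at index 2.
word_vecs = {"running": np.ones(4), "walking": -np.ones(4)}
word_subwords = {"running": [0, 2], "walking": [1, 2]}
E = fit_subword_embeddings(word_vecs, word_subwords, num_subwords=3, dim=4)
```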

Learning to Generate Word Representations using Subword Information

no code implementations • COLING 2018 • Yeachan Kim, Kang-Min Kim, Ji-Min Lee, SangKeun Lee

Unlike previous models that learn word representations from a large corpus, we take a set of pre-trained word embeddings and generalize it to new word entries, including out-of-vocabulary (OOV) words.

Chunking • Language Modelling • +5
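One common way to extend pre-trained embeddings to OOV words, and a useful baseline for the snippet above, is to compose a vector from the word's character n-grams (fastText-style). The sketch below averages known n-gram vectors; the n-gram range and lookup table are assumptions, and the paper's generator model is more elaborate than this additive fallback.

```python
import numpy as np

def char_ngrams(word, n_min=3, n_max=5):
    """Character n-grams with boundary markers, fastText-style.
    The n-gram range is an illustrative assumption."""
    w = f"<{word}>"
    return [w[i:i + n] for n in range(n_min, n_max + 1)
            for i in range(len(w) - n + 1)]

def oov_vector(word, subword_vecs):
    """Average the vectors of the word's known n-grams to obtain an
    embedding for an out-of-vocabulary word."""
    grams = [g for g in char_ngrams(word) if g in subword_vecs]
    if not grams:
        raise KeyError(f"no known subwords for {word!r}")
    return sum(subword_vecs[g] for g in grams) / len(grams)

# Toy usage with a hypothetical n-gram table.
table = {"<un": np.ones(4), "see": np.full(4, 0.5), "en>": -np.ones(4)}
vec = oov_vector("unseen", table)
```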
