Search Results for author: Kyungtae Lim

Found 16 papers, 5 papers with code

X-LLaVA: Optimizing Bilingual Large Vision-Language Alignment

no code implementations 18 Mar 2024 Dongjae Shin, HyeonSeok Lim, InHo Won, ChangSu Choi, Minjun Kim, Seungwoo Song, Hangyeol Yoo, Sangmin Kim, Kyungtae Lim

The impressive development of large language models (LLMs) is expanding into the realm of large multimodal models (LMMs), which incorporate multiple types of data beyond text.

BOK-VQA: Bilingual outside Knowledge-Based Visual Question Answering via Graph Representation Pretraining

no code implementations 12 Jan 2024 Minjun Kim, Seungwoo Song, Youhan Lee, Haneol Jang, Kyungtae Lim

Current research on generative models, such as the recently developed GPT-4, aims to find relevant knowledge for multimodal and multilingual inputs in order to provide answers.

Question Answering, Visual Question Answering

K-UniMorph: Korean Universal Morphology and its Feature Schema

1 code implementation 10 May 2023 Eunkyul Leah Jo, Kyuwon Kim, Xihan Wu, Kyungtae Lim, Jungyeul Park, Chulwoo Park

This dataset adopts the morphological feature schema of Sylak-Glassman et al. (2015) and Sylak-Glassman (2016) for Korean; we extract inflected verb forms from the Sejong morphologically analyzed corpus, one of the largest annotated corpora for Korean.
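
Purely as an illustration of the UniMorph-style data this entry describes (not the released K-UniMorph code), here is a minimal Python sketch for reading lemma / inflected-form / feature-bundle triples; the file name and the example feature tags in the comments are assumptions, not values from the paper.

```python
# Illustrative sketch: reading UniMorph-style triples.
# Each line in a UniMorph TSV is "lemma<TAB>inflected form<TAB>feature bundle",
# with individual features separated by ";" (e.g. "V;DECL;PST" -- hypothetical tags).
from collections import defaultdict

def load_unimorph(path):
    """Group inflected forms and their feature bundles by lemma."""
    paradigms = defaultdict(list)
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            lemma, form, feats = line.split("\t")
            paradigms[lemma].append((form, feats.split(";")))
    return paradigms

if __name__ == "__main__":
    # "kor.tsv" is a hypothetical file name used for illustration only.
    for lemma, forms in load_unimorph("kor.tsv").items():
        print(lemma, forms)
```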

Korean Named Entity Recognition Based on Language-Specific Features

no code implementations 10 May 2023 Yige Chen, Kyungtae Lim, Jungyeul Park

In this paper, we propose a novel way of improving named entity recognition for Korean using its language-specific features.

Named Entity Recognition

Yet Another Format of Universal Dependencies for Korean

1 code implementation COLING 2022 Yige Chen, Eunkyul Leah Jo, Yundong Yao, Kyungtae Lim, Miikka Silfverberg, Francis M. Tyers, Jungyeul Park

In this study, we propose a morpheme-based scheme for Korean dependency parsing and apply the proposed scheme to Universal Dependencies.

Dependency Parsing
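
As background for this entry, a minimal sketch (not the paper's code) of reading the standard CoNLL-U format in which Universal Dependencies treebanks, including morpheme-based Korean schemes, are distributed; the Korean treebank file name in the usage comment is only an example.

```python
# Illustrative sketch: reading the standard CoNLL-U format used by
# Universal Dependencies. Each token line has 10 tab-separated columns:
# ID, FORM, LEMMA, UPOS, XPOS, FEATS, HEAD, DEPREL, DEPS, MISC.

def read_conllu(path):
    """Yield sentences as lists of (id, form, head, deprel) tuples."""
    sentence = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if line.startswith("#"):              # sentence-level comments
                continue
            if not line:                          # blank line ends a sentence
                if sentence:
                    yield sentence
                    sentence = []
                continue
            cols = line.split("\t")
            if "-" in cols[0] or "." in cols[0]:  # skip multiword/empty tokens
                continue
            sentence.append((int(cols[0]), cols[1], int(cols[6]), cols[7]))
    if sentence:
        yield sentence

# Usage (hypothetical file name):
# for sent in read_conllu("ko_example-ud-dev.conllu"):
#     print(sent)
```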

SEx BiST: A Multi-Source Trainable Parser with Deep Contextualized Lexical Representations

1 code implementation CoNLL 2018 KyungTae Lim, Cheoneum Park, Changki Lee, Thierry Poibeau

We describe the SEx BiST parser (Semantically EXtended Bi-LSTM parser) developed at Lattice for the CoNLL 2018 Shared Task (Multilingual Parsing from Raw Text to Universal Dependencies).

Dependency Parsing, Event Extraction
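
To illustrate the kind of contextualized lexical representations a Bi-LSTM parser builds on, here is a minimal PyTorch sketch of a BiLSTM token encoder. This is not the SEx BiST implementation; the class name, dimensions, and example are invented for illustration.

```python
# Minimal illustrative sketch of a BiLSTM token encoder, the building block
# that BiLSTM-based dependency parsers use for contextualized representations.
import torch
import torch.nn as nn

class BiLSTMEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, num_layers=2,
                              batch_first=True, bidirectional=True)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -> (batch, seq_len, 2 * hidden_dim)
        embedded = self.embed(token_ids)
        contextual, _ = self.bilstm(embedded)
        return contextual

# Example: encode a batch of two padded sentences of length 5 (random ids).
encoder = BiLSTMEncoder(vocab_size=10000)
ids = torch.randint(1, 10000, (2, 5))
print(encoder(ids).shape)  # torch.Size([2, 5, 400])
```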

A System for Multilingual Dependency Parsing based on Bidirectional LSTM Feature Representations

no code implementations CoNLL 2017 KyungTae Lim, Thierry Poibeau

In this paper, we present our multilingual dependency parser developed for the CoNLL 2017 UD Shared Task on "Multilingual Parsing from Raw Text to Universal Dependencies".

Dependency Parsing, Multilingual Word Embeddings
