no code implementations • LREC 2022 • Arda Akdemir, Yeojoo Jeon, Tetsuo Shibuya
We use the dataset to train a BERT-based language model, DPRK-BERT.
no code implementations • 1 Dec 2021 • Arda Akdemir, Yeojoo Jeon
We achieve this by compiling the first unlabeled corpus for the DPRK language and fine-tuning a preexisting ROK language model.
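Fine-tuning a BERT-style model on an unlabeled corpus, as DPRK-BERT is described above, is typically done with the masked-language-modeling objective. As a minimal sketch (not the authors' actual pipeline), the standard BERT corruption scheme selects about 15% of token positions; of those, 80% are replaced by `[MASK]`, 10% by a random vocabulary token, and 10% are left unchanged:

```python
import random

def mlm_mask(tokens, mask_token="[MASK]", vocab=None, mask_prob=0.15, seed=0):
    """BERT-style masked-LM corruption of a token sequence.

    Returns (corrupted, labels): labels holds the original token at
    each selected position and None elsewhere, so the loss is computed
    only on the selected positions.
    """
    rng = random.Random(seed)
    vocab = vocab or tokens  # fallback: sample replacements from the sequence itself
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)          # model must recover this token
            r = rng.random()
            if r < 0.8:
                corrupted.append(mask_token)   # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted.append(rng.choice(vocab))  # 10%: random token
            else:
                corrupted.append(tok)   # 10%: keep the original token
        else:
            labels.append(None)
            corrupted.append(tok)
    return corrupted, labels
```

In practice the corrupted sequence and labels would feed a transformer trained with cross-entropy on the selected positions; the 80/10/10 split is the scheme from the original BERT paper, not something specific to this work.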
no code implementations • 1 Nov 2020 • Arda Akdemir, Tetsuo Shibuya
In addition, we propose combining transfer learning and multi-task learning to improve the performance of biomedical named entity recognition systems, a combination which, to the best of our knowledge, has not been applied to this task before.
no code implementations • 1 Aug 2020 • Ali Hürriyetoğlu, Erdem Yörük, Deniz Yüret, Çağrı Yoltar, Burak Gürel, Fırat Duruşan, Osman Mutlu, Arda Akdemir
We present an overview of the CLEF-2019 Lab ProtestNews on Extracting Protests from News in the context of generalizable natural language processing.
no code implementations • ACL 2020 • Arda Akdemir
Numerous recent studies have shown that deep neural network based machine learning models perform poorly on unseen or out-of-domain examples.
no code implementations • 25 Apr 2020 • Arda Akdemir, Tetsuo Shibuya, Tunga Güngör
In this study, we propose using subword contextual embeddings to capture the morphological information for languages with rich morphology.
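Subword contextual embeddings rest on segmenting each word into smaller vocabulary units, so that a morphologically rich word is represented by pieces that recur across the corpus. As an illustrative sketch (a toy vocabulary, not the tokenizer used in the paper), the greedy longest-match-first scheme of BERT's WordPiece tokenizer marks non-initial pieces with a `##` continuation prefix:

```python
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    """Greedy longest-match-first subword segmentation (WordPiece-style).

    Repeatedly takes the longest prefix of the remaining string that is
    in the vocabulary; non-initial pieces are looked up with a '##'
    continuation prefix. Falls back to the unknown token if the word
    cannot be fully segmented.
    """
    pieces, start = [], 0
    while start < len(word):
        end, piece = len(word), None
        while end > start:
            cand = word[start:end]
            if start > 0:
                cand = "##" + cand
            if cand in vocab:
                piece = cand
                break
            end -= 1
        if piece is None:
            return [unk]  # no piece matched: emit the unknown token
        pieces.append(piece)
        start = end
    return pieces

# Hypothetical vocabulary illustrating Turkish-like agglutination:
# "evlerimizde" ("in our houses") splits into stem + suffix pieces.
vocab = {"ev", "##ler", "##i", "##mizde"}
print(wordpiece_tokenize("evlerimizde", vocab))
# → ['ev', '##ler', '##i', '##mizde']
```

Each piece then receives a contextual embedding from the encoder, which is how subword models capture morphological information without a word-level vocabulary explosion.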