no code implementations • LREC 2022 • Arda Akdemir, Yeojoo Jeon, Tetsuo Shibuya
We use the dataset to train a BERT-based language model, DPRK-BERT.
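The entry gives no pretraining details, but BERT-style language models are standardly trained with a masked-language-model objective (select ~15% of tokens; replace 80% of those with `[MASK]`, 10% with a random token, and leave 10% unchanged). A minimal sketch of that standard masking step — the vocabulary and token names here are illustrative, not from the paper — might look like:

```python
import random

def mask_tokens(tokens, mask_prob=0.15, vocab=None, seed=0):
    """BERT-style masking: ~15% of tokens are selected; of those,
    80% become [MASK], 10% become a random vocabulary token, and
    10% stay unchanged. Returns (masked_tokens, labels), where
    labels hold the original token at selected positions and None
    elsewhere (unselected positions contribute no loss)."""
    rng = random.Random(seed)
    vocab = vocab or ["the", "a", "of", "speech"]  # toy fallback vocabulary
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)
            r = rng.random()
            if r < 0.8:
                masked.append("[MASK]")
            elif r < 0.9:
                masked.append(rng.choice(vocab))
            else:
                masked.append(tok)
        else:
            labels.append(None)
            masked.append(tok)
    return masked, labels
```

The model is then trained to predict the original token at every position where the label is not `None`.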
1 code implementation • 12 Dec 2023 • Quentin Hillebrand, Vorapong Suppakitpaisarn, Tetsuo Shibuya
We suggest the use of hash functions to cut down the communication costs when counting subgraphs under edge local differential privacy.
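Under edge local differential privacy, a naive randomized-response report of a user's adjacency list costs one bit per possible neighbor, i.e. n bits for an n-vertex graph. One plausible way hashing cuts this cost — an illustrative sketch under assumed parameters, not the paper's actual construction — is to hash neighbor IDs into b ≪ n buckets and randomize only the b-bit bucket vector:

```python
import math
import random

def bucket(v, b):
    """Deterministic multiplicative hash into b buckets (illustrative)."""
    return (v * 2654435761) % b

def rr_bit(bit, eps, rng):
    """Warner randomized response: keep the bit w.p. e^eps / (e^eps + 1)."""
    p_keep = math.exp(eps) / (math.exp(eps) + 1.0)
    return bit if rng.random() < p_keep else 1 - bit

def report_neighbors_hashed(neighbors, b, eps, seed=0):
    """Hash each neighbor ID into one of b buckets, then apply
    randomized response to the b-bit bucket vector. The message
    length drops from n bits (one per possible neighbor) to b bits."""
    rng = random.Random(seed)
    buckets = [0] * b
    for v in neighbors:
        buckets[bucket(v, b)] = 1
    return [rr_bit(x, eps, rng) for x in buckets]
```

The aggregator can then estimate subgraph counts from the noisy bucket vectors, correcting for both the hash collisions and the randomized-response flipping probability.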
no code implementations • 1 Nov 2020 • Arda Akdemir, Tetsuo Shibuya
In addition, we propose combining transfer learning and multi-task learning to improve the performance of biomedical named entity recognition systems, a combination that, to the best of our knowledge, has not been applied in this setting before.
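The core idea of multi-task learning is a shared encoder whose parameters are updated by every task, with small task-specific output heads on top. A minimal forward-pass sketch — the dimensions, tag counts, and auxiliary task here are illustrative assumptions, not the paper's architecture — might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
d, h = 8, 16  # input-feature and hidden sizes (illustrative)

# Shared encoder parameters, reused (and jointly trained) by every task.
W_enc = rng.normal(size=(h, d)) * 0.1

# Task-specific heads: biomedical NER tags and a hypothetical auxiliary task.
W_ner = rng.normal(size=(5, h)) * 0.1   # e.g. 5 BIO entity tags
W_aux = rng.normal(size=(2, h)) * 0.1   # e.g. a binary auxiliary label

def encode(x):
    """Shared representation consumed by all task heads."""
    return np.tanh(W_enc @ x)

def ner_logits(x):
    return W_ner @ encode(x)

def aux_logits(x):
    return W_aux @ encode(x)

x = rng.normal(size=d)  # one token's feature vector
```

During training, losses from both heads backpropagate into `W_enc`, so labeled data for either task improves the shared representation; transfer learning additionally initializes `W_enc` from a model pretrained on a larger corpus.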
no code implementations • 25 Apr 2020 • Arda Akdemir, Tetsuo Shibuya, Tunga Güngör
In this study, we propose using subword contextual embeddings to capture the morphological information for languages with rich morphology.
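Subword tokenizers of the kind contextual models build on (e.g. WordPiece) segment a word greedily into the longest vocabulary pieces, so morphologically related forms share the same root piece. A sketch of that standard greedy segmentation, with a toy Turkish vocabulary chosen for illustration (not the paper's tokenizer), might look like:

```python
def wordpiece(word, vocab):
    """Greedy longest-match-first subword segmentation, as used by
    BERT-style WordPiece tokenizers; continuation pieces carry a
    '##' prefix. Returns ['[UNK]'] if the word cannot be covered."""
    pieces, start = [], 0
    while start < len(word):
        end, piece = len(word), None
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub
            if sub in vocab:
                piece = sub
                break
            end -= 1
        if piece is None:
            return ["[UNK]"]
        pieces.append(piece)
        start = end
    return pieces
```

For example, with a toy vocabulary {"ev", "##ler", "##de"}, the Turkish word "evlerde" ("in the houses") segments into the root plus plural and locative suffixes, so "ev", "evler", and "evlerde" all share the root piece "ev" and its embedding — the mechanism by which subword embeddings capture morphology.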