no code implementations • SemEval (NAACL) 2022 • Gaku Morio, Hiroaki Ozaki, Atsuki Yamaguchi, Yasuhiro Sogawa
This task requires parsing opinions while accounting for both structure- and context-dependent subjective aspects, which sets it apart from typical dependency parsing.
no code implementations • SemEval (NAACL) 2022 • Atsuki Yamaguchi, Gaku Morio, Hiroaki Ozaki, Yasuhiro Sogawa
In this paper, we describe our system for SemEval-2022 Task 2: Multilingual Idiomaticity Detection and Sentence Embedding.
1 code implementation • 2 Oct 2023 • Atsuki Yamaguchi, Terufumi Morishita
We present appjsonify, a Python-based PDF-to-JSON conversion toolkit for academic papers.
1 code implementation • 11 Aug 2023 • Terufumi Morishita, Gaku Morio, Atsuki Yamaguchi, Yasuhiro Sogawa
We rethink this and adopt a well-grounded set of deduction rules based on formal logic theory, which, when combined in multiple steps, can derive any other deduction rule.
1 code implementation • 16 Jun 2023 • Takuro Fujii, Koki Shibata, Atsuki Yamaguchi, Terufumi Morishita, Yasuhiro Sogawa
This paper investigates the effect of tokenizers on the downstream performance of pretrained language models (PLMs) in scriptio continua languages where no explicit spaces exist between words, using Japanese as a case study.
1 code implementation • 18 May 2023 • Atsuki Yamaguchi, Hiroaki Ozaki, Terufumi Morishita, Gaku Morio, Yasuhiro Sogawa
Masked language modeling (MLM) is a widely used self-supervised pretraining objective, in which a model must predict the original token that has been replaced with a mask, given its surrounding context.
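The masking step of the MLM objective can be illustrated with a minimal sketch; the whitespace tokenization, the `mask_tokens` helper, and the 15% default rate are simplifications for illustration, not the paper's implementation:

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Randomly replace a fraction of tokens with a mask token.

    Returns the masked sequence plus a {position: original token} map --
    the targets a model trained with the MLM objective must recover.
    """
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append(mask_token)
            targets[i] = tok  # model is trained to predict this from context
        else:
            masked.append(tok)
    return masked, targets

tokens = "the cat sat on the mat".split()
masked, targets = mask_tokens(tokens, mask_rate=0.3)
```

During pretraining, the model sees only `masked` and is scored on how well it predicts the tokens in `targets` from the unmasked context.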
no code implementations • 3 Mar 2023 • Yuta Koreeda, Ken-ichi Yokote, Hiroaki Ozaki, Atsuki Yamaguchi, Masaya Tsunokake, Yasuhiro Sogawa
Given the multilingual, multi-task nature of the task and its low-resource setting, we investigated different cross-lingual and multi-task strategies for training the pretrained language models.
no code implementations • 6 Dec 2021 • Atsuki Yamaguchi, Gaku Morio, Hiroaki Ozaki, Ken-ichi Yokote, Kenji Nagamatsu
This paper introduces the proposed automatic minuting system of the Hitachi team for the First Shared Task on Automatic Minuting (AutoMin-2021).
1 code implementation • EMNLP 2021 • Atsuki Yamaguchi, George Chrysostomou, Katerina Margatina, Nikolaos Aletras
Masked language modeling (MLM), a self-supervised pretraining objective, is widely used in natural language processing for learning text representations.
1 code implementation • EACL 2021 • Atsuki Yamaguchi, Kosui Iwasa, Katsuhide Fujita
With the success of goal-oriented negotiation dialogue systems, research on negotiation dialogue has gained momentum, both in human-human negotiation support and in dialogue systems.