Search Results for author: Yuichiroh Matsubayashi

Found 15 papers, 2 papers with code

To Drop or Not to Drop? Predicting Argument Ellipsis Judgments: A Case Study in Japanese

no code implementations17 Apr 2024 Yukiko Ishizuki, Tatsuki Kuribayashi, Yuichiroh Matsubayashi, Ryohei Sasano, Kentaro Inui

Speakers sometimes omit certain arguments of a predicate in a sentence; such omission is especially frequent in pro-drop languages.

Language Modelling, Sentence
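As a rough illustration of how such ellipsis judgments could be framed, the sketch below casts the decision as binary classification over (context, argument) pairs; the toy romanized examples, features, and classifier are invented for illustration and are not the paper's dataset or model.

```python
# Illustrative only: argument ellipsis judgment framed as binary
# classification (1 = argument may be dropped, 0 = kept). The examples
# and features are toy placeholders, not the paper's data or model.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

examples = [
    ("watashi wa eiga o mita", "watashi", 1),    # 1st-person subject: often dropped
    ("kare ga hon o katta", "hon", 0),           # object carrying new info: kept
    ("kinou tomodachi ni atta", "watashi", 1),
    ("sensei ga gakusei o hometa", "gakusei", 0),
]
X = [f"{ctx} [ARG] {arg}" for ctx, arg, _ in examples]
y = [label for _, _, label in examples]

clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(X, y)
print(clf.predict(["boku wa gohan o tabeta [ARG] boku"]))
```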

Japanese-English Sentence Translation Exercises Dataset for Automatic Grading

no code implementations6 Mar 2024 Naoki Miura, Hiroaki Funayama, Seiya Kikuchi, Yuichiroh Matsubayashi, Yuya Iwase, Kentaro Inui

Using this dataset, we demonstrate the performance of baselines including fine-tuned BERT and GPT models with few-shot in-context learning.

Few-Shot Learning, In-Context Learning +2
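A minimal sketch of how few-shot in-context grading might be set up is shown below; the prompt template, rubric, and example translations are assumptions made for illustration, not the prompts or data from the paper.

```python
# Sketch of few-shot in-context grading of a translation exercise.
# The 0-3 rubric and the in-context examples are invented for illustration.
few_shot_examples = [
    {"source": "彼は毎朝走る。", "answer": "He runs every morning.", "score": 3},
    {"source": "彼は毎朝走る。", "answer": "He run every morning.", "score": 2},
    {"source": "彼は毎朝走る。", "answer": "She sleeps at night.", "score": 0},
]

def build_prompt(source_ja: str, student_answer: str) -> str:
    """Assemble a few-shot prompt asking an LLM to grade on a 0-3 scale."""
    lines = ["Grade each English translation of the Japanese sentence from 0 to 3."]
    for ex in few_shot_examples:
        lines.append(f"Japanese: {ex['source']}\nAnswer: {ex['answer']}\nScore: {ex['score']}")
    lines.append(f"Japanese: {source_ja}\nAnswer: {student_answer}\nScore:")
    return "\n\n".join(lines)

print(build_prompt("彼は毎朝走る。", "He is running every mornings."))
# The resulting string would then be sent to a GPT-style completion API.
```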

Balancing Cost and Quality: An Exploration of Human-in-the-loop Frameworks for Automated Short Answer Scoring

no code implementations16 Jun 2022 Hiroaki Funayama, Tasuku Sato, Yuichiroh Matsubayashi, Tomoya Mizumoto, Jun Suzuki, Kentaro Inui

Towards guaranteeing high-quality predictions, we present the first study of a human-in-the-loop framework that minimizes grading cost while maintaining grading quality by allowing a SAS model to share the grading task with a human grader.
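The core routing idea behind such a framework, accepting the model's score when it is confident and deferring to a human otherwise, can be sketched as below; the confidence threshold and example predictions are invented, and the paper's actual criterion for sharing work may differ.

```python
# Illustrative confidence-threshold routing for human-in-the-loop scoring.
# The threshold and the example predictions are placeholders.
from dataclasses import dataclass

@dataclass
class Prediction:
    answer_id: str
    score: int
    confidence: float  # model's confidence in its own score

def route(predictions, threshold=0.9):
    """Accept confident model scores; send the rest to a human grader."""
    auto, to_human = [], []
    for p in predictions:
        (auto if p.confidence >= threshold else to_human).append(p)
    return auto, to_human

preds = [Prediction("a1", 2, 0.97), Prediction("a2", 1, 0.55), Prediction("a3", 3, 0.92)]
auto, to_human = route(preds)
print(f"auto-graded: {len(auto)}, sent to human: {len(to_human)}")
```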

Pseudo Zero Pronoun Resolution Improves Zero Anaphora Resolution

1 code implementation EMNLP 2021 Ryuto Konno, Shun Kiyono, Yuichiroh Matsubayashi, Hiroki Ouchi, Kentaro Inui

Masked language models (MLMs) have contributed to drastic performance improvements with regard to zero anaphora resolution (ZAR).
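As a loose illustration of using a masked LM to recover an omitted argument, the snippet below queries a fill-mask pipeline; the multilingual model and the English example are stand-ins chosen to keep dependencies light, not the paper's pseudo zero pronoun resolution setup.

```python
# Rough illustration: let a masked LM guess a dropped subject.
# Model name and example sentence are placeholders, not the paper's setup.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-multilingual-cased")
# "[MASK] bought a book and read it." -- guess the omitted subject.
for cand in unmasker("[MASK] bought a book and read it yesterday."):
    print(cand["token_str"], round(cand["score"], 3))
```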

An Empirical Study of Contextual Data Augmentation for Japanese Zero Anaphora Resolution

no code implementations COLING 2020 Ryuto Konno, Yuichiroh Matsubayashi, Shun Kiyono, Hiroki Ouchi, Ryo Takahashi, Kentaro Inui

This study addresses two underexplored issues in CDA: how to reduce the computational cost of data augmentation and how to ensure the quality of the generated data.

Data Augmentation, Language Modelling +4
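A bare-bones sketch of contextual data augmentation, replacing a masked token with the MLM's top predictions, is given below; the model, confidence filter, and example sentence are placeholders and do not reflect the cost-reduction or quality-control methods proposed in the paper.

```python
# Minimal contextual data augmentation sketch: mask one token and keep the
# MLM's confident alternatives as augmented variants. All choices here
# (model, threshold, example) are illustrative assumptions.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-multilingual-cased")

def augment(sentence_with_mask: str, k: int = 3):
    """Return up to k augmented sentences, keeping only confident fills."""
    outs = unmasker(sentence_with_mask)
    return [o["sequence"] for o in outs[:k] if o["score"] > 0.05]

for s in augment("She [MASK] the letter to her friend."):
    print(s)
```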

Distance-Free Modeling of Multi-Predicate Interactions in End-to-End Japanese Predicate-Argument Structure Analysis

no code implementations COLING 2018 Yuichiroh Matsubayashi, Kentaro Inui

Capturing interactions among multiple predicate-argument structures (PASs) is a crucial issue in the task of analyzing PAS in Japanese.

Revisiting the Design Issues of Local Models for Japanese Predicate-Argument Structure Analysis

no code implementations IJCNLP 2017 Yuichiroh Matsubayashi, Kentaro Inui

The research trend in Japanese predicate-argument structure (PAS) analysis is shifting from pointwise prediction models with local features to global models designed to search for globally optimal solutions.

Modeling Context-sensitive Selectional Preference with Distributed Representations

no code implementations COLING 2016 Naoya Inoue, Yuichiroh Matsubayashi, Masayuki Ono, Naoaki Okazaki, Kentaro Inui

This paper proposes a novel problem setting of selectional preference (SP) between a predicate and its arguments, called context-sensitive SP (CSP).

Semantic Role Labeling
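As a toy illustration of scoring selectional preference with distributed representations, the sketch below measures cosine similarity between an argument vector and a predicate vector shifted by a context vector; the random embeddings and the additive conditioning are assumptions for illustration, not the paper's CSP model.

```python
# Toy selectional-preference scoring with distributed representations.
# Vectors are random stand-ins; the conditioning scheme is illustrative only.
import numpy as np

rng = np.random.default_rng(0)
emb = {w: rng.normal(size=50) for w in ["eat", "apple", "regret", "decision", "peel"]}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def csp_score(predicate, argument, context_words):
    """Context-sensitive SP: condition the predicate vector on its context."""
    ctx = np.mean([emb[w] for w in context_words], axis=0)
    return cosine(emb[predicate] + ctx, emb[argument])

print(csp_score("eat", "apple", ["peel"]))
print(csp_score("regret", "decision", ["peel"]))
```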
