Search Results for author: Shun Kiyono

Found 21 papers, 9 papers with code

Mixture of Expert/Imitator Networks: Scalable Semi-supervised Learning Framework

no code implementations 13 Oct 2018 Shun Kiyono, Jun Suzuki, Kentaro Inui

We also demonstrate that our method exhibits a "more data, better performance" property, scaling promisingly with the amount of unlabeled data.

General Classification Text Classification +1

Effective Adversarial Regularization for Neural Machine Translation

1 code implementation ACL 2019 Motoki Sato, Jun Suzuki, Shun Kiyono

A regularization technique based on adversarial perturbation, initially developed in the field of image processing, has been successfully applied to text classification tasks and has yielded notable improvements (a minimal sketch of the technique follows this entry).

Machine Translation NMT +3
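
For context, the sketch below shows embedding-level adversarial regularization of the kind this paper applies to NMT, assuming a hypothetical model.loss_from_embeddings helper; it illustrates the general technique, not the authors' exact implementation.

    import torch

    def adversarial_training_loss(model, embeds, labels, epsilon=1.0):
        # Treat the (detached) embeddings as the input and track their gradient.
        embeds = embeds.detach().requires_grad_(True)
        clean_loss = model.loss_from_embeddings(embeds, labels)  # assumed helper
        # Keep the graph so the clean loss can still be backpropagated later.
        grad, = torch.autograd.grad(clean_loss, embeds, retain_graph=True)
        # Perturb along the gradient direction, scaled to a fixed L2 norm.
        delta = epsilon * grad / (grad.norm(dim=-1, keepdim=True) + 1e-12)
        adv_loss = model.loss_from_embeddings(embeds + delta.detach(), labels)
        return clean_loss + adv_loss  # clean loss plus adversarial regularizer

The perturbation targets embeddings rather than raw tokens because text inputs are discrete, which is what makes the image-processing technique transferable to text.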

An Empirical Study of Incorporating Pseudo Data into Grammatical Error Correction

1 code implementation IJCNLP 2019 Shun Kiyono, Jun Suzuki, Masato Mita, Tomoya Mizumoto, Kentaro Inui

The incorporation of pseudo data into the training of grammatical error correction models has been one of the main factors in improving their performance (a minimal noising sketch follows this entry).

Grammatical Error Correction
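
For illustration, one common pseudo-data recipe in this line of work generates (noisy source, clean target) pairs by corrupting clean sentences with token-level noise; the sketch below is a minimal version with illustrative probabilities, not the paper's exact settings, and the paper also studies backtranslation-style generation.

    import random

    def corrupt(tokens, vocab, p_del=0.05, p_sub=0.05, p_mask=0.05):
        # Randomly delete, substitute, or mask tokens to simulate learner errors.
        out = []
        for tok in tokens:
            r = random.random()
            if r < p_del:
                continue                          # deletion
            elif r < p_del + p_sub:
                out.append(random.choice(vocab))  # substitution
            elif r < p_del + p_sub + p_mask:
                out.append("<mask>")              # masking
            else:
                out.append(tok)
        return out

    clean = "the cat sat on the mat".split()
    pseudo_pair = (corrupt(clean, vocab=clean), clean)  # (source, target)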

Riposte! A Large Corpus of Counter-Arguments

no code implementations 8 Oct 2019 Paul Reisert, Benjamin Heinzerling, Naoya Inoue, Shun Kiyono, Kentaro Inui

Counter-arguments (CAs), one form of constructive feedback, have been shown to be useful for developing critical thinking skills.

Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction

1 code implementation ACL 2020 Masahiro Kaneko, Masato Mita, Shun Kiyono, Jun Suzuki, Kentaro Inui

The answer to this question is not as straightforward as one might expect, because the commonly used methods for incorporating an MLM into an EncDec model have potential drawbacks when applied to GEC.

Grammatical Error Correction Language Modelling

An Empirical Study of Contextual Data Augmentation for Japanese Zero Anaphora Resolution

no code implementations COLING 2020 Ryuto Konno, Yuichiroh Matsubayashi, Shun Kiyono, Hiroki Ouchi, Ryo Takahashi, Kentaro Inui

This study addresses two underexplored issues in contextual data augmentation (CDA): how to reduce the computational cost of data augmentation and how to ensure the quality of the generated data.

Data Augmentation Language Modelling +4

Pseudo Zero Pronoun Resolution Improves Zero Anaphora Resolution

1 code implementation EMNLP 2021 Ryuto Konno, Shun Kiyono, Yuichiroh Matsubayashi, Hiroki Ouchi, Kentaro Inui

Masked language models (MLMs) have contributed to drastic performance improvements with regard to zero anaphora resolution (ZAR).

SHAPE: Shifted Absolute Position Embedding for Transformers

1 code implementation 13 Sep 2021 Shun Kiyono, Sosuke Kobayashi, Jun Suzuki, Kentaro Inui

Position representation is crucial for building position-aware representations in Transformers (a minimal sketch of the shifted embedding follows this entry).

Position
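
As a rough sketch of the idea, the snippet below shifts absolute position indices by a single random offset shared across the whole sequence during training, which preserves relative distances while discouraging reliance on absolute positions; sizes and names here are illustrative assumptions, not the authors' configuration.

    import torch

    def shape_position_ids(seq_len, max_shift, training=True):
        # One offset k is shared by the whole sequence: relative distances
        # are unchanged, but absolute positions become uninformative.
        k = torch.randint(0, max_shift + 1, (1,)).item() if training else 0
        return torch.arange(seq_len) + k

    pos_emb = torch.nn.Embedding(512 + 256, 64)       # covers seq_len + max_shift
    ids = shape_position_ids(seq_len=512, max_shift=256)
    x = pos_emb(ids)                                  # added to token embeddings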

B2T Connection: Serving Stability and Performance in Deep Transformers

1 code implementation 1 Jun 2022 Sho Takase, Shun Kiyono, Sosuke Kobayashi, Jun Suzuki

Recent Transformers tend to use Pre-LN because training deep Post-LN Transformers (e.g., those with ten or more layers) is often unstable, resulting in useless models; a simplified block with the bottom-to-top connection follows this entry.

Text Generation
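
For orientation, here is a simplified Post-LN block with a bottom-to-top (B2T) shortcut that adds the block input back just before the final layer normalization, giving deep stacks a direct gradient path; this is a sketch under one reading of the paper, not the authors' code.

    import torch.nn as nn

    class B2TBlock(nn.Module):
        def __init__(self, d=512, heads=8):
            super().__init__()
            self.attn = nn.MultiheadAttention(d, heads, batch_first=True)
            self.ffn = nn.Sequential(nn.Linear(d, 4 * d), nn.ReLU(), nn.Linear(4 * d, d))
            self.ln1 = nn.LayerNorm(d)
            self.ln2 = nn.LayerNorm(d)

        def forward(self, x):
            # Standard Post-LN self-attention sublayer.
            h = self.ln1(x + self.attn(x, x, x, need_weights=False)[0])
            # B2T: the block input x is added back at the top, before ln2.
            return self.ln2(h + self.ffn(h) + x)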
