Search Results for author: Yuta Hitomi

Found 4 papers, 1 paper with code

Transformer-based Lexically Constrained Headline Generation

1 code implementation EMNLP 2021 Kosuke Yamada, Yuta Hitomi, Hideaki Tamori, Ryohei Sasano, Naoaki Okazaki, Kentaro Inui, Koichi Takeda

We also consider a new headline generation strategy that takes advantage of the controllable generation order of the Transformer.
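The abstract mentions lexically constrained generation, i.e. decoding that guarantees required phrases appear in the output. As a rough illustration only (this is not the authors' method, and `dummy_logits` is a hypothetical stand-in for a real Transformer's next-token scores), a toy constrained greedy decoder might block the end-of-sequence token until all constraint tokens have been emitted:

```python
# Toy sketch of lexically constrained greedy decoding.
# NOT the paper's algorithm: a real system would use a trained Transformer
# and a more principled search (e.g. constrained beam search).

def constrained_greedy_decode(score_fn, constraint, max_len=10, eos="</s>"):
    """Greedy decode, but refuse to emit `eos` while constraint tokens remain."""
    out = []
    for _ in range(max_len):
        scores = score_fn(out)  # hypothetical model call: prefix -> token scores
        tok = max(scores, key=scores.get)  # greedy pick, highest score first
        remaining = [t for t in constraint if t not in out]
        if tok == eos and remaining:
            tok = remaining[0]  # force the next unmet constraint token instead
        out.append(tok)
        if tok == eos:
            break
    return out

# Tiny fake "model" that always prefers ending early,
# so the constraint logic is what produces the phrase.
def dummy_logits(prefix):
    return {"</s>": 1.0, "cats": 0.5, "win": 0.4}

print(constrained_greedy_decode(dummy_logits, constraint=["cats", "win"]))
```

With this fake model the decoder emits the forced tokens "cats" and "win" before allowing "</s>"; the paper's contribution, by contrast, exploits the Transformer's controllable generation order rather than simple forced insertion.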

Headline Generation

A Large-Scale Multi-Length Headline Corpus for Analyzing Length-Constrained Headline Generation Model Evaluation

no code implementations WS 2019 Yuta Hitomi, Yuya Taguchi, Hideaki Tamori, Ko Kikuta, Jiro Nishitoba, Naoaki Okazaki, Kentaro Inui, Manabu Okumura

However, because there is no corpus of headlines of multiple lengths for a given article, previous research on controlling output length in headline generation has not discussed whether the system outputs could be adequately evaluated without multiple references of different lengths.

Headline Generation

Analyzing the Revision Logs of a Japanese Newspaper for Article Quality Assessment

no code implementations WS 2017 Hideaki Tamori, Yuta Hitomi, Naoaki Okazaki, Kentaro Inui

We address the issue of the quality of journalism and analyze daily article revision logs from a Japanese newspaper company.
