Search Results for author: Takuya Makino

Found 4 papers, 0 papers with code

Multi-Task Learning for Chemical Named Entity Recognition with Chemical Compound Paraphrasing

no code implementations • IJCNLP 2019 • Taiki Watanabe, Akihiro Tamura, Takashi Ninomiya, Takuya Makino, Tomoya Iwakura

We propose a method to improve named entity recognition (NER) for chemical compounds using multi-task learning by jointly training a chemical NER model and a chemical compound paraphrase model.

Multi-Task Learning • named-entity-recognition • +2
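
The abstract above describes joint training of a chemical NER model and a chemical compound paraphrase model. The sketch below is not the authors' architecture; it is a minimal hard-parameter-sharing setup, assuming a shared BiLSTM encoder, a token-classification head for NER, a simplified generation head for paraphrasing, and an illustrative mixing weight `alpha`, just to show how a joint loss over the two tasks can be formed.

```python
# Minimal multi-task learning sketch (PyTorch): a shared encoder with one head
# per task and a summed loss. All sizes, heads, and the `alpha` weight are
# illustrative assumptions, not the configuration from the paper.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Embedding + BiLSTM shared by the NER and paraphrase tasks."""
    def __init__(self, vocab_size, emb_dim=100, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)

    def forward(self, tokens):
        return self.lstm(self.embed(tokens))[0]      # (batch, seq, 2 * hidden)

class MultiTaskModel(nn.Module):
    def __init__(self, vocab_size, num_tags, hidden=128):
        super().__init__()
        self.encoder = SharedEncoder(vocab_size, hidden=hidden)
        self.ner_head = nn.Linear(2 * hidden, num_tags)     # token classification
        self.para_head = nn.Linear(2 * hidden, vocab_size)  # simplified generation head

    def forward(self, tokens):
        h = self.encoder(tokens)
        return self.ner_head(h), self.para_head(h)

def joint_loss(model, ner_batch, para_batch, alpha=0.5):
    """Sum of the two task losses; `alpha` is an assumed mixing weight."""
    ce = nn.CrossEntropyLoss(ignore_index=-100)
    ner_logits, _ = model(ner_batch["tokens"])
    _, para_logits = model(para_batch["tokens"])
    l_ner = ce(ner_logits.flatten(0, 1), ner_batch["tags"].flatten())
    l_para = ce(para_logits.flatten(0, 1), para_batch["targets"].flatten())
    return l_ner + alpha * l_para

# Toy usage with random data, only to show the shapes involved.
model = MultiTaskModel(vocab_size=5000, num_tags=9)
ner_batch = {"tokens": torch.randint(1, 5000, (4, 20)),
             "tags": torch.randint(0, 9, (4, 20))}
para_batch = {"tokens": torch.randint(1, 5000, (4, 20)),
              "targets": torch.randint(0, 5000, (4, 20))}
joint_loss(model, ner_batch, para_batch).backward()
```

The design choice illustrated here is hard parameter sharing: gradients from the paraphrase objective update the same encoder used for NER, which is the mechanism by which paraphrase supervision can help entity recognition.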

Global Optimization under Length Constraint for Neural Text Summarization

no code implementations • ACL 2019 • Takuya Makino, Tomoya Iwakura, Hiroya Takamura, Manabu Okumura

The experimental results show that a state-of-the-art neural summarization model optimized with GOLC generates fewer overlength summaries while maintaining the fastest processing speed; only 6.70% overlength summaries on CNN/Daily Mail and 7.8% on the long summaries of Mainichi, compared to approximately 20% to 50% on CNN/Daily Mail and 10% to 30% on Mainichi with the other optimization methods.

Document Summarization
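
The abstract above concerns optimizing a summarizer under a length constraint. The sketch below is not the GOLC objective itself; it is a minimal illustration, assuming a REINFORCE-style sequence-level estimator and a hypothetical `rouge_like` scorer, of how a length budget can be folded into the reward so that overlength outputs are discouraged.

```python
# Sequence-level training with a length-aware reward (illustrative sketch only;
# the reward shape, baseline, and `rouge_like` scorer are assumptions, not the
# exact objective from the paper).
import torch

def length_aware_reward(candidate, reference, max_len, rouge_like):
    """Quality score that is zeroed when the candidate (a token list)
    exceeds the length budget, penalizing overlength summaries."""
    quality = rouge_like(candidate, reference)
    return quality if len(candidate) <= max_len else 0.0

def sequence_level_loss(log_probs, rewards):
    """REINFORCE-style loss: -E[(reward - baseline) * log p(sampled summary)].
    `log_probs`: (batch,) summed token log-probs of sampled summaries.
    `rewards`:   (batch,) rewards from `length_aware_reward`."""
    rewards = torch.as_tensor(rewards, dtype=log_probs.dtype, device=log_probs.device)
    baseline = rewards.mean()   # simple variance-reduction baseline (assumption)
    return -((rewards - baseline) * log_probs).mean()

# Toy usage: two sampled summaries, the second one overlength, with a dummy scorer.
dummy_rouge = lambda c, r: len(set(c) & set(r)) / max(len(set(r)), 1)
cands = [["gene", "x", "inhibits", "y"], ["token"] * 40]
refs = [["gene", "x", "inhibits", "y"], ["short", "summary"]]
rewards = [length_aware_reward(c, r, max_len=30, rouge_like=dummy_rouge)
           for c, r in zip(cands, refs)]
log_probs = torch.tensor([-3.2, -5.1], requires_grad=True)
sequence_level_loss(log_probs, rewards).backward()
```

The point of the sketch is the coupling between reward and length: because an overlength sample earns no reward, the sequence-level gradient pushes probability mass away from summaries that exceed the budget.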
