no code implementations • BioNLP (ACL) 2022 • Taiki Watanabe, Tomoya Ichikawa, Akihiro Tamura, Tomoya Iwakura, Chunpeng Ma, Tsuneo Kato
Multi-task learning, which trains a single model on multiple training datasets, has been used as one approach to improving NER.
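A minimal sketch of this kind of multi-task setup, assuming a shared encoder with one tagging head per training dataset; all names and dimensions (MultiTaskNER, vocabulary size, tag set sizes) are illustrative, not the paper's architecture:

```python
import torch
import torch.nn as nn

class MultiTaskNER(nn.Module):
    def __init__(self, vocab_size, hidden_dim, num_tags_per_task):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_dim)
        # Shared BiLSTM encoder: its parameters are updated by
        # examples from every training dataset.
        self.encoder = nn.LSTM(hidden_dim, hidden_dim // 2,
                               batch_first=True, bidirectional=True)
        # One classification head per dataset (task).
        self.heads = nn.ModuleList(
            nn.Linear(hidden_dim, n) for n in num_tags_per_task)

    def forward(self, token_ids, task_id):
        h, _ = self.encoder(self.embed(token_ids))
        return self.heads[task_id](h)  # per-token tag logits

# Two hypothetical NER corpora with 9 and 5 tag types respectively.
model = MultiTaskNER(vocab_size=30000, hidden_dim=256,
                     num_tags_per_task=[9, 5])
logits = model(torch.randint(0, 30000, (2, 16)), task_id=0)
```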
no code implementations • RANLP 2021 • Satoshi Hiai, Kazutaka Shimada, Taiki Watanabe, Akiva Miura, Tomoya Iwakura
In addition, our method extracts approximately three times faster than the BERT-based models on the ChemProt corpus and uses one sixth of their memory.
no code implementations • IJCNLP 2019 • Taiki Watanabe, Akihiro Tamura, Takashi Ninomiya, Takuya Makino, Tomoya Iwakura
We propose a method to improve named entity recognition (NER) for chemical compounds using multi-task learning, jointly training a chemical NER model and a chemical compound paraphrase model.
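A minimal sketch of such joint training as a weighted sum of the two task losses over a shared encoder; the weight lam, the module shapes, and the paraphrase head as a simple token-level LM are assumptions for illustration, not the paper's exact design:

```python
import torch
import torch.nn as nn

embed = nn.Embedding(30000, 256)
encoder = nn.LSTM(256, 128, batch_first=True, bidirectional=True)
ner_head = nn.Linear(256, 9)        # NER tag logits per token
para_head = nn.Linear(256, 30000)   # paraphrase-token logits per position
ce = nn.CrossEntropyLoss()

def joint_loss(tokens, ner_tags, para_targets, lam=0.5):
    # Both tasks read the same shared representation, so paraphrase
    # supervision shapes the features the NER tagger uses.
    h, _ = encoder(embed(tokens))
    ner = ce(ner_head(h).flatten(0, 1), ner_tags.flatten())
    para = ce(para_head(h).flatten(0, 1), para_targets.flatten())
    return ner + lam * para  # multi-task objective

loss = joint_loss(torch.randint(0, 30000, (2, 16)),
                  torch.randint(0, 9, (2, 16)),
                  torch.randint(0, 30000, (2, 16)))
```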
no code implementations • IJCNLP 2017 • Taiki Watanabe, Akihiro Tamura, Takashi Ninomiya
This paper proposes a new attention mechanism for neural machine translation (NMT), based on convolutional neural networks (CNNs) and inspired by the CKY algorithm.
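A minimal sketch of attention in this spirit, where stacked width-2 convolutions combine adjacent source states bottom-up, loosely mirroring how CKY builds larger spans from smaller ones, and the decoder attends over span representations from every level. The layer count, dimensions, and function names are assumptions, not the paper's mechanism:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

dim, levels = 256, 3
convs = nn.ModuleList(nn.Conv1d(dim, dim, kernel_size=2)
                      for _ in range(levels))

def cky_style_attention(src_states, dec_state):
    # src_states: (T, dim) encoder outputs; dec_state: (dim,) decoder query.
    spans = [src_states]                 # level 0: single-token "spans"
    h = src_states.t().unsqueeze(0)      # (1, dim, T) for Conv1d
    for conv in convs:
        h = torch.relu(conv(h))          # each level merges adjacent spans
        spans.append(h.squeeze(0).t())   # (T - level, dim) span states
    chart = torch.cat(spans, dim=0)      # all spans from all levels
    scores = F.softmax(chart @ dec_state, dim=0)
    return scores @ chart                # attention context vector

ctx = cky_style_attention(torch.randn(10, dim), torch.randn(dim))
```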