no code implementations • BioNLP (ACL) 2022 • Naoki Iinuma, Makoto Miwa, Yutaka Sasaki
To overcome this issue, our methods indirectly utilize distant supervision data with manually annotated training data.
no code implementations • BioNLP (ACL) 2022 • Hai-Long Trieu, Makoto Miwa, Sophia Ananiadou
Cancer immunology research involves several important cell and protein factors.
no code implementations • 6 Jun 2024 • Kohei Makino, Makoto Miwa, Yutaka Sasaki
Furthermore, analysis of the instances retrieved by the end-to-end trained retriever confirms that they share relation labels or entities with the query and are specialized for the target task.
1 code implementation • 10 Feb 2023 • Nhung T. H. Nguyen, Makoto Miwa, Sophia Ananiadou
For one type of IB model, we incorporate two unsupervised generative components, span reconstruction and synonym generation, into a span-based NER system.
1 code implementation • ACL 2022 • Jake Vasilakes, Chrysoula Zerva, Makoto Miwa, Sophia Ananiadou
Negation and uncertainty modeling are long-standing tasks in natural language processing.
no code implementations • 28 Sep 2021 • Hayato Futase, Tomoki Tsujimura, Tetsuya Kajimoto, Hajime Kawarazaki, Toshiyuki Suzuki, Makoto Miwa, Yutaka Sasaki
Furthermore, it is difficult to generate changes at a specific time, and the generated changes often do not match actual changes.
1 code implementation • 27 Jun 2021 • Fusataka Kuniyoshi, Jun Ozawa, Makoto Miwa
In the field of inorganic materials science, there is a growing demand to extract knowledge such as physical properties and synthesis processes of materials by machine-reading a large number of papers.
1 code implementation • Findings (ACL) 2021 • Kohei Makino, Makoto Miwa, Yutaka Sasaki
In this paper, we propose a novel edge-editing approach to extract relation information from a document.
no code implementations • NAACL 2021 • Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou
We propose a multi-task, probabilistic approach to facilitate distantly supervised relation extraction by bringing closer the representations of sentences that contain the same Knowledge Base pairs.
Ranked #4 on Relation Extraction on NYT Corpus
1 code implementation • Workshop on Noisy User-generated Text 2020 • Mohammad Golam Sohrab, Anh-Khoa Duong Nguyen, Makoto Miwa, Hiroya Takamura
In the relation extraction task, we achieved an F-score of 80.46%, making ours the top system in the relation extraction or recognition task.
Ranked #1 on Named Entity Recognition (NER) on WNUT 2020
1 code implementation • 24 Oct 2020 • Masaki Asada, Makoto Miwa, Yutaka Sasaki
Specifically, we focus on drug description and molecular structure information as the drug database information.
no code implementations • EMNLP 2020 • Mohammad Golam Sohrab, Khoa Duong, Makoto Miwa, Goran Topić, Masami Ikeda, Hiroya Takamura
We present a biomedical entity linking (EL) system, BENNERD, that detects named entities in text and links them to Unified Medical Language System (UMLS) knowledge base (KB) entries to facilitate coronavirus disease 2019 (COVID-19) research.
1 code implementation • 17 Jun 2020 • Hai-Long Trieu, Thy Thy Tran, Khoa N A Duong, Anh Nguyen, Makoto Miwa, Sophia Ananiadou
Motivation: Recent neural approaches to event extraction from text mainly focus on flat events in the general domain, while there have been fewer attempts to detect nested and overlapping events.
Ranked #1 on Event Extraction on GENIA 2013
no code implementations • LREC 2020 • Savong Bou, Naoki Suzuki, Makoto Miwa, Yutaka Sasaki
In contrast, in our OSR annotation, a relation is annotated as a relation mention (i.e., not a link but a node), and domain and range links are annotated from the relation mention to its argument entity mentions.
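The node-centric representation described above can be sketched as a small data structure: the relation is itself a mention with its own span, holding explicit domain and range links to its argument entity mentions. Class and field names below are illustrative, not taken from the paper.

```python
# Sketch of an OSR-style annotation: a relation is a node (a mention
# with its own text span), not a link, and it carries domain/range
# links to its argument entity mentions. Names are hypothetical.
from dataclasses import dataclass

@dataclass
class Mention:
    start: int   # token offsets in the document
    end: int
    label: str

@dataclass
class OSRRelation:
    mention: Mention   # the relation's own span (e.g. "binds to")
    domain: Mention    # link to the subject entity mention
    range: Mention     # link to the object entity mention

# "IL-2 binds to its receptor"
il2      = Mention(0, 1, "Protein")
binds_to = Mention(1, 3, "bind")
receptor = Mention(3, 5, "Protein")
rel = OSRRelation(mention=binds_to, domain=il2, range=receptor)
print(rel.mention.label, "->", rel.domain.label, rel.range.label)
```

Because the relation is a first-class node, its surface span can itself be annotated and retrieved, unlike link-style schemes where the relation exists only as an edge between two entities.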
no code implementations • LREC 2020 • Fusataka Kuniyoshi, Kohei Makino, Jun Ozawa, Makoto Miwa
In this work, we present a novel corpus of the synthesis process for all-solid-state batteries and an automated machine reading system for extracting the synthesis processes buried in the scientific literature.
no code implementations • WS 2019 • Hai-Long Trieu, Anh-Khoa Duong Nguyen, Nhung Nguyen, Makoto Miwa, Hiroya Takamura, Sophia Ananiadou
Additionally, the proposed model is able to detect coreferent pairs in long distances, even with a distance of more than 200 sentences.
no code implementations • WS 2019 • Mohammad Golam Sohrab, Minh Thang Pham, Makoto Miwa, Hiroya Takamura
We present a neural pipeline approach for the PharmaCoNER shared task on pharmaceutical drugs and chemical entities: it performs named entity recognition (NER) and concept indexing (CI), which links the recognized entities to concept unique identifiers (CUIs) in a knowledge base.
no code implementations • IJCNLP 2019 • Kurt Espinosa, Makoto Miwa, Sophia Ananiadou
We tackle the nested and overlapping event detection task and propose a novel search-based neural network (SBNN) structured prediction model that treats the task as a search problem on a relation graph of trigger-argument structures.
1 code implementation • IJCNLP 2019 • Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou
We thus propose an edge-oriented graph neural model for document-level relation extraction.
no code implementations • ACL 2019 • Sunil Kumar Sahu, Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou
Inter-sentence relation extraction deals with a number of complex semantic relationships in documents, which require modeling local, non-local, syntactic, and semantic dependencies.
1 code implementation • ACL 2018 • Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou
We present a novel graph-based neural network model for relation extraction.
Ranked #1 on Relation Extraction on ACE 2005 (Cross Sentence metric)
no code implementations • EMNLP 2018 • Mohammad Golam Sohrab, Makoto Miwa
We propose a simple deep neural model for nested named entity recognition (NER).
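One common way to make a model handle nested entities is to enumerate every candidate span up to a maximum width and classify each one independently, so overlapping entities fall out naturally. A minimal sketch of that idea, with a toy lookup standing in for a neural span classifier (the lexicon and names are hypothetical, not from the paper):

```python
# Exhaustive-span view of nested NER: enumerate all spans up to
# max_width and classify each independently; nested entities like
# "IL-2" inside "IL-2 gene" can both be predicted.
def enumerate_spans(tokens, max_width=3):
    return [(i, j) for i in range(len(tokens))
                   for j in range(i + 1, min(i + max_width, len(tokens)) + 1)]

def classify(tokens, span, lexicon):
    """Toy span classifier: dictionary lookup; None means non-entity."""
    i, j = span
    return lexicon.get(" ".join(tokens[i:j]))

LEXICON = {"IL-2": "PROTEIN", "IL-2 gene": "DNA"}  # hypothetical
tokens = ["the", "IL-2", "gene"]
entities = [(s, classify(tokens, s, LEXICON))
            for s in enumerate_spans(tokens)
            if classify(tokens, s, LEXICON)]
print(entities)  # → [((1, 2), 'PROTEIN'), ((1, 3), 'DNA')]
```

Enumerating spans is quadratic in sentence length, which is why such models usually cap the maximum span width.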
no code implementations • WS 2018 • Hai-Long Trieu, Nhung T. H. Nguyen, Makoto Miwa, Sophia Ananiadou
Existing biomedical coreference resolution systems depend on features and/or rules based on syntactic parsers.
1 code implementation • NAACL 2018 • Meizhi Ju, Makoto Miwa, Sophia Ananiadou
Each flat NER layer is based on a state-of-the-art flat NER model that captures sequential context representation with a bidirectional Long Short-Term Memory (LSTM) layer and feeds it to a cascaded CRF layer.
Ranked #10 on Nested Mention Recognition on ACE 2005
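The layered idea behind stacking flat NER layers can be sketched without any neural machinery: run a flat tagger, collapse each detected entity span into a single unit, and re-run the tagger on the shortened sequence until a layer finds nothing new. The "tagger" below is a toy longest-match lookup standing in for a BiLSTM-CRF layer; lexicon entries and names are hypothetical.

```python
# Sketch of layered nested NER: each pass is one flat-NER "layer";
# detected spans are merged into single units so the next layer can
# recognize entities that contain them.
def find_entities(tokens, lexicon):
    """One flat layer: greedy longest-match, non-overlapping spans."""
    spans, i = [], 0
    while i < len(tokens):
        best = None
        for j in range(len(tokens), i, -1):
            label = lexicon.get(tuple(tokens[i:j]))
            if label:
                best = (i, j, label)
                break
        if best:
            spans.append(best)
            i = best[1]
        else:
            i += 1
    return spans

def merge(tokens, spans):
    """Collapse each detected span into a single unit for the next layer."""
    span_at = {s: e for s, e, _ in spans}
    out, i = [], 0
    while i < len(tokens):
        if i in span_at:
            out.append(" ".join(tokens[i:span_at[i]]))
            i = span_at[i]
        else:
            out.append(tokens[i])
            i += 1
    return out

def nested_ner(tokens, lexicon):
    """Stack flat layers until a layer detects no new entities."""
    found = []
    while True:
        spans = find_entities(tokens, lexicon)
        if not spans:
            return found
        found.extend((" ".join(tokens[s:e]), lbl) for s, e, lbl in spans)
        tokens = merge(tokens, spans)

# Hypothetical toy lexicon standing in for trained layers:
LEXICON = {
    ("interleukin", "2"): "PROTEIN",
    ("interleukin 2", "gene"): "DNA",   # matches after the inner merge
}
print(nested_ner(["the", "interleukin", "2", "gene"], LEXICON))
# → [('interleukin 2', 'PROTEIN'), ('interleukin 2 gene', 'DNA')]
```

The termination condition (a layer that finds nothing) means the number of layers adapts to the actual nesting depth of each sentence rather than being fixed in advance.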
no code implementations • ACL 2018 • Masaki Asada, Makoto Miwa, Yutaka Sasaki
We propose a novel neural method to extract drug-drug interactions (DDIs) from texts using external drug molecular structure information.
no code implementations • IJCNLP 2017 • Yota Toyama, Makoto Miwa, Yutaka Sasaki
We propose a novel method that exploits visual information of ideograms and logograms in analyzing Japanese review documents.
no code implementations • IJCNLP 2017 • Satoshi Yawata, Makoto Miwa, Yutaka Sasaki, Daisuke Hara
We define a fine-grained feature set based on the hand-coded syllables and train a logistic regression classifier on labeled syllables, expecting to find discriminative features from the trained classifier.
no code implementations • WS 2017 • Masaki Asada, Makoto Miwa, Yutaka Sasaki
We propose a novel attention mechanism for a Convolutional Neural Network (CNN)-based Drug-Drug Interaction (DDI) extraction model.
no code implementations • SEMEVAL 2017 • Tomoki Tsujimura, Makoto Miwa, Yutaka Sasaki
This paper describes our TTI-COIN system that participated in SemEval-2017 Task 10.
no code implementations • 16 Jun 2017 • Takuma Yoneda, Koki Mori, Makoto Miwa, Yutaka Sasaki
We propose a novel embedding model that represents relationships among several elements in bibliographic information with high representation ability and flexibility.
no code implementations • EACL 2017 • Takuma Yoneda, Koki Mori, Makoto Miwa, Yutaka Sasaki
We propose a novel embedding model that represents relationships among several elements in bibliographic information with high representation ability and flexibility.
no code implementations • COLING 2016 • Josuke Yamane, Tomoya Takatani, Hitoshi Yamada, Makoto Miwa, Yutaka Sasaki
Most recent hypernym detection models focus on a hypernymy classification problem that determines whether a pair of words is in a hypernymy relation or not.
no code implementations • LREC 2016 • Yannis Korkontzelos, Beverley Thomas, Makoto Miwa, Sophia Ananiadou
Classifying research grants into useful categories is a vital task for a funding body to give structure to the portfolio for analysis, informing strategic planning and decision-making.
2 code implementations • ACL 2016 • Makoto Miwa, Mohit Bansal
We present a novel end-to-end neural model to extract entities and relations between them.
Ranked #1 on Relation Extraction on ACE 2005 (Sentence Encoder metric)
no code implementations • CONLL 2015 • Kazuma Hashimoto, Pontus Stenetorp, Makoto Miwa, Yoshimasa Tsuruoka
We present a novel learning method for word embeddings designed for relation classification.