Search Results for author: Andrea Pierleoni

Found 7 papers, 2 papers with code

REXEL: An End-to-end Model for Document-Level Relation Extraction and Entity Linking

no code implementations 19 Apr 2024 Nacime Bouziani, Shubhi Tyagi, Joseph Fisher, Jens Lehmann, Andrea Pierleoni

Extracting structured information from unstructured text is critical for many downstream NLP applications and is traditionally achieved by closed information extraction (cIE).

Coreference Resolution · Document-level Closed Information Extraction · +7
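As a rough illustration of what cIE output looks like, the sketch below models a single fully linked fact, with the subject, relation, and object all resolved to Wikidata identifiers. The class and field names are hypothetical and are not taken from REXEL.

```python
# Illustrative only: the kind of fully linked fact a cIE system produces.
# Class and field names are hypothetical, not REXEL's actual API.
from dataclasses import dataclass

@dataclass
class LinkedFact:
    subject_qid: str    # Wikidata QID of the subject entity
    relation_pid: str   # Wikidata property ID of the relation
    object_qid: str     # Wikidata QID of the object entity
    evidence: str       # sentence the fact was extracted from

fact = LinkedFact(
    subject_qid="Q937",   # Albert Einstein
    relation_pid="P19",   # place of birth
    object_qid="Q3012",   # Ulm
    evidence="Albert Einstein was born in Ulm.",
)
print(fact)
```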

WebIE: Faithful and Robust Information Extraction on the Web

no code implementations 23 May 2023 Chenxi Whitehouse, Clara Vania, Alham Fikri Aji, Christos Christodoulopoulos, Andrea Pierleoni

We evaluate the in-domain, out-of-domain, and zero-shot cross-lingual performance of generative IE models and find that models trained on WebIE show better generalisability.

Entity Linking

ReFinED: An Efficient Zero-shot-capable Approach to End-to-End Entity Linking

2 code implementations NAACL (ACL) 2022 Tom Ayoola, Shubhi Tyagi, Joseph Fisher, Christos Christodoulopoulos, Andrea Pierleoni

The model is capable of generalising to large-scale knowledge bases such as Wikidata (which has 15 times more entities than Wikipedia) and of zero-shot entity linking.

 Ranked #1 on Entity Linking on WebQSP-WD (using extra training data)

Entity Disambiguation · Entity Linking · +1
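The paper's code is released as the ReFinED package; the sketch below follows the usage pattern shown in the public repository, though the model name, entity set, and printed span representation are assumptions that may differ between released versions.

```python
# Sketch of using the released ReFinED package for end-to-end entity linking.
# Model name and entity set follow the public repository's examples and are
# assumptions; they may differ between released versions.
from refined.inference.processor import Refined

refined = Refined.from_pretrained(
    model_name="wikipedia_model_with_numbers",
    entity_set="wikipedia",
)

spans = refined.process_text("England won the FIFA World Cup in 1966.")
for span in spans:
    print(span)  # each span pairs a mention with its predicted KB entity
```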

Improving Entity Disambiguation by Reasoning over a Knowledge Base

2 code implementations NAACL 2022 Tom Ayoola, Joseph Fisher, Andrea Pierleoni

Recent work in entity disambiguation (ED) has typically neglected structured knowledge base (KB) facts, and instead relied on a limited subset of KB information, such as entity descriptions or types.

Entity Disambiguation
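As a toy illustration of the general idea of bringing KB facts into disambiguation (not the architecture proposed in the paper), the sketch below scores candidate entities for a mention by counting knowledge-base facts that connect them to entities already resolved elsewhere in the document. The fact set and the second candidate ID are invented for the example.

```python
# Toy illustration (not the paper's model): score ED candidates by how many
# knowledge-base facts connect them to entities already resolved in the
# same document. The KB facts below are invented for the example.
KB_FACTS = {
    ("Q90", "P17", "Q142"),      # Paris (city) -> country -> France
    ("Q90", "P1376", "Q142"),    # Paris (city) -> capital of -> France
    ("Q36161", "P27", "Q30"),    # invented fact for another "Paris" candidate
}

def kb_consistency_score(candidate, context_entities):
    """Count facts whose subject/object pair links the candidate to a context entity."""
    return sum(
        1
        for other in context_entities
        for s, _, o in KB_FACTS
        if {s, o} == {candidate, other}
    )

# Disambiguate the mention "Paris" given that "France" (Q142) appears in the document.
candidates = ["Q90", "Q36161"]   # Paris (city) vs. an invented alternative
context = ["Q142"]
best = max(candidates, key=lambda c: kb_consistency_score(c, context))
print(best)  # -> "Q90" under this toy KB
```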

DARE: Data Augmented Relation Extraction with GPT-2

no code implementations 6 Apr 2020 Yannis Papanikolaou, Andrea Pierleoni

Real-world Relation Extraction (RE) tasks are challenging, owing either to limited training data or to class imbalance.

Relation · Relation Extraction
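DARE fine-tunes GPT-2 on the training instances of each relation class and samples synthetic examples to rebalance the data. The sketch below only illustrates the sampling step with an off-the-shelf GPT-2 from Hugging Face Transformers (no fine-tuning), so it is an approximation of the idea rather than the paper's pipeline; the relation class and seed prompt are invented.

```python
# Illustrative sketch of GPT-2-based data augmentation for relation
# extraction. DARE fine-tunes GPT-2 per relation class; this sketch skips
# fine-tuning and only shows the generation step with off-the-shelf GPT-2.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Seed text in the style of the under-represented relation class "founded_by".
prompt = "Microsoft was founded by Bill Gates. Amazon was founded by"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=True,                       # sampling gives diverse synthetic sentences
    top_p=0.9,
    num_return_sequences=3,
    pad_token_id=tokenizer.eos_token_id,  # silence the missing-pad-token warning
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```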

Deep Bidirectional Transformers for Relation Extraction without Supervision

no code implementations WS 2019 Yannis Papanikolaou, Ian Roberts, Andrea Pierleoni

We present a novel framework for relation extraction tasks where supervision is entirely absent, whether in the form of gold annotations or of relations from a knowledge base.

Language Modelling · Relation · +2

Reasoning Over Paths via Knowledge Base Completion

no code implementations WS 2019 Saatviga Sudhahar, Ian Roberts, Andrea Pierleoni

We demonstrate that our method effectively ranks a list of known paths between a pair of entities and also proposes plausible paths that are not present in the knowledge graph.

Knowledge Base Completion · Knowledge Graphs
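As a minimal illustration of the path-ranking setting (not the learned scorer used in the paper), the sketch below enumerates the simple paths between two entities in a toy knowledge graph and ranks them with a placeholder length-based score.

```python
# Minimal illustration of ranking paths between two entities in a knowledge
# graph. The paper uses a learned scorer; here a placeholder length-based
# score stands in, and the toy graph is invented.
import networkx as nx

kg = nx.DiGraph()
kg.add_edge("Einstein", "Ulm", relation="born_in")
kg.add_edge("Ulm", "Germany", relation="located_in")
kg.add_edge("Einstein", "Germany", relation="citizen_of")

source, target = "Einstein", "Germany"
paths = list(nx.all_simple_paths(kg, source, target, cutoff=3))

# Placeholder scoring: prefer shorter paths; a trained model would replace this.
for path in sorted(paths, key=len):
    relations = [kg[u][v]["relation"] for u, v in zip(path, path[1:])]
    print(" -> ".join(path), relations)
```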
