Search Results for author: Steven Schockaert

Found 58 papers, 18 papers with code

Combining BERT with Static Word Embeddings for Categorizing Social Media

no code implementations EMNLP (WNUT) 2020 Israa Alghanmi, Luis Espinosa Anke, Steven Schockaert

A particularly striking example is the performance of AraBERT, an LM for the Arabic language, which is successful in categorizing social media posts in Arabic dialects, despite only having been trained on Modern Standard Arabic.

Natural Language Processing Word Embeddings

Distilling Relation Embeddings from Pre-trained Language Models

1 code implementation 21 Sep 2021 Asahi Ushio, Jose Camacho-Collados, Steven Schockaert

Among others, this makes it possible to distill high-quality word vectors from pre-trained language models.

Knowledge Graphs Language Modelling +1

Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection

1 code implementation ACL (RepL4NLP) 2021 Yixiao Wang, Zied Bouraoui, Luis Espinosa Anke, Steven Schockaert

Second, rather than learning a word vector directly, we use a topic model to partition the contexts in which words appear, and then learn different topic-specific vectors for each word.
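
The mention-selection step described here can be sketched as follows: given contextualised vectors for a word's mentions and a topic label for each mention's context, average the mention vectors per topic. The function name and the shape of the inputs are illustrative, not taken from the paper's code.

```python
import numpy as np

def topic_specific_vectors(mention_vecs, topics):
    """mention_vecs: (n_mentions, dim) contextualised vectors of one word;
    topics: length-n_mentions sequence of topic ids for each mention's context.
    Returns one averaged vector per topic (a minimal sketch)."""
    vectors = {}
    for t in sorted(set(topics)):
        mask = np.fromiter((tt == t for tt in topics), dtype=bool)
        vectors[t] = mention_vecs[mask].mean(axis=0)  # average mentions in topic t
    return vectors
```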

Word Embeddings

Probing Pre-Trained Language Models for Disease Knowledge

1 code implementation Findings (ACL) 2021 Israa Alghanmi, Luis Espinosa-Anke, Steven Schockaert

Pre-trained language models such as ClinicalBERT have achieved impressive results on tasks such as medical Natural Language Inference.

Natural Language Inference

Few-shot Image Classification with Multi-Facet Prototypes

no code implementations 1 Feb 2021 Kun Yan, Zied Bouraoui, Ping Wang, Shoaib Jameel, Steven Schockaert

The aim of few-shot learning (FSL) is to learn how to recognize image categories from a small number of training examples.

Classification Few-Shot Image Classification +1

Modelling General Properties of Nouns by Selectively Averaging Contextualised Embeddings

1 code implementation 4 Dec 2020 Na Li, Zied Bouraoui, Jose Camacho-Collados, Luis Espinosa-Anke, Qing Gu, Steven Schockaert

While the success of pre-trained language models has largely eliminated the need for high-quality static word vectors in many NLP applications, such vectors continue to play an important role in tasks where words need to be modelled in the absence of linguistic context.
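
One way to picture "selectively averaging" contextualised embeddings is to keep only the mentions closest to the overall centroid before averaging. The cosine-based `keep` fraction below is a hypothetical filtering rule for illustration, not the paper's actual selection criterion.

```python
import numpy as np

def selectively_average(mention_vecs, keep=0.5):
    """Average only the 'keep' fraction of mentions closest (by cosine
    similarity) to the centroid of all mentions. Sketch only."""
    centroid = mention_vecs.mean(axis=0)
    sims = mention_vecs @ centroid / (
        np.linalg.norm(mention_vecs, axis=1) * np.linalg.norm(centroid))
    n_keep = max(1, int(keep * len(mention_vecs)))
    idx = np.argsort(-sims)[:n_keep]  # indices of the most prototypical mentions
    return mention_vecs[idx].mean(axis=0)
```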

Knowledge Base Completion

A Mixture-of-Experts Model for Learning Multi-Facet Entity Embeddings

1 code implementation COLING 2020 Rana Alshaikh, Zied Bouraoui, Shelan Jeawak, Steven Schockaert

This is exploited by an associated gating network, which uses pre-trained word vectors to encourage the properties that are modelled by a given embedding to be semantically coherent, i.e. to encourage each of the individual embeddings to capture a meaningful facet.
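
The gating step of a mixture-of-experts setup can be sketched as a linear layer over a pre-trained word vector followed by a softmax, giving a soft assignment of the word to K facet-specific embeddings. The parameter name `gating_matrix` is illustrative.

```python
import numpy as np

def gate_weights(word_vec, gating_matrix):
    """Soft assignment of a word to K facets: a linear gating layer
    (gating_matrix is K x dim, a hypothetical parameter) plus softmax."""
    scores = gating_matrix @ word_vec
    e = np.exp(scores - scores.max())  # numerically stable softmax
    return e / e.sum()
```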

Entity Embeddings

Don't Patronize Me! An Annotated Dataset with Patronizing and Condescending Language towards Vulnerable Communities

no code implementations COLING 2020 Carla Pérez-Almendros, Luis Espinosa-Anke, Steven Schockaert

In this paper, we introduce a new annotated dataset which is aimed at supporting the development of NLP models to identify and categorize language that is patronizing or condescending towards vulnerable communities (e.g. refugees, homeless people, poor families).

Plausible Reasoning about EL-Ontologies using Concept Interpolation

no code implementations 25 Jun 2020 Yazmín Ibáñez-García, Víctor Gutiérrez-Basulto, Steven Schockaert

In this paper, we instead propose an inductive inference mechanism which is based on a clear model-theoretic semantics, and can thus be tightly integrated with standard deductive reasoning.

Modelling Semantic Categories using Conceptual Neighborhood

no code implementations 3 Dec 2019 Zied Bouraoui, Jose Camacho-Collados, Luis Espinosa-Anke, Steven Schockaert

Unfortunately, meaningful regions can be difficult to estimate, especially since we often have few examples of individuals that belong to a given category.

Natural Language Processing

Inducing Relational Knowledge from BERT

no code implementations 28 Nov 2019 Zied Bouraoui, Jose Camacho-Collados, Steven Schockaert

Starting from a few seed instances of a given relation, we first use a large text corpus to find sentences that are likely to express this relation.

Language Modelling Natural Language Processing +1

Learning Conceptual Spaces with Disentangled Facets

no code implementations CONLL 2019 Rana Alshaikh, Zied Bouraoui, Steven Schockaert

To address this gap, we analyze how, and to what extent, a given vector space embedding can be decomposed into meaningful facets in an unsupervised fashion.

Natural Language Processing Word Embeddings

Meemi: A Simple Method for Post-processing and Integrating Cross-lingual Word Embeddings

1 code implementation 16 Oct 2019 Yerai Doval, Jose Camacho-Collados, Luis Espinosa-Anke, Steven Schockaert

While monolingual word embeddings encode information about words in the context of a particular language, cross-lingual embeddings define a multilingual space where word embeddings from two or more languages are integrated together.

Cross-Lingual Natural Language Inference Cross-Lingual Word Embeddings +4

Learning Household Task Knowledge from WikiHow Descriptions

1 code implementation WS 2019 Yilun Zhou, Julie A. Shah, Steven Schockaert

Commonsense procedural knowledge is important for AI agents and robots that operate in a human environment.

On the Robustness of Unsupervised and Semi-supervised Cross-lingual Word Embedding Learning

no code implementations LREC 2020 Yerai Doval, Jose Camacho-Collados, Luis Espinosa-Anke, Steven Schockaert

Cross-lingual word embeddings are vector representations of words in different languages where words with similar meaning are represented by similar vectors, regardless of the language.

Cross-Lingual Word Embeddings Word Embeddings

Collocation Classification with Unsupervised Relation Vectors

1 code implementation ACL 2019 Luis Espinosa Anke, Steven Schockaert, Leo Wanner

Lexical relation classification is the task of predicting whether a certain relation holds between a given pair of words.

Classification General Classification +2

Word and Document Embedding with vMF-Mixture Priors on Context Word Vectors

no code implementations ACL 2019 Shoaib Jameel, Steven Schockaert

To this end, our model relies on the assumption that context word vectors are drawn from a mixture of von Mises-Fisher (vMF) distributions, where the parameters of this mixture distribution are jointly optimized with the word vectors.
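
For a mixture of von Mises-Fisher distributions, the posterior responsibility of each component for a unit vector is easy to sketch when the concentration parameter is shared, since the vMF normalising constants then cancel. This simplifying assumption is ours, not the paper's.

```python
import numpy as np

def vmf_responsibilities(x, mus, kappa, weights):
    """Posterior over vMF mixture components for a unit vector x, assuming
    a shared concentration kappa so the normalisers cancel (sketch only).
    mus: (K, dim) unit mean directions; weights: (K,) mixture weights."""
    logits = np.log(weights) + kappa * (mus @ x)
    e = np.exp(logits - logits.max())  # stable softmax over components
    return e / e.sum()
```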

Document Embedding

Relational Word Embeddings

1 code implementation ACL 2019 Jose Camacho-Collados, Luis Espinosa-Anke, Steven Schockaert

While word embeddings have been shown to implicitly encode various forms of attributional knowledge, the extent to which they capture relational information is far more limited.

Word Embeddings

Predicting ConceptNet Path Quality Using Crowdsourced Assessments of Naturalness

1 code implementation 21 Feb 2019 Yilun Zhou, Steven Schockaert, Julie A. Shah

In this paper we instead propose to learn to predict path quality from crowdsourced human assessments.

Knowledge Graphs

Interpretable Emoji Prediction via Label-Wise Attention LSTMs

no code implementations EMNLP 2018 Francesco Barbieri, Luis Espinosa-Anke, Jose Camacho-Collados, Steven Schockaert, Horacio Saggion

Human language has evolved towards newer forms of communication such as social media, where emojis (i.e., ideograms bearing a visual meaning) play a key role.

Emotion Recognition Information Retrieval +3

SeVeN: Augmenting Word Embeddings with Unsupervised Relation Vectors

1 code implementation COLING 2018 Luis Espinosa-Anke, Steven Schockaert

For example, by examining clusters of relation vectors, we observe that relational similarities can be identified at a more abstract level than with traditional word vector differences.
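
The "traditional word vector differences" baseline this sentence contrasts with can be sketched directly: relational similarity between two word pairs is the cosine between their offset (difference) vectors. The helper name is illustrative.

```python
import numpy as np

def offset_similarity(pair_a, pair_b):
    """Cosine between the difference vectors of two word pairs: the classic
    vector-offset notion of relational similarity (a minimal sketch)."""
    d1 = pair_a[0] - pair_a[1]
    d2 = pair_b[0] - pair_b[1]
    return float(d1 @ d2 / (np.linalg.norm(d1) * np.linalg.norm(d2)))
```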

Text Categorization Word Embeddings +1

Knowledge Representation with Conceptual Spaces

no code implementations COLING 2018 Steven Schockaert

Conceptual spaces, as proposed by Gärdenfors, are similar to entity embeddings, but provide more structure.

Entity Embeddings Information Retrieval +2

Relation Induction in Word Embeddings Revisited

no code implementations COLING 2018 Zied Bouraoui, Shoaib Jameel, Steven Schockaert

Given a set of instances of some relation, the relation induction task is to predict which other word pairs are likely to be related in the same way.

Knowledge Base Completion Translation +1

Unsupervised Learning of Distributional Relation Vectors

no code implementations ACL 2018 Shoaib Jameel, Zied Bouraoui, Steven Schockaert

Word embedding models such as GloVe rely on co-occurrence statistics to learn vector representations of word meaning.
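
The co-occurrence statistics a GloVe-style model is fit to can be sketched as symmetric within-window counts over a token stream. This minimal version omits the inverse-distance weighting that GloVe's own counting uses.

```python
from collections import Counter

def cooccurrence_counts(tokens, window=2):
    """Symmetric within-window co-occurrence counts over a token list:
    the raw statistics behind GloVe-style models (sketch, no weighting)."""
    counts = Counter()
    for i, w in enumerate(tokens):
        for j in range(max(0, i - window), i):  # only look left, count both ways
            counts[(tokens[j], w)] += 1
            counts[(w, tokens[j])] += 1
    return counts
```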

Relation Extraction Word Embeddings

Syntactically Aware Neural Architectures for Definition Extraction

no code implementations NAACL 2018 Luis Espinosa-Anke, Steven Schockaert

Automatically identifying definitional knowledge in text corpora (Definition Extraction or DE) is an important task with direct applications in, among others, Automatic Glossary Generation, Taxonomy Learning, Question Answering and Semantic Search.

Classification Definition Extraction +4

From Knowledge Graph Embedding to Ontology Embedding? An Analysis of the Compatibility between Vector Space Representations and Rules

no code implementations 26 May 2018 Víctor Gutiérrez-Basulto, Steven Schockaert

To address this shortcoming, in this paper we introduce a general framework based on a view of relations as regions, which allows us to study the compatibility between ontological knowledge and different types of vector space embeddings.

Knowledge Graph Embedding Knowledge Graphs

Learning Conceptual Space Representations of Interrelated Concepts

no code implementations 3 May 2018 Zied Bouraoui, Steven Schockaert

Several recently proposed methods aim to learn conceptual space representations from large text collections.

Knowledge Base Completion

VC-Dimension Based Generalization Bounds for Relational Learning

no code implementations 17 Apr 2018 Ondrej Kuzelka, Yuyi Wang, Steven Schockaert

In many applications of relational learning, the available data can be seen as a sample from a larger relational structure (e.g. we may be given a small fragment from some social network).

Generalization Bounds Relational Reasoning

PAC-Reasoning in Relational Domains

no code implementations 15 Mar 2018 Ondrej Kuzelka, Yuyi Wang, Jesse Davis, Steven Schockaert

We consider the problem of predicting plausible missing facts in relational data, given a set of imperfect logical rules.

Modeling Semantic Relatedness using Global Relation Vectors

no code implementations 14 Nov 2017 Shoaib Jameel, Zied Bouraoui, Steven Schockaert

Word embedding models such as GloVe rely on co-occurrence statistics from a large corpus to learn vector representations of word meaning.

Stacked Structure Learning for Lifted Relational Neural Networks

no code implementations 5 Oct 2017 Gustav Sourek, Martin Svatos, Filip Zelezny, Steven Schockaert, Ondrej Kuzelka

Lifted Relational Neural Networks (LRNNs) describe relational domains using weighted first-order rules which act as templates for constructing feed-forward neural networks.

Relational Marginal Problems: Theory and Estimation

no code implementations 18 Sep 2017 Ondrej Kuzelka, Yuyi Wang, Jesse Davis, Steven Schockaert

In the propositional setting, the marginal problem is to find a (maximum-entropy) distribution that has some given marginals.
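
A small concrete instance of the propositional marginal problem: when the given marginals are over single variables only (no higher-order constraints), the maximum-entropy joint distribution is simply the independent product of the marginals. The sketch below illustrates exactly that special case.

```python
import numpy as np

def maxent_joint(p_x, p_y):
    """Maximum-entropy joint distribution matching given single-variable
    marginals p_x and p_y: with only unary constraints, it is the
    independent product (outer product) of the marginals."""
    return np.outer(p_x, p_y)
```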

Probabilistic Relation Induction in Vector Space Embeddings

no code implementations 21 Aug 2017 Zied Bouraoui, Shoaib Jameel, Steven Schockaert

Word embeddings have been found to capture a surprisingly rich amount of syntactic and semantic knowledge.

Word Embeddings

Modeling Context Words as Regions: An Ordinal Regression Approach to Word Embedding

no code implementations CONLL 2017 Shoaib Jameel, Steven Schockaert

Although region representations of word meaning offer a natural alternative to word vectors, only few methods have been proposed that can effectively learn word regions.

Natural Language Processing Word Embeddings

Jointly Learning Word Embeddings and Latent Topics

no code implementations 21 Jun 2017 Bei Shi, Wai Lam, Shoaib Jameel, Steven Schockaert, Kwun Ping Lai

Word embedding models such as Skip-gram learn a vector-space representation for each word, based on the local word collocation patterns that are observed in a text corpus.
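
The "local word collocation patterns" Skip-gram is trained on are just (centre, context) pairs drawn from a fixed window. A minimal pair generator can be sketched as:

```python
def skipgram_pairs(tokens, window=2):
    """(centre, context) training pairs within a fixed window around each
    token: the local collocation patterns Skip-gram learns from (sketch)."""
    pairs = []
    for i, centre in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((centre, tokens[j]))
    return pairs
```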

Learning Word Embeddings Topic Models

Induction of Interpretable Possibilistic Logic Theories from Relational Data

no code implementations 19 May 2017 Ondrej Kuzelka, Jesse Davis, Steven Schockaert

Compared to Markov Logic Networks (MLNs), our method is faster and produces considerably more interpretable models.

Relational Reasoning

D-GloVe: A Feasible Least Squares Model for Estimating Word Embedding Densities

no code implementations COLING 2016 Shoaib Jameel, Steven Schockaert

We propose a new word embedding model, inspired by GloVe, which is formulated as a feasible least squares optimization problem.

Word Embeddings

Stratified Knowledge Bases as Interpretable Probabilistic Models (Extended Abstract)

no code implementations 18 Nov 2016 Ondrej Kuzelka, Jesse Davis, Steven Schockaert

In this paper, we advocate the use of stratified logical theories for representing probabilistic models.

Learning Possibilistic Logic Theories from Default Rules

no code implementations 18 Apr 2016 Ondrej Kuzelka, Jesse Davis, Steven Schockaert

We introduce a setting for learning possibilistic logic theories from defaults of the form "if alpha then typically beta".

Learning Theory

Entity Embeddings with Conceptual Subspaces as a Basis for Plausible Reasoning

no code implementations 18 Feb 2016 Shoaib Jameel, Steven Schockaert

Conceptual spaces are geometric representations of conceptual knowledge, in which entities correspond to points, natural properties correspond to convex regions, and the dimensions of the space correspond to salient features.
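
The geometric picture in this sentence can be sketched with a convex region represented as an intersection of half-spaces: an entity (a point) has a natural property exactly when the point lies inside the corresponding region. The polytope encoding below is an illustrative choice of representation.

```python
import numpy as np

def in_property_region(point, A, b):
    """Membership of an entity (a point) in a natural property modelled as a
    convex region, here the polytope {x : A x <= b} (a minimal sketch)."""
    return bool(np.all(A @ point <= b + 1e-9))
```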

Entity Embeddings

Solving stable matching problems using answer set programming

no code implementations 16 Dec 2015 Sofie De Clercq, Steven Schockaert, Martine De Cock, Ann Nowé

Since the introduction of the stable marriage problem (SMP) by Gale and Shapley (1962), several variants and extensions have been investigated.

Encoding Markov Logic Networks in Possibilistic Logic

1 code implementation 3 Jun 2015 Ondrej Kuzelka, Jesse Davis, Steven Schockaert

Markov logic uses weighted formulas to compactly encode a probability distribution over possible worlds.
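
How weighted formulas encode a distribution can be sketched directly: the unnormalised probability of a possible world is the exponential of the summed weights of the formulas it satisfies. Representing formulas as (weight, predicate-over-world) pairs is an illustrative encoding.

```python
import math

def world_weight(world, weighted_formulas):
    """Unnormalised probability of a possible world under Markov logic:
    exp of the summed weights of satisfied formulas. Formulas are
    (weight, callable) pairs over the world dict (sketch only)."""
    total = sum(w for w, f in weighted_formulas if f(world))
    return math.exp(total)
```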

Realizing RCC8 networks using convex regions

no code implementations 9 Oct 2014 Steven Schockaert, Sanjiang Li

First, we identify all ways in which the set of RCC8 base relations can be restricted to guarantee that consistent networks can be convexly realized in respectively 1D, 2D, 3D, and 4D.

Characterizing and Extending Answer Set Semantics using Possibility Theory

no code implementations 30 Nov 2013 Kim Bauters, Steven Schockaert, Martine De Cock, Dirk Vermeir

In particular, while the complexity of most reasoning tasks coincides with standard disjunctive ASP, we find that brave reasoning for programs with weak disjunctions is easier.

Modeling Stable Matching Problems with Answer Set Programming

no code implementations 28 Feb 2013 Sofie De Clercq, Steven Schockaert, Martine De Cock, Ann Nowé

Our encoding can easily be extended and adapted to the needs of specific applications.
