The first contribution of this thesis is HypER, a convolutional model that simplifies the existing convolutional state-of-the-art model ConvE, improves upon its link prediction performance, and can be mathematically explained in terms of constrained tensor factorisation.
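The core mechanism can be sketched numerically: a hypernetwork maps a relation embedding to a set of convolutional filters, which are then convolved with the subject entity embedding and projected to score candidate objects. The dimensions, random weights, and variable names below are illustrative assumptions, not the thesis implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
d_e, d_r = 8, 4           # entity / relation embedding sizes (illustrative)
n_f, l_f = 3, 3           # number and length of 1D conv filters

e_s = rng.normal(size=d_e)             # subject entity embedding
w_r = rng.normal(size=d_r)             # relation embedding
H = rng.normal(size=(d_r, n_f * l_f))  # hypernetwork: relation -> filter weights

# relation-specific filters generated by the hypernetwork
filters = (w_r @ H).reshape(n_f, l_f)

# 1D convolution (valid mode) of each generated filter with the entity embedding
conv_len = d_e - l_f + 1
feats = np.stack([np.correlate(e_s, f, mode="valid") for f in filters])

# project the feature maps back to entity space and score a candidate object
W = rng.normal(size=(n_f * conv_len, d_e))
e_o = rng.normal(size=d_e)
score = float((feats.reshape(-1) @ W) @ e_o)  # logit for the triple (s, r, o)
```

Because the filters are a linear function of the relation embedding, the whole scoring function can be rewritten as a multilinear product, which is what links the model to constrained tensor factorisation.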
Prompting language models (LMs) with training examples and task descriptions has been seen as critical to recent successes in few-shot learning.
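The standard few-shot prompt format this refers to is a concatenation of a task description, a handful of labelled examples, and the query. A minimal sketch (the function name and field labels are illustrative assumptions):

```python
def build_prompt(task_description, examples, query):
    """Assemble a few-shot prompt from a description, (input, output) pairs,
    and an unlabelled query for the model to complete."""
    lines = [task_description]
    for x, y in examples:
        lines.append(f"Input: {x}\nOutput: {y}")
    # the query ends with an empty Output field for the LM to fill in
    lines.append(f"Input: {query}\nOutput:")
    return "\n\n".join(lines)
```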
In this work, we propose a probabilistically principled general approach to SSL that considers the distribution over label predictions, for labels of different complexity, from "one-hot" vectors to binary vectors and images.
Much progress has been made in semi-supervised learning (SSL) by combining methods that exploit different aspects of the data distribution, e.g. consistency regularisation relies on properties of $p(x)$, whereas entropy minimisation pertains to the label distribution $p(y|x)$.
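Entropy minimisation, for instance, penalises uncertain predictions so that $p(y|x)$ concentrates on a single label. A minimal sketch of such a loss over softmax outputs (a generic illustration, not the method proposed here):

```python
import numpy as np

def entropy_min_loss(logits):
    """Mean Shannon entropy of softmax predictions; minimising this term
    pushes p(y|x) towards 'one-hot' confidence on unlabelled data."""
    z = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    return float(-(p * np.log(p + 1e-12)).sum(axis=1).mean())
```

Sharper predictions give a lower loss, e.g. `entropy_min_loss` on confident logits like `[[10., 0.]]` is smaller than on near-uniform logits like `[[0.1, 0.]]`.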
Many models learn representations of knowledge graph data by exploiting its low-rank latent structure, encoding known relations between entities and enabling unknown facts to be inferred.
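A simple instance of such low-rank structure is a DistMult-style trilinear score, where a triple $(s, r, o)$ is scored by an elementwise product of embeddings. The sketch below uses random toy embeddings purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_ent, n_rel, d = 5, 2, 4
E = rng.normal(size=(n_ent, d))  # entity embeddings (rows)
R = rng.normal(size=(n_rel, d))  # relation embeddings (rows)

def score(s, r, o):
    """Trilinear score <e_s, w_r, e_o>: high values suggest the fact holds,
    so unknown triples can be ranked and inferred."""
    return float(np.sum(E[s] * R[r] * E[o]))
```

Note that this particular factorisation is symmetric in subject and object, one of the limitations that motivates richer models.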
Hyperbolic embeddings have recently gained attention in machine learning due to their ability to represent hierarchical data more accurately and succinctly than their Euclidean analogues.
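Concretely, in the Poincaré ball model of hyperbolic space the geodesic distance grows rapidly near the boundary, which is what lets trees embed with low distortion. A standard distance formula (a generic sketch, not tied to any particular embedding method here):

```python
import numpy as np

def poincare_dist(u, v):
    """Geodesic distance between points u, v inside the unit Poincare ball."""
    uu, vv = np.dot(u, u), np.dot(v, v)
    duv = np.dot(u - v, u - v)
    return float(np.arccosh(1 + 2 * duv / ((1 - uu) * (1 - vv))))
```

Points near the origin behave almost Euclideanly, while equally spaced points near the boundary are exponentially far apart, mirroring the exponential growth of nodes in a hierarchy.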
Knowledge graphs are graphical representations of large databases of facts, which typically suffer from incompleteness.
We show that different interactions between PMI vectors reflect semantic word relationships, such as similarity and paraphrasing, and that these are encoded in low dimensional word embeddings under a suitable projection, theoretically explaining why word2vec (W2V) and GloVe embeddings work.
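The objects involved can be sketched concretely: build a PMI matrix from co-occurrence counts, then project it to low dimension. The truncated SVD below is an assumption-laden stand-in for the "suitable projection", and the toy counts are invented for illustration:

```python
import numpy as np

# toy symmetric word-word co-occurrence counts for a 3-word vocabulary
C = np.array([[10., 2., 0.],
              [2., 8., 3.],
              [0., 3., 6.]])
P = C / C.sum()                       # joint probabilities p(w, c)
p_w = P.sum(axis=1)                   # marginals p(w)
with np.errstate(divide="ignore"):
    pmi = np.log(P / np.outer(p_w, p_w))
pmi[np.isneginf(pmi)] = 0.0           # zero out -inf from zero counts (PPMI-like)

# low-dimensional embeddings from a rank-k projection of the PMI matrix
U, S, Vt = np.linalg.svd(pmi)
k = 2
emb = U[:, :k] * np.sqrt(S[:k])       # rows are word embeddings
```

Under this view, geometric relationships between embedding vectors inherit their meaning from the corresponding relationships between rows of the PMI matrix.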