Search Results for author: Matko Bošnjak

Found 8 papers, 6 papers with code

Improving fine-grained understanding in image-text pre-training

no code implementations • 18 Jan 2024 • Ioana Bica, Anastasija Ilić, Matthias Bauer, Goker Erdogan, Matko Bošnjak, Christos Kaplanis, Alexey A. Gritsenko, Matthias Minderer, Charles Blundell, Razvan Pascanu, Jovana Mitrović

We introduce SPARse Fine-grained Contrastive Alignment (SPARC), a simple method for pretraining more fine-grained multimodal representations from image-text pairs.

Object Detection
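
To make the entry above concrete, here is a minimal sketch of sparse token-patch contrastive alignment in PyTorch. The function name, the min-max normalisation, the 1/P sparsity threshold, and the per-sequence contrastive loss below are illustrative assumptions for this sketch, not the released SPARC implementation.

```python
import torch
import torch.nn.functional as F

def sparc_fine_grained_loss(patch_emb, token_emb, tau=0.1):
    """Sketch of sparse token-patch alignment (illustrative, not the
    official SPARC code). patch_emb: (B, P, D) image patch embeddings,
    token_emb: (B, T, D) text token embeddings."""
    B, P, _ = patch_emb.shape
    # Similarity of every text token to every image patch.
    sim = torch.einsum('btd,bpd->btp', token_emb, patch_emb)      # (B, T, P)
    # Min-max normalise per token, then zero out weak alignments.
    mn = sim.min(dim=-1, keepdim=True).values
    mx = sim.max(dim=-1, keepdim=True).values
    w = (sim - mn) / (mx - mn + 1e-8)
    w = torch.where(w >= 1.0 / P, w, torch.zeros_like(w))          # assumed threshold
    w = w / (w.sum(dim=-1, keepdim=True) + 1e-8)
    # Group patches per token: a weighted patch average aligned to each token.
    grouped = torch.einsum('btp,bpd->btd', w, patch_emb)           # (B, T, D)
    # Fine-grained contrastive loss: each token vs. its grouped patches,
    # with the other tokens of the same sequence as negatives.
    t = F.normalize(token_emb, dim=-1)
    g = F.normalize(grouped, dim=-1)
    logits = torch.einsum('btd,bsd->bts', t, g) / tau              # (B, T, T)
    labels = torch.arange(t.shape[1], device=t.device).expand(t.shape[0], -1)
    return F.cross_entropy(logits.flatten(0, 1), labels.flatten())
```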

SemPPL: Predicting pseudo-labels for better contrastive representations

2 code implementations • 12 Jan 2023 • Matko Bošnjak, Pierre H. Richemond, Nenad Tomasev, Florian Strub, Jacob C. Walker, Felix Hill, Lars Holger Buesing, Razvan Pascanu, Jovana Mitrović

We propose a new semi-supervised learning method, Semantic Positives via Pseudo-Labels (SemPPL), that combines labelled and unlabelled data to learn informative representations.

Contrastive Learning • Pseudo Label
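
A minimal sketch of the pseudo-labelling step described above, assuming PyTorch; the k-NN vote and the function name are assumptions of this sketch, not the official SemPPL code.

```python
import torch
import torch.nn.functional as F

def knn_pseudo_labels(unlab_emb, lab_emb, lab_y, k=5):
    """Sketch (illustrative, not the official SemPPL code): pseudo-label
    each unlabelled embedding by a k-NN majority vote over the labelled
    embeddings."""
    u = F.normalize(unlab_emb, dim=-1)             # (N, D) unlabelled
    v = F.normalize(lab_emb, dim=-1)               # (M, D) labelled
    nn_idx = (u @ v.t()).topk(k, dim=-1).indices   # k nearest labelled points
    return lab_y[nn_idx].mode(dim=-1).values       # (N,) majority-vote labels

# Usage sketch: examples sharing a pseudo-label become extra "semantic
# positives" for an anchor in the contrastive objective, alongside the
# usual augmentation-based positives.
```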

Differentiable Reasoning on Large Knowledge Bases and Natural Language

3 code implementations • 17 Dec 2019 • Pasquale Minervini, Matko Bošnjak, Tim Rocktäschel, Sebastian Riedel, Edward Grefenstette

Reasoning with knowledge expressed in natural language and Knowledge Bases (KBs) is a major challenge for Artificial Intelligence, with applications in machine reading, dialogue, and question answering.

Link Prediction • Question Answering • +1

Jack the Reader - A Machine Reading Framework

2 code implementations • 20 Jun 2018 • Dirk Weissenborn, Pasquale Minervini, Tim Dettmers, Isabelle Augenstein, Johannes Welbl, Tim Rocktäschel, Matko Bošnjak, Jeff Mitchell, Thomas Demeester, Pontus Stenetorp, Sebastian Riedel

For example, in Question Answering, the supporting text can be newswire or Wikipedia articles; in Natural Language Inference, premises can be seen as the supporting text and hypotheses as questions.

Link Prediction • Natural Language Inference • +3
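
The snippet above describes casting different tasks into one supporting-text/question format. Here is a minimal sketch of such a unified instance type; the class and field names are assumptions for illustration, not Jack's actual API.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ReadingInstance:
    """Unified machine-reading input (field names are assumptions, not
    Jack's actual API): every task supplies supporting text and a question."""
    support: List[str]                       # e.g. Wikipedia articles, or an NLI premise
    question: str                            # e.g. a question, or an NLI hypothesis
    candidates: Optional[List[str]] = None   # answer options, where the task has them
    answer: Optional[str] = None

# Natural Language Inference cast into the same format:
nli = ReadingInstance(
    support=["A man is playing a guitar on stage."],
    question="A person is performing music.",
    candidates=["entailment", "neutral", "contradiction"],
    answer="entailment",
)
```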

emoji2vec: Learning Emoji Representations from their Description

7 code implementations • WS 2016 • Ben Eisner, Tim Rocktäschel, Isabelle Augenstein, Matko Bošnjak, Sebastian Riedel

Many current natural language processing applications for social media rely on representation learning and utilize pre-trained word embeddings.

Representation Learning • Sentiment Analysis • +1
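
A minimal sketch of the idea named in the title, learning emoji vectors on top of pre-trained word embeddings; the dot-product scoring, the summed description vectors, and the logistic loss are assumptions of this sketch, not the paper's exact setup.

```python
import torch
import torch.nn as nn

class Emoji2Vec(nn.Module):
    """Sketch of learning emoji vectors from descriptions (illustrative,
    not the paper's exact setup): score an (emoji, description) pair by the
    dot product between the emoji vector and the summed pre-trained word
    vectors of the description."""

    def __init__(self, n_emoji, dim=300):
        super().__init__()
        self.emoji = nn.Embedding(n_emoji, dim)

    def forward(self, emoji_ids, desc_word_vecs):
        # desc_word_vecs: (B, L, dim) frozen pre-trained word embeddings
        desc = desc_word_vecs.sum(dim=1)                    # (B, dim)
        return (self.emoji(emoji_ids) * desc).sum(dim=-1)   # (B,) logits

# Trained with logistic loss: label 1 for an emoji's real description,
# label 0 for a randomly sampled false one.
loss_fn = nn.BCEWithLogitsLoss()
```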

Programming with a Differentiable Forth Interpreter

1 code implementation • ICML 2017 • Matko Bošnjak, Tim Rocktäschel, Jason Naradowsky, Sebastian Riedel

Given that in practice training data is scarce for all but a small set of problems, a core question is how to incorporate prior knowledge into a model.
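
As a toy analogue of incorporating prior knowledge as a differentiable program sketch: fix the control flow, leave a "hole" that softly mixes candidate operations, and learn the hole end-to-end from input/output examples. This is an illustrative simplification, far smaller than the paper's Forth machine; the class name and the three candidate operations are assumptions of this sketch.

```python
import torch
import torch.nn as nn

class SoftSlot(nn.Module):
    """Toy analogue of a program sketch with a learnable hole (illustrative,
    not the paper's Forth interpreter): control flow is fixed, the hole
    mixes candidate operations softly, and every step is differentiable."""

    def __init__(self):
        super().__init__()
        self.ops = [lambda a, b: a + b, lambda a, b: a - b, lambda a, b: a * b]
        self.logits = nn.Parameter(torch.zeros(len(self.ops)))  # the hole

    def forward(self, a, b):
        w = torch.softmax(self.logits, dim=0)
        return sum(wi * op(a, b) for wi, op in zip(w, self.ops))

# Learn the hole (here: that it should behave like addition) purely from
# input/output examples, by gradient descent through the interpreter.
slot = SoftSlot()
opt = torch.optim.Adam(slot.parameters(), lr=0.1)
a, b = torch.randn(64), torch.randn(64)
for _ in range(200):
    loss = ((slot(a, b) - (a + b)) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```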
