Search Results for author: Dirk Weissenborn

Found 22 papers, 11 papers with code

Differentiable Patch Selection for Image Recognition

no code implementations CVPR 2021 Jean-Baptiste Cordonnier, Aravindh Mahendran, Alexey Dosovitskiy, Dirk Weissenborn, Jakob Uszkoreit, Thomas Unterthiner

Neural networks require large amounts of memory and compute to process high-resolution images, even when only a small part of the image is actually informative for the task at hand.

Traffic Sign Recognition

Colorization Transformer

2 code implementations ICLR 2021 Manoj Kumar, Dirk Weissenborn, Nal Kalchbrenner

We present the Colorization Transformer, a novel approach for diverse high fidelity image colorization based on self-attention.

Colorization · Image Colorization

Object-Centric Learning with Slot Attention

8 code implementations NeurIPS 2020 Francesco Locatello, Dirk Weissenborn, Thomas Unterthiner, Aravindh Mahendran, Georg Heigold, Jakob Uszkoreit, Alexey Dosovitskiy, Thomas Kipf

Learning object-centric representations of complex scenes is a promising step towards enabling efficient abstract reasoning from low-level perceptual features.

Object · Object Discovery +1

Axial Attention in Multidimensional Transformers

2 code implementations 20 Dec 2019 Jonathan Ho, Nal Kalchbrenner, Dirk Weissenborn, Tim Salimans

We propose Axial Transformers, a self-attention-based autoregressive model for images and other data organized as high-dimensional tensors. A minimal sketch of the axial attention pattern follows this entry.

Ranked #26 on Image Generation on ImageNet 64x64 (Bits per dim metric)

Image Generation
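The key idea is to restrict each self-attention layer to a single axis of the input tensor, so a row pass followed by a column pass mixes information across the whole image at a fraction of the cost of full attention. Below is a minimal, hypothetical NumPy sketch of that pattern; it uses identity query/key/value projections and omits the multi-head machinery, positional embeddings, and causal masking of the actual Axial Transformer.

```python
# Hedged NumPy sketch of axial self-attention, NOT the paper's implementation:
# single head, identity Q/K/V projections, no autoregressive masking.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_along_axis(x, axis):
    """Self-attention restricted to one axis of a tensor shaped (..., D).

    Moving the chosen axis into the second-to-last position turns every
    other axis into a batch dimension, so positions attend only to other
    positions that share all remaining coordinates (e.g. the same row).
    """
    x = np.moveaxis(x, axis, -2)          # (..., L, D), L = size of chosen axis
    q = k = v = x                         # identity projections (sketch only)
    scores = q @ np.swapaxes(k, -1, -2) / np.sqrt(x.shape[-1])
    out = softmax(scores) @ v
    return np.moveaxis(out, -2, axis)

# For an (H, W, D) image, attending over rows then columns costs
# O(H*W*(H+W)) instead of the O((H*W)**2) of full 2-D self-attention.
img = np.random.randn(8, 8, 16)
out = attention_along_axis(attention_along_axis(img, 0), 1)
print(out.shape)  # (8, 8, 16)
```

Composing one pass per axis is what makes self-attention tractable on large multidimensional inputs, since no single attention step ever operates over the full flattened sequence.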

Scaling Autoregressive Video Models

1 code implementation ICLR 2020 Dirk Weissenborn, Oscar Täckström, Jakob Uszkoreit

Due to the statistical complexity of video, the high degree of inherent stochasticity, and the sheer amount of data, generating natural video remains a challenging task.

Action Recognition · Video Generation +1

Jack the Reader - A Machine Reading Framework

1 code implementation ACL 2018 Dirk Weissenborn, Pasquale Minervini, Isabelle Augenstein, Johannes Welbl, Tim Rocktäschel, Matko Bošnjak, Jeff Mitchell, Thomas Demeester, Tim Dettmers, Pontus Stenetorp, Sebastian Riedel

For example, in Question Answering, the supporting text can be newswire or Wikipedia articles; in Natural Language Inference, premises can be seen as the supporting text and hypotheses as questions.

Information Retrieval · Link Prediction +4

Jack the Reader - A Machine Reading Framework

2 code implementations 20 Jun 2018 Dirk Weissenborn, Pasquale Minervini, Tim Dettmers, Isabelle Augenstein, Johannes Welbl, Tim Rocktäschel, Matko Bošnjak, Jeff Mitchell, Thomas Demeester, Pontus Stenetorp, Sebastian Riedel

For example, in Question Answering, the supporting text can be newswire or Wikipedia articles; in Natural Language Inference, premises can be seen as the supporting text and hypotheses as questions.

Link Prediction · Natural Language Inference +3

Cross-lingual Candidate Search for Biomedical Concept Normalization

no code implementations 4 May 2018 Roland Roller, Madeleine Kittner, Dirk Weissenborn, Ulf Leser

Biomedical concept normalization links concept mentions in texts to a semantically equivalent concept in a biomedical knowledge base.

Translation

Neural Question Answering at BioASQ 5B

no code implementations WS 2017 Georg Wiese, Dirk Weissenborn, Mariana Neves

We focus on factoid and list questions, using an extractive QA model; that is, we restrict our system to output substrings of the provided text snippets. A sketch of this span-extraction step follows this entry.

Question Answering · Word Embeddings
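To make the extractive setup above concrete, here is a hedged sketch of span selection over a text snippet. A real system would score tokens with a trained encoder; the random logits and the names `start_logits`/`end_logits` are illustrative placeholders, not the paper's actual interface.

```python
# Hedged sketch of extractive span selection, assuming some encoder already
# produced per-token start/end scores for the snippet.
import numpy as np

def extract_answer_span(tokens, start_logits, end_logits, max_len=10):
    """Pick the substring maximizing start_logits[i] + end_logits[j]
    subject to i <= j < i + max_len, as in standard extractive QA."""
    best_i, best_j, best_score = 0, 0, -np.inf
    for i in range(len(tokens)):
        for j in range(i, min(i + max_len, len(tokens))):
            score = start_logits[i] + end_logits[j]
            if score > best_score:
                best_i, best_j, best_score = i, j, score
    return " ".join(tokens[best_i:best_j + 1])

snippet = "BRCA1 mutations increase the risk of breast cancer".split()
start = np.random.randn(len(snippet))  # stand-ins for real model outputs
end = np.random.randn(len(snippet))
print(extract_answer_span(snippet, start, end))
```

Restricting outputs to substrings of the snippet is what distinguishes extractive QA from generative QA: the model only ranks candidate spans rather than producing free-form text.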

Neural Domain Adaptation for Biomedical Question Answering

1 code implementation CONLL 2017 Georg Wiese, Dirk Weissenborn, Mariana Neves

However, these systems have not yet been applied to QA in more specific domains, such as biomedicine, because datasets are generally too small to train a DL system from scratch.

Domain Adaptation · Question Answering +2

Dynamic Integration of Background Knowledge in Neural NLU Systems

no code implementations ICLR 2018 Dirk Weissenborn, Tomáš Kočiský, Chris Dyer

Common-sense and background knowledge are required to understand natural language, but in most neural natural language understanding (NLU) systems this knowledge must be acquired from training corpora during learning, after which it remains static at test time.

Common Sense Reasoning · Natural Language Inference +3

Making Neural QA as Simple as Possible but not Simpler

3 code implementations CONLL 2017 Dirk Weissenborn, Georg Wiese, Laura Seiffe

We argue that this surprising finding puts results of previous systems and the complexity of recent QA datasets into perspective.

Question Answering · Reading Comprehension

SynsetRank: Degree-adjusted Random Walk for Relation Identification

no code implementations 2 Sep 2016 Shinichi Nakajima, Sebastian Krause, Dirk Weissenborn, Sven Schmeier, Nico Goernitz, Feiyu Xu

In relation extraction, a key process is to obtain good detectors that find relevant sentences describing the target relation.

Relation · Relation Extraction

Separating Answers from Queries for Neural Reading Comprehension

no code implementations 12 Jul 2016 Dirk Weissenborn

We present a novel neural architecture for answering queries, designed to optimally leverage explicit support in the form of query-answer memories.

Reading Comprehension · Retrieval

Neural Associative Memory for Dual-Sequence Modeling

1 code implementation WS 2016 Dirk Weissenborn

In this work we propose a new architecture for dual-sequence modeling that is based on associative memory.

Natural Language Inference

MuFuRU: The Multi-Function Recurrent Unit

no code implementations 9 Jun 2016 Dirk Weissenborn, Tim Rocktäschel

Recurrent neural networks such as the GRU and LSTM have found wide adoption in natural language processing and achieve state-of-the-art results for many tasks.

Language Modelling · Sentiment Analysis
