Search Results for author: Julian Martin Eisenschlos

Found 10 papers, 8 with code

MATE: Multi-view Attention for Table Transformer Efficiency

1 code implementation · EMNLP 2021 · Julian Martin Eisenschlos, Maharshi Gor, Thomas Müller, William W. Cohen

However, more than 20% of relational tables on the web have 20 or more rows (Cafarella et al., 2008), and these large tables present a challenge for current Transformer models, which are typically limited to 512 tokens.

Question Answering

Time-Aware Language Models as Temporal Knowledge Bases

no code implementations · 29 Jun 2021 · Bhuwan Dhingra, Jeremy R. Cole, Julian Martin Eisenschlos, Daniel Gillick, Jacob Eisenstein, William W. Cohen

We introduce a diagnostic dataset aimed at probing LMs for factual knowledge that changes over time and highlight problems with LMs at either end of the spectrum -- those trained on specific slices of temporal data, as well as those trained on a wide range of temporal data.

DoT: An efficient Double Transformer for NLP tasks with tables

1 code implementation · Findings (ACL) 2021 · Syrine Krichene, Thomas Müller, Julian Martin Eisenschlos

To improve efficiency while maintaining a high accuracy, we propose a new architecture, DoT, a double transformer model, that decomposes the problem into two sub-tasks: A shallow pruning transformer that selects the top-K tokens, followed by a deep task-specific transformer that takes as input those K tokens.

Question Answering
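
A minimal sketch of the two-stage DoT idea described in the snippet above: a shallow transformer scores every token, the K highest-scoring tokens are kept, and only those are fed to a deep task-specific transformer. Layer counts, dimensions, and the scoring head here are illustrative assumptions, not the authors' released implementation.

```python
# Hedged sketch of a "double transformer": shallow pruning encoder + deep task encoder.
import torch
import torch.nn as nn

class DoubleTransformerSketch(nn.Module):
    def __init__(self, vocab_size=30522, dim=256, k=256):
        super().__init__()
        self.k = k
        self.embed = nn.Embedding(vocab_size, dim)
        # Shallow "pruning" encoder: scores every input token.
        self.pruner = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True), num_layers=2)
        self.score = nn.Linear(dim, 1)
        # Deep task-specific encoder: only sees the K highest-scoring tokens.
        self.task = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, nhead=8, batch_first=True), num_layers=12)
        self.head = nn.Linear(dim, 2)  # e.g. a binary classification head

    def forward(self, token_ids):
        x = self.embed(token_ids)                        # (batch, seq, dim)
        scores = self.score(self.pruner(x)).squeeze(-1)  # (batch, seq)
        top = scores.topk(self.k, dim=-1).indices        # indices of the K kept tokens
        kept = torch.gather(x, 1, top.unsqueeze(-1).expand(-1, -1, x.size(-1)))
        return self.head(self.task(kept).mean(dim=1))    # pooled prediction
```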

Fool Me Twice: Entailment from Wikipedia Gamification

1 code implementation · NAACL 2021 · Julian Martin Eisenschlos, Bhuwan Dhingra, Jannis Bulian, Benjamin Börschinger, Jordan Boyd-Graber

We release FoolMeTwice (FM2 for short), a large dataset of challenging entailment pairs collected through a fun multi-player game.

Open Domain Question Answering over Tables via Dense Retrieval

1 code implementation · NAACL 2021 · Jonathan Herzig, Thomas Müller, Syrine Krichene, Julian Martin Eisenschlos

Recent advances in open-domain QA have led to strong models based on dense retrieval, but only focused on retrieving textual passages.

Open-Domain Question Answering

Understanding tables with intermediate pre-training

1 code implementation · Findings of the Association for Computational Linguistics 2020 · Julian Martin Eisenschlos, Syrine Krichene, Thomas Müller

To be able to use long examples as input to BERT models, we evaluate table pruning techniques as a pre-processing step to drastically improve training and prediction efficiency at the cost of a moderate drop in accuracy.

Data Augmentation · Natural Language Inference · +1
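
One hedged sketch of the kind of table-pruning pre-processing step the snippet above refers to: truncate long cells, then drop trailing rows until the flattened table fits a BERT-style token budget. The specific heuristic, budget, and whitespace tokenization are illustrative assumptions rather than the paper's exact method.

```python
# Illustrative table pruning: cell truncation plus row dropping to meet a token budget.
def prune_table(header, rows, max_tokens=512, max_cell_tokens=10):
    def cell_tokens(cell):
        return cell.split()[:max_cell_tokens]        # crude per-cell truncation

    def table_length(rs):
        return sum(len(cell_tokens(c)) for row in [header] + rs for c in row)

    pruned = [list(row) for row in rows]
    while pruned and table_length(pruned) > max_tokens:
        pruned.pop()                                  # drop the last row
    truncated = [[" ".join(cell_tokens(c)) for c in row] for row in pruned]
    return [list(header)] + truncated

# Example: a table is cut down until it fits a small token budget.
table = prune_table(["Country", "Population"],
                    [["Argentina", "45 million"], ["Germany", "83 million"]],
                    max_tokens=12)
```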

SoftSort: A Continuous Relaxation for the argsort Operator

1 code implementation · 29 Jun 2020 · Sebastian Prillo, Julian Martin Eisenschlos

While sorting is an important procedure in computer science, the argsort operator - which takes as input a vector and returns its sorting permutation - has a discrete image and thus zero gradients almost everywhere.
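
A short sketch of a SoftSort-style continuous relaxation of the argsort operator described above: the hard permutation matrix is replaced by a row-wise softmax over negative pairwise distances between the sorted values and the input. The L1 distance and temperature default are my reading of the paper's formulation; treat the details as an approximation of the released code.

```python
# Hedged sketch: differentiable relaxation of argsort via row-wise softmax.
import torch

def soft_sort(s, tau=1.0):
    """s: (batch, n) scores -> (batch, n, n) row-stochastic 'permutation' matrix."""
    sorted_s, _ = torch.sort(s, dim=-1, descending=True)
    pairwise = (sorted_s.unsqueeze(-1) - s.unsqueeze(-2)).abs()  # |sort(s)_i - s_j|
    return torch.softmax(-pairwise / tau, dim=-1)

# As tau -> 0 this approaches the exact permutation matrix of argsort,
# while remaining differentiable (non-zero gradients) for tau > 0.
scores = torch.tensor([[2.0, -1.0, 3.0]])
P = soft_sort(scores, tau=0.1)
approx_sorted = P @ scores.unsqueeze(-1)   # ≈ scores sorted in descending order
```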
