Search Results for author: Tomasz Dwojak

Found 12 papers, 5 papers with code

STable: Table Generation Framework for Encoder-Decoder Models

no code implementations • 8 Jun 2022 Michał Pietruszka, Michał Turski, Łukasz Borchmann, Tomasz Dwojak, Gabriela Pałka, Karolina Szyndler, Dawid Jurkiewicz, Łukasz Garncarek

The output structure of database-like tables, consisting of values structured in horizontal rows and vertical columns identifiable by name, can cover a wide range of NLP tasks.

Joint Entity and Relation Extraction • Knowledge Base Population

Going Full-TILT Boogie on Document Understanding with Text-Image-Layout Transformer

1 code implementation • 18 Feb 2021 Rafał Powalski, Łukasz Borchmann, Dawid Jurkiewicz, Tomasz Dwojak, Michał Pietruszka, Gabriela Pałka

We address the challenging problem of Natural Language Comprehension beyond plain-text documents by introducing the TILT neural network architecture which simultaneously learns layout information, visual features, and textual semantics.

Ranked #7 on Visual Question Answering (VQA) on InfographicVQA (using extra training data)

Document Image Classification • Document Understanding +1

From Dataset Recycling to Multi-Property Extraction and Beyond

1 code implementation • CoNLL 2020 Tomasz Dwojak, Michał Pietruszka, Łukasz Borchmann, Jakub Chłędowski, Filip Graliński

This paper investigates various Transformer architectures on the WikiReading Information Extraction and Machine Reading Comprehension dataset.

Machine Reading Comprehension

On the Multi-Property Extraction and Beyond

no code implementations • 15 Jun 2020 Tomasz Dwojak, Michał Pietruszka, Łukasz Borchmann, Filip Graliński, Jakub Chłędowski

In this paper, we investigate the Dual-source Transformer architecture on the WikiReading information extraction and machine reading comprehension dataset.

Machine Reading Comprehension

Fast Neural Machine Translation Implementation

no code implementations • WS 2018 Hieu Hoang, Tomasz Dwojak, Rihards Krislauks, Daniel Torregrosa, Kenneth Heafield

This paper describes the submissions to the efficiency track for GPUs at the Workshop for Neural Machine Translation and Generation by members of the University of Edinburgh, Adam Mickiewicz University, Tilde and University of Alicante.

Machine Translation • Translation

Marian: Fast Neural Machine Translation in C++

2 code implementations • ACL 2018 Marcin Junczys-Dowmunt, Roman Grundkiewicz, Tomasz Dwojak, Hieu Hoang, Kenneth Heafield, Tom Neckermann, Frank Seide, Ulrich Germann, Alham Fikri Aji, Nikolay Bogoychev, André F. T. Martins, Alexandra Birch

We present Marian, an efficient and self-contained Neural Machine Translation framework with an integrated automatic differentiation engine based on dynamic computation graphs.

Machine Translation • Translation

Predicting Target Language CCG Supertags Improves Neural Machine Translation

no code implementations • WS 2017 Maria Nadejde, Siva Reddy, Rico Sennrich, Tomasz Dwojak, Marcin Junczys-Dowmunt, Philipp Koehn, Alexandra Birch

Our results on WMT data show that explicitly modeling target syntax improves machine translation quality for German->English, a high-resource pair, and for Romanian->English, a low-resource pair, as well as for several syntactic phenomena, including prepositional phrase attachment.

Machine Translation • NMT +2

Is Neural Machine Translation Ready for Deployment? A Case Study on 30 Translation Directions

2 code implementations • IWSLT 2016 Marcin Junczys-Dowmunt, Tomasz Dwojak, Hieu Hoang

In this paper, we provide the largest published comparison of translation quality between phrase-based SMT and neural machine translation across 30 translation directions.

Machine Translation • Sentence +1

The AMU-UEDIN Submission to the WMT16 News Translation Task: Attention-based NMT Models as Feature Functions in Phrase-based SMT

1 code implementation • WS 2016 Marcin Junczys-Dowmunt, Tomasz Dwojak, Rico Sennrich

For the Russian-English task, our submission achieves the top BLEU result, outperforming the best pure neural system by 1.1 BLEU points and our own phrase-based baseline by 1.6 BLEU.

Machine Translation • NMT +1
