no code implementations • EMNLP (sdp) 2020 • Thomas van Dongen, Gideon Maillette de Buy Wenniger, Lambert Schomaker
We also show the merit of using more training data and longer input for number of citations prediction.
no code implementations • 15 Aug 2023 • Gideon Maillette de Buy Wenniger, Thomas van Dongen, Lambert Schomaker
Using BERT$_{\textrm{BASE}}$ embeddings, on the (log) number of citations prediction task with the ACL-BiblioMetry dataset, our MultiSChuBERT (text+visual) model obtains an $R^{2}$ score of 0.454 compared to 0.432 for the SChuBERT (text only) model.
no code implementations • 10 Sep 2021 • Pieter Floris Jacobs, Gideon Maillette de Buy Wenniger, Marco Wiering, Lambert Schomaker
Furthermore, we explore the influence of the query-pool size on the performance of AL.
no code implementations • 21 Dec 2020 • Thomas van Dongen, Gideon Maillette de Buy Wenniger, Lambert Schomaker
We also show the merit of using more training data and longer input for number of citations prediction.
no code implementations • EMNLP (sdp) 2020 • Gideon Maillette de Buy Wenniger, Thomas van Dongen, Eleri Aedmaa, Herbert Teun Kruitbosch, Edwin A. Valentijn, Lambert Schomaker
To tackle these problems, we propose the use of HANs combined with structure-tags which mark the role of sentences in the document.
no code implementations • 9 Sep 2019 • Alberto Poncelas, Maja Popović, Dimitar Shterionov, Gideon Maillette de Buy Wenniger, Andy Way
Neural Machine Translation (NMT) models achieve their best performance when large sets of parallel data are used for training.
no code implementations • RANLP 2019 • Alberto Poncelas, Maja Popović, Dimitar Shterionov, Gideon Maillette de Buy Wenniger, Andy Way
Neural Machine Translation (NMT) models achieve their best performance when large sets of parallel data are used for training.
no code implementations • WS 2019 • Alberto Poncelas, Gideon Maillette de Buy Wenniger, Andy Way
Machine Translation models are trained to translate a variety of documents from one language into another.
no code implementations • 18 Jun 2019 • Alberto Poncelas, Gideon Maillette de Buy Wenniger, Andy Way
These methods ensure that selected sentences share n-grams with the test set so the NMT model can be adapted to translate it.
1 code implementation • 28 Feb 2019 • Gideon Maillette de Buy Wenniger, Lambert Schomaker, Andy Way
Neural handwriting recognition (NHR) is the recognition of handwritten text with deep learning models, such as multi-dimensional long short-term memory (MDLSTM) recurrent neural networks.
Ranked #15 on Handwritten Text Recognition on IAM
no code implementations • IWSLT (EMNLP) 2018 • Alberto Poncelas, Gideon Maillette de Buy Wenniger, Andy Way
A limitation of these methods to date is that using the source-side test set does not by itself guarantee that sentences are selected with correct translations, or translations that are suitable given the test-set domain.
no code implementations • 17 Apr 2018 • Alberto Poncelas, Dimitar Shterionov, Andy Way, Gideon Maillette de Buy Wenniger, Peyman Passban
A prerequisite for training corpus-based machine translation (MT) systems -- either Statistical MT (SMT) or Neural MT (NMT) -- is the availability of high-quality parallel data.