Search Results for author: Judit Ács

Found 11 papers, 8 papers with code

BME-TUW at SR’20: Lexical grammar induction for surface realization

no code implementations • MSR (COLING) 2020 • Gábor Recski, Ádám Kovács, Kinga Gémes, Judit Ács, András Kornai

We present a system for mapping Universal Dependency structures to raw text which learns to restore word order by training an Interpreted Regular Tree Grammar (IRTG) that establishes a mapping between string and graph operations.

TreeSwap: Data Augmentation for Machine Translation via Dependency Subtree Swapping

1 code implementation • 4 Nov 2023 • Attila Nagy, Dorina Lakatos, Botond Barta, Judit Ács

Data augmentation methods for neural machine translation are particularly useful when only a limited amount of training data is available, which is often the case for low-resource languages.

Data Augmentation Machine Translation +1

Data Augmentation for Machine Translation via Dependency Subtree Swapping

2 code implementations • 13 Jul 2023 • Attila Nagy, Dorina Petra Lakatos, Botond Barta, Patrick Nanys, Judit Ács

We present a generic framework for data augmentation via dependency subtree swapping that is applicable to machine translation.

Data Augmentation Machine Translation +1
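The core idea of subtree swapping can be illustrated with a minimal sketch: given two dependency-parsed sentences, exchange the subtrees rooted at tokens sharing the same dependency label. Everything below (the token triple layout and function names) is a hypothetical simplification assuming projective, contiguous subtrees, not the paper's actual implementation.

```python
# Hypothetical sketch of dependency subtree swapping for data augmentation.
# A token is (form, head_index, deprel); the root's head index is -1.

def subtree_indices(tokens, root):
    """Collect the indices of the subtree rooted at `root` (inclusive)."""
    result = {root}
    changed = True
    while changed:
        changed = False
        for i, (_, head, _) in enumerate(tokens):
            if head in result and i not in result:
                result.add(i)
                changed = True
    return sorted(result)

def swap_subtrees(sent_a, sent_b, deprel):
    """Swap the first subtree labeled `deprel` between two parsed
    sentences, yielding two new (augmented) word sequences."""
    def find(tokens):
        for i, (_, _, rel) in enumerate(tokens):
            if rel == deprel:
                return i
        return None

    ia, ib = find(sent_a), find(sent_b)
    if ia is None or ib is None:
        return None
    span_a = subtree_indices(sent_a, ia)
    span_b = subtree_indices(sent_b, ib)
    words_a = [t[0] for t in sent_a]
    words_b = [t[0] for t in sent_b]
    chunk_a = [words_a[i] for i in span_a]
    chunk_b = [words_b[i] for i in span_b]
    # Subtrees are assumed contiguous (projective), so a slice swap works.
    new_a = words_a[:span_a[0]] + chunk_b + words_a[span_a[-1] + 1:]
    new_b = words_b[:span_b[0]] + chunk_a + words_b[span_b[-1] + 1:]
    return new_a, new_b
```

Swapping the `nsubj` subtrees of "the dog barks" and "a cat sleeps" would produce "a cat barks" and "the dog sleeps"; in the bilingual setting the same swap is mirrored on the target side.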

HunSum-1: an Abstractive Summarization Dataset for Hungarian

1 code implementation • 1 Feb 2023 • Botond Barta, Dorina Lakatos, Attila Nagy, Milán Konor Nyist, Judit Ács

We introduce HunSum-1: a dataset for Hungarian abstractive summarization, consisting of 1.14M news articles.

Abstractive Text Summarization

Syntax-based data augmentation for Hungarian-English machine translation

2 code implementations • 18 Jan 2022 • Attila Nagy, Patrick Nanys, Balázs Frey Konrád, Bence Bial, Judit Ács

We train Transformer-based neural machine translation models for Hungarian-English and English-Hungarian using the Hunglish2 corpus.

Data Augmentation Machine Translation +1

Evaluating Transferability of BERT Models on Uralic Languages

1 code implementation • ACL (IWCLUL) 2021 • Judit Ács, Dániel Lévai, András Kornai

Transformer-based language models such as BERT have outperformed previous models on a large number of English benchmarks, but their evaluation is often limited to English or a small number of well-resourced languages.

Hyperparameter Optimization NER +1

Subword Pooling Makes a Difference

1 code implementation • 22 Feb 2021 • Judit Ács, Ákos Kádár, András Kornai

For POS tagging, both of these strategies perform poorly; the best choice is to use a small LSTM over the subwords.
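The pooling strategies being compared can be sketched in a few lines: a word split into several subwords yields several contextual vectors, and some reduction must produce one word-level vector. The function below is an illustrative assumption (the names and the fixed first/last/mean choices are not taken from the paper's code); the paper's preferred option for POS tagging, a small LSTM over the subwords, would replace these fixed reductions with a learned one.

```python
# Hypothetical illustration of common subword pooling choices.

def pool_subwords(vectors, strategy="first"):
    """Reduce a word's list of subword vectors to one word-level vector.
    `vectors` is a non-empty list of equal-length lists of floats."""
    if strategy == "first":   # keep only the first subword's vector
        return vectors[0]
    if strategy == "last":    # keep only the last subword's vector
        return vectors[-1]
    if strategy == "mean":    # elementwise average over all subwords
        n = len(vectors)
        return [sum(dims) / n for dims in zip(*vectors)]
    raise ValueError(f"unknown strategy: {strategy}")
```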


Evaluating Contextualized Language Models for Hungarian

1 code implementation • 22 Feb 2021 • Judit Ács, Dániel Lévai, Dávid Márk Nemeskey, András Kornai

We present an extended comparison of contextualized language models for Hungarian.


Automatic punctuation restoration with BERT models

1 code implementation • 18 Jan 2021 • Attila Nagy, Bence Bial, Judit Ács

We present an approach for automatic punctuation restoration with BERT models for English and Hungarian.

Punctuation Restoration
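Punctuation restoration is commonly framed as token classification: each word is tagged with the punctuation mark (if any) that should follow it, and a model such as BERT predicts these tags over unpunctuated input. The preprocessing step that derives such labels from punctuated text can be sketched as below; the label set and function names are illustrative assumptions, not the paper's code.

```python
# Hypothetical label-extraction step for punctuation restoration,
# turning punctuated training text into per-word classification targets.

PUNCT = {",": "COMMA", ".": "PERIOD", "?": "QUESTION"}

def to_labeled_tokens(text):
    """Turn punctuated text into (word, label) pairs for training."""
    pairs = []
    for raw in text.split():
        word = raw.rstrip(",.?")
        mark = raw[len(word):][:1]  # at most one trailing mark is kept
        pairs.append((word.lower(), PUNCT.get(mark, "O")))
    return pairs
```

At inference time the model runs over lowercased, unpunctuated words and the predicted tags are mapped back to punctuation marks.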
