Search Results for author: Artem Sokolov

Found 28 papers, 6 papers with code

Controlling Machine Translation for Multiple Attributes with Additive Interventions

no code implementations · EMNLP 2021 · Andrea Schioppa, David Vilar, Artem Sokolov, Katja Filippova

Fine-grained control of machine translation (MT) outputs along multiple attributes is critical for many modern MT applications and is a requirement for gaining users’ trust.

Tasks: Fine-tuning, Machine Translation, +1

Bandits Don’t Follow Rules: Balancing Multi-Facet Machine Translation with Multi-Armed Bandits

no code implementations · Findings (EMNLP) 2021 · Julia Kreutzer, David Vilar, Artem Sokolov

Training data for machine translation (MT) is often sourced from a multitude of large corpora that are multi-faceted in nature, e.g., containing content from multiple domains or different levels of quality or complexity.

Tasks: Machine Translation, Multi-Armed Bandits, +1

Don't Search for a Search Method -- Simple Heuristics Suffice for Adversarial Text Attacks

no code implementations · 16 Sep 2021 · Nathaniel Berger, Stefan Riezler, Artem Sokolov, Sebastian Ebert

Recently, more attention has been given to adversarial attacks on neural networks for natural language processing (NLP).

Tasks: Adversarial Text

Fixing exposure bias with imitation learning needs powerful oracles

no code implementations · 9 Sep 2021 · Luca Hormann, Artem Sokolov

We apply imitation learning (IL) to tackle the NMT exposure bias problem with error-correcting oracles, and evaluate an SMT lattice-based oracle which, despite its excellent performance in an unconstrained oracle translation task, turned out to be too pruned and idiosyncratic to serve as the oracle for IL.

Tasks: Imitation Learning, Translation

Real-time Streaming Wave-U-Net with Temporal Convolutions for Multichannel Speech Enhancement

no code implementations · 5 Apr 2021 · Vasiliy Kuzmin, Fyodor Kravchenko, Artem Sokolov, Jie Geng

In this paper, we describe the work that we have done to participate in Task1 of the ConferencingSpeech2021 challenge.

Tasks: Speech Enhancement

Sparse Perturbations for Improved Convergence in Stochastic Zeroth-Order Optimization

1 code implementation · 2 Jun 2020 · Mayumi Ohta, Nathaniel Berger, Artem Sokolov, Stefan Riezler

Interest in stochastic zeroth-order (SZO) methods has recently been revived in black-box optimization scenarios such as adversarial black-box attacks to deep neural networks.

Sparse Stochastic Zeroth-Order Optimization with an Application to Bandit Structured Prediction

no code implementations · 12 Jun 2018 · Artem Sokolov, Julian Hitschler, Mayumi Ohta, Stefan Riezler

Stochastic zeroth-order (SZO), or gradient-free, optimization allows optimizing arbitrary functions by relying only on function evaluations under parameter perturbations; however, the iteration complexity of SZO methods incurs a factor proportional to the dimensionality of the perturbed function.

Tasks: Structured Prediction
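The two SZO abstracts above describe optimization from function evaluations alone, with cost growing with the dimensionality of the perturbation. As a minimal illustration of the general idea (a generic two-point gradient estimator under Gaussian perturbations, not the specific method of either paper), the following sketch minimizes a function without ever computing its gradient:

```python
import numpy as np

def szo_step(f, x, mu=1e-2, lr=1e-2, rng=None):
    """One stochastic zeroth-order step: estimate the gradient of f from
    two function evaluations under a random perturbation, then descend.
    A minimal sketch of the general SZO idea, not the papers' algorithms."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)                      # perturbation direction
    g = (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u    # two-point estimator
    return x - lr * g

# Usage: minimize a toy quadratic using only function evaluations.
f = lambda x: float(np.sum(x ** 2))
x = np.ones(10)
for _ in range(2000):
    x = szo_step(f, x)
```

The variance of the estimator grows with the dimension of `u`, which is the dimensionality factor the second abstract refers to; sparsifying the perturbation is one way to attack it.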

Sockeye: A Toolkit for Neural Machine Translation

15 code implementations · 15 Dec 2017 · Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar, Artem Sokolov, Ann Clifton, Matt Post

Written in Python and built on MXNet, the toolkit offers scalable training and inference for the three most prominent encoder-decoder architectures: attentional recurrent neural networks, self-attentional transformers, and fully convolutional networks.

Tasks: Machine Translation, Translation

Counterfactual Learning from Bandit Feedback under Deterministic Logging: A Case Study in Statistical Machine Translation

no code implementations · EMNLP 2017 · Carolin Lawrence, Artem Sokolov, Stefan Riezler

The goal of counterfactual learning for statistical machine translation (SMT) is to optimize a target SMT system from logged data that consist of user feedback on translations that were predicted by another, historic SMT system.

Tasks: Machine Translation, Structured Prediction, +1

Bandit Structured Prediction for Neural Sequence-to-Sequence Learning

1 code implementation · ACL 2017 · Julia Kreutzer, Artem Sokolov, Stefan Riezler

Bandit structured prediction describes a stochastic optimization framework where learning is performed from partial feedback.

Tasks: Domain Adaptation, Machine Translation, +3

Stochastic Structured Prediction under Bandit Feedback

1 code implementation · NeurIPS 2016 · Artem Sokolov, Julia Kreutzer, Christopher Lo, Stefan Riezler

Stochastic structured prediction under bandit feedback follows a learning protocol where, on each of a sequence of iterations, the learner receives an input, predicts an output structure, and receives partial feedback in the form of a task loss evaluation of the predicted structure.

Tasks: Structured Prediction
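The protocol in the abstract above (receive an input, predict a structure, observe only a scalar task-loss for that one prediction) can be sketched with a toy problem. Everything below is hypothetical for illustration: the task (recover a hidden binary sequence from its Hamming loss), the factorized Bernoulli model, and the generic REINFORCE-style score-function update with a running-average baseline; it is not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
target = rng.integers(0, 2, size=20)       # hidden reference structure (never shown)
theta = np.zeros(20)                       # per-position logits of the learner

def task_loss(y):
    """Normalized Hamming loss; the learner only ever sees its scalar value."""
    return float(np.sum(y != target)) / len(y)

lr, baseline = 0.5, 0.5
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-theta))       # Bernoulli probabilities per position
    y = (rng.random(20) < p).astype(int)   # predict: sample an output structure
    loss = task_loss(y)                    # partial (bandit) feedback: one number
    grad_logp = y - p                      # score function of the sampled structure
    theta += lr * (baseline - loss) * grad_logp   # reinforce low-loss structures
    baseline = 0.9 * baseline + 0.1 * loss        # running-average baseline

pred = (theta > 0).astype(int)             # final greedy prediction
```

The baseline term reduces the variance of the update without changing its expectation, which is what makes learning from a bare loss evaluation workable in practice.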

Bandit Structured Prediction for Learning from Partial Feedback in Statistical Machine Translation

no code implementations · 18 Jan 2016 · Artem Sokolov, Stefan Riezler, Tanguy Urvoy

We present an application to discriminative reranking in Statistical Machine Translation (SMT) where the learning algorithm only has access to a 1-BLEU loss evaluation of a predicted translation instead of obtaining a gold standard reference translation.

Tasks: Machine Translation, Structured Prediction, +1
