Search Results for author: Oleksii Hrinchuk

Found 18 papers, 7 papers with code

NVIDIA NeMo Offline Speech Translation Systems for IWSLT 2022

no code implementations · IWSLT (ACL) 2022 · Oleksii Hrinchuk, Vahid Noroozi, Ashwinkumar Ganesan, Sarah Campbell, Sandeep Subramanian, Somshubra Majumdar, Oleksii Kuchaiev

Our cascade system consists of 1) a Conformer RNN-T automatic speech recognition model, 2) a punctuation-capitalization model based on a pre-trained T5 encoder, and 3) an ensemble of Transformer neural machine translation models fine-tuned on TED talks.

Automatic Speech Recognition (ASR) +4
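
The cascade above maps onto public NVIDIA NeMo components. Below is a minimal sketch using released checkpoints as stand-ins, not the paper's exact models: a Conformer-CTC checkpoint substitutes for the Conformer RNN-T to keep output handling simple, a BERT-based punctuation model stands in for the T5-based one, and a single NMT model replaces the ensemble.

    # Hedged sketch of the three-stage cascade with public NeMo checkpoints
    # (stand-ins for the paper's models, see the note above).
    import nemo.collections.asr as nemo_asr
    import nemo.collections.nlp as nemo_nlp

    asr = nemo_asr.models.ASRModel.from_pretrained("stt_en_conformer_ctc_large")
    punct = nemo_nlp.models.PunctuationCapitalizationModel.from_pretrained("punctuation_en_bert")
    nmt = nemo_nlp.models.MTEncDecModel.from_pretrained("nmt_en_de_transformer12x2")

    raw = asr.transcribe(["talk.wav"])                    # 1) speech -> unpunctuated text
    restored = punct.add_punctuation_capitalization(raw)  # 2) restore punctuation and casing
    print(nmt.translate(restored)[0])                     # 3) translate to German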

Leveraging Synthetic Targets for Machine Translation

no code implementations · 7 May 2023 · Sarthak Mittal, Oleksii Hrinchuk, Oleksii Kuchaiev

In this work, we provide a recipe for training machine translation models in a limited resource setting by leveraging synthetic target data generated using a large pre-trained model.

Machine Translation Translation
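
The recipe is essentially forward translation, or sequence-level distillation: a large pre-trained model labels monolingual source text, and the resulting synthetic pairs become training data for a smaller model. A minimal sketch with Hugging Face transformers, where the teacher model name and file paths are placeholders rather than the paper's setup:

    from transformers import pipeline

    # Hypothetical teacher model; the paper's choice may differ.
    teacher = pipeline("translation_en_to_de", model="t5-large")

    # Label monolingual source text to produce synthetic target-side data.
    with open("mono.en") as src, open("synthetic.de", "w") as tgt:
        sentences = [line.strip() for line in src]
        for out in teacher(sentences, batch_size=32, max_length=256):
            tgt.write(out["translation_text"] + "\n")

The student NMT model is then trained on the (mono.en, synthetic.de) pairs, optionally mixed with the original parallel data.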

NVIDIA NeMo Neural Machine Translation Systems for English-German and English-Russian News and Biomedical Tasks at WMT21

no code implementations · 16 Nov 2021 · Sandeep Subramanian, Oleksii Hrinchuk, Virginia Adams, Oleksii Kuchaiev

This paper provides an overview of NVIDIA NeMo's neural machine translation systems for the constrained data track of the WMT21 News and Biomedical Shared Translation Tasks.

Data Augmentation Knowledge Distillation +3

Tensorized Embedding Layers

no code implementations · Findings of the Association for Computational Linguistics 2020 · Oleksii Hrinchuk, Valentin Khrulkov, Leyla Mirvakhabova, Elena Orlova, Ivan Oseledets

The embedding layers that transform input words into real-valued vectors are key components of the deep neural networks used in natural language processing.

Catalyst.RL: A Distributed Framework for Reproducible RL Research

1 code implementation · 28 Feb 2019 · Sergey Kolesnikov, Oleksii Hrinchuk

Despite recent progress in the field of deep reinforcement learning (RL), and arguably because of it, a large body of work remains to be done in reproducing and carefully comparing different RL algorithms.

Continuous Control

Tensorized Embedding Layers for Efficient Model Compression

1 code implementation · 30 Jan 2019 · Oleksii Hrinchuk, Valentin Khrulkov, Leyla Mirvakhabova, Elena Orlova, Ivan Oseledets

The embedding layers that transform input words into real-valued vectors are key components of the deep neural networks used in natural language processing.

Language Modelling Machine Translation +2
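
Both "Tensorized Embedding Layers" entries describe the same idea: replace the dense V x D embedding matrix with a tensor-train (TT) factorization so the full matrix is never materialized. A minimal PyTorch sketch of a TT embedding lookup, assuming a mixed-radix factorization of the vocabulary; all shapes and ranks below are illustrative:

    import torch
    import torch.nn as nn

    class TTEmbedding(nn.Module):
        # Vocab size = prod(voc_shape), embedding dim = prod(emb_shape);
        # all parameters live in small TT cores, never in a dense V x D matrix.
        def __init__(self, voc_shape, emb_shape, ranks):
            super().__init__()
            assert len(voc_shape) == len(emb_shape) == len(ranks) - 1
            assert ranks[0] == ranks[-1] == 1
            self.voc_shape = list(voc_shape)
            self.cores = nn.ParameterList([
                nn.Parameter(0.1 * torch.randn(ranks[k], voc_shape[k],
                                               emb_shape[k], ranks[k + 1]))
                for k in range(len(voc_shape))
            ])

        def forward(self, ids):  # ids: LongTensor of token indices, shape (batch,)
            out, rem = None, ids
            for k, core in enumerate(self.cores):
                radix = 1
                for v in self.voc_shape[k + 1:]:
                    radix *= v
                digit, rem = rem // radix, rem % radix    # k-th mixed-radix digit
                slc = core[:, digit].permute(1, 0, 2, 3)  # (batch, r_k, d_k, r_{k+1})
                if out is None:
                    out = slc.squeeze(1)                  # leading rank is 1
                else:
                    # contract the running product with the next core
                    out = torch.einsum('bdr,brek->bdek', out, slc)
                    out = out.reshape(out.shape[0], -1, out.shape[-1])
            return out.squeeze(-1)  # (batch, prod(emb_shape))

    # Illustrative sizes: 32768-word vocab (32*32*32), 512-dim embeddings (8*8*8).
    emb = TTEmbedding([32, 32, 32], [8, 8, 8], ranks=[1, 16, 16, 1])
    vectors = emb(torch.tensor([0, 5, 31999]))  # -> torch.Size([3, 512])

With these sizes the three cores hold roughly 74K parameters in place of a dense 32768 x 512 matrix with roughly 16.8M.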

Riemannian Optimization for Skip-Gram Negative Sampling

1 code implementation · ACL 2017 · Alexander Fonarev, Oleksii Hrinchuk, Gleb Gusev, Pavel Serdyukov, Ivan Oseledets

The Skip-Gram Negative Sampling (SGNS) word embedding model, well known through its implementation in the "word2vec" software, is usually optimized with stochastic gradient descent.

Riemannian optimization
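
In the matrix formulation of Levy and Goldberg (2014), SGNS maximizes an objective over the word-context score matrix X, and the paper instead optimizes it directly on the manifold of low-rank matrices. A minimal NumPy sketch of one Riemannian-style step, using a truncated-SVD retraction in place of the paper's projector-splitting integrator, with toy co-occurrence counts:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sgns_grad(X, D, B):
        # Gradient of F(X) = sum_{w,c} D[w,c] * log sigmoid(X[w,c])
        #                             + B[w,c] * log sigmoid(-X[w,c]),
        # where D holds co-occurrence counts and B negative-sampling weights.
        return D * sigmoid(-X) - B * sigmoid(X)

    def riemannian_step(X, D, B, lr, rank):
        # Gradient ascent step followed by retraction onto the rank-r manifold
        # (truncated SVD here; the paper uses a projector-splitting integrator).
        U, s, Vt = np.linalg.svd(X + lr * sgns_grad(X, D, B), full_matrices=False)
        return (U[:, :rank] * s[:rank]) @ Vt[:rank]

    # Toy run: 50 words x 40 contexts, k = 5 negatives, rank-8 factorization.
    rng = np.random.default_rng(0)
    D = rng.poisson(1.0, size=(50, 40)).astype(float)  # fake co-occurrence counts
    B = 5 * np.outer(D.sum(1), D.sum(0)) / D.sum()     # expected negative counts
    X = np.zeros((50, 40))
    for _ in range(100):
        X = riemannian_step(X, D, B, lr=1e-3, rank=8)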
