Search Results for author: Daniil Moskovskiy

Found 6 papers, 5 papers with code

ParaDetox: Detoxification with Parallel Data

1 code implementation • ACL 2022 • Varvara Logacheva, Daryna Dementieva, Sergey Ustyantsev, Daniil Moskovskiy, David Dale, Irina Krotova, Nikita Semenov, Alexander Panchenko

To the best of our knowledge, these are the first parallel datasets for this task. We describe our pipeline in detail to make it fast to set up for a new language or domain, thus contributing to faster and easier development of new parallel resources. We train several detoxification models on the collected data and compare them with several baselines and state-of-the-art unsupervised approaches.


Exploring Cross-lingual Text Detoxification with Large Multilingual Language Models

1 code implementation • ACL 2022 • Daniil Moskovskiy, Daryna Dementieva, Alexander Panchenko

This work investigates multilingual and cross-lingual detoxification and the behavior of large multilingual models in this setting.

Style Transfer

MERA: A Comprehensive LLM Evaluation in Russian

1 code implementation • 9 Jan 2024 • Alena Fenogenova, Artem Chervyakov, Nikita Martynov, Anastasia Kozlova, Maria Tikhonova, Albina Akhmetgareeva, Anton Emelyanov, Denis Shevelev, Pavel Lebedev, Leonid Sinev, Ulyana Isaeva, Katerina Kolomeytseva, Daniil Moskovskiy, Elizaveta Goncharova, Nikita Savushkin, Polina Mikhailova, Denis Dimitrov, Alexander Panchenko, Sergei Markov

To address these issues, we introduce an open Multimodal Evaluation of Russian-language Architectures (MERA), a new instruction benchmark for evaluating foundation models oriented towards the Russian language.

Exploring Cross-lingual Textual Style Transfer with Large Multilingual Language Models

1 code implementation • 5 Jun 2022 • Daniil Moskovskiy, Daryna Dementieva, Alexander Panchenko

However, the models are not able to perform cross-lingual detoxification, and direct fine-tuning on the target language remains unavoidable.

Style Transfer
