Search Results for author: Daniil Moskovskiy

Found 9 papers, 6 papers with code

ParaDetox: Detoxification with Parallel Data

1 code implementation ACL 2022 Varvara Logacheva, Daryna Dementieva, Sergey Ustyantsev, Daniil Moskovskiy, David Dale, Irina Krotova, Nikita Semenov, Alexander Panchenko

To the best of our knowledge, these are the first parallel datasets for this task. We describe our pipeline in detail to make it fast to set up for a new language or domain, thus contributing to faster and easier development of new parallel resources. We train several detoxification models on the collected data and compare them with several baselines and state-of-the-art unsupervised approaches.

Sentence

Exploring Cross-lingual Text Detoxification with Large Multilingual Language Models

1 code implementation ACL 2022 Daniil Moskovskiy, Daryna Dementieva, Alexander Panchenko

This work investigates multilingual and cross-lingual detoxification and the behavior of large multilingual models in this setting.

Style Transfer

How Much Knowledge Can You Pack into a LoRA Adapter without Harming LLM?

1 code implementation 20 Feb 2025 Sergey Pletenev, Maria Marina, Daniil Moskovskiy, Vasily Konovalov, Pavel Braslavski, Alexander Panchenko, Mikhail Salnikov

The performance of Large Language Models (LLMs) on many tasks is greatly limited by the knowledge learned during pre-training and stored in the model's parameters.
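The paper's title refers to packing knowledge into a LoRA adapter. As a hedged, self-contained sketch of the LoRA idea itself (toy shapes and values, not the paper's actual configuration): instead of updating a full weight matrix W, LoRA freezes W and trains two small low-rank factors A and B, so the effective weight becomes W + A·B with far fewer trainable parameters at realistic dimensions:

```python
def matmul(x, w):
    """Multiply matrices represented as lists of rows."""
    inner, cols = len(w), len(w[0])
    return [[sum(row[k] * w[k][j] for k in range(inner)) for j in range(cols)]
            for row in x]

def madd(a, b):
    """Element-wise matrix addition."""
    return [[a[i][j] + b[i][j] for j in range(len(a[0]))] for i in range(len(a))]

def lora_forward(x, w, a, b, scale=1.0):
    """y = x @ (W + scale * A @ B); the base weight W stays frozen."""
    delta = [[scale * v for v in row] for row in matmul(a, b)]
    return matmul(x, madd(w, delta))

# Toy example: d_in = d_out = 2, rank r = 1 (hypothetical values).
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen base weight (identity here)
A = [[1.0], [1.0]]             # d_in x r trainable factor
B = [[0.5, 0.5]]               # r x d_out trainable factor
x = [[2.0, 0.0]]

print(lora_forward(x, W, A, B))  # → [[3.0, 1.0]]
```

With rank r much smaller than the layer width, A and B together hold only r·(d_in + d_out) parameters, which is what bounds how much new knowledge an adapter can absorb.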

Question Answering

Multilingual and Explainable Text Detoxification with Parallel Corpora

1 code implementation 16 Dec 2024 Daryna Dementieva, Nikolay Babakov, Amit Ronen, Abinew Ali Ayele, Naquee Rizwan, Florian Schneider, Xintong Wang, Seid Muhie Yimam, Daniil Moskovskiy, Elisei Stakovskii, Eran Kaufman, Ashraf Elnagar, Animesh Mukherjee, Alexander Panchenko

Even with various regulations in place across countries and social media platforms (Government of India, 2021; European Parliament and Council of the European Union, 2022), digital abusive speech remains a significant issue.

Descriptive Style Transfer +1

MERA: A Comprehensive LLM Evaluation in Russian

no code implementations 9 Jan 2024 Alena Fenogenova, Artem Chervyakov, Nikita Martynov, Anastasia Kozlova, Maria Tikhonova, Albina Akhmetgareeva, Anton Emelyanov, Denis Shevelev, Pavel Lebedev, Leonid Sinev, Ulyana Isaeva, Katerina Kolomeytseva, Daniil Moskovskiy, Elizaveta Goncharova, Nikita Savushkin, Polina Mikhailova, Denis Dimitrov, Alexander Panchenko, Sergei Markov

To address these issues, we introduce an open Multimodal Evaluation of Russian-language Architectures (MERA), a new instruction benchmark for evaluating foundation models oriented towards the Russian language.

Exploring Cross-lingual Textual Style Transfer with Large Multilingual Language Models

1 code implementation 5 Jun 2022 Daniil Moskovskiy, Daryna Dementieva, Alexander Panchenko

However, the models are unable to perform cross-lingual detoxification, so direct fine-tuning on the target language remains necessary.

Style Transfer
