Search Results for author: Evgenii Tsymbalov

Found 7 papers, 5 papers with code

Uncertainty Estimation of Transformer Predictions for Misclassification Detection

1 code implementation • ACL 2022 • Artem Vazhentsev, Gleb Kuzmin, Artem Shelmanov, Akim Tsvigun, Evgenii Tsymbalov, Kirill Fedyanin, Maxim Panov, Alexander Panchenko, Gleb Gusev, Mikhail Burtsev, Manvel Avetisian, Leonid Zhukov

Uncertainty estimation (UE) of model predictions is a crucial step for a variety of tasks such as active learning, misclassification detection, adversarial attack detection, out-of-distribution detection, etc.

Tasks: Active Learning, Adversarial Attack Detection (+7 more)

Fact-Checking the Output of Large Language Models via Token-Level Uncertainty Quantification

no code implementations • 7 Mar 2024 • Ekaterina Fadeeva, Aleksandr Rubashevskii, Artem Shelmanov, Sergey Petrakov, Haonan Li, Hamdy Mubarak, Evgenii Tsymbalov, Gleb Kuzmin, Alexander Panchenko, Timothy Baldwin, Preslav Nakov, Maxim Panov

Uncertainty scores leverage information encapsulated in the output of a neural network or its layers to detect unreliable predictions, and we show that they can be used to fact-check the atomic claims in the LLM output.

Tasks: Fact Checking, Hallucination (+1 more)
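The token-level idea above can be sketched with a simple baseline: score a generated claim by the mean Shannon entropy of its tokens' predictive distributions, so that high entropy flags an unreliable claim. This is a generic illustration of token-level uncertainty scoring, not the paper's specific method; the function names and toy probabilities are illustrative.

```python
import numpy as np

def token_entropies(probs: np.ndarray) -> np.ndarray:
    """Shannon entropy of each token's predictive distribution.

    probs: (num_tokens, vocab_size) array of softmax probabilities.
    """
    eps = 1e-12  # avoid log(0)
    return -np.sum(probs * np.log(probs + eps), axis=-1)

def claim_uncertainty(probs: np.ndarray) -> float:
    """Score a claim by the mean entropy of its tokens:
    higher entropy -> less reliable claim."""
    return float(np.mean(token_entropies(probs)))

# Toy example (vocab of 4): a confident claim vs. a maximally uncertain one.
confident = np.array([[0.97, 0.01, 0.01, 0.01]] * 3)
uncertain = np.array([[0.25, 0.25, 0.25, 0.25]] * 3)
assert claim_uncertainty(uncertain) > claim_uncertainty(confident)
```

In practice the probabilities would come from the LLM's own logits over the tokens of each atomic claim; aggregations other than the mean (e.g. max over tokens) are equally plausible choices.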

ClimateGPT: Towards AI Synthesizing Interdisciplinary Research on Climate Change

1 code implementation • 17 Jan 2024 • David Thulke, Yingbo Gao, Petrus Pelser, Rein Brune, Rricha Jalota, Floris Fok, Michael Ramos, Ian van Wyk, Abdallah Nasir, Hayden Goldstein, Taylor Tragemann, Katie Nguyen, Ariana Fowler, Andrew Stanco, Jon Gabriel, Jordan Taylor, Dean Moro, Evgenii Tsymbalov, Juliette de Waal, Evgeny Matusov, Mudar Yaghi, Mohammad Shihadah, Hermann Ney, Christian Dugast, Jonathan Dotan, Daniel Erasmus

To increase the accessibility of our model to non-English speakers, we propose to make use of cascaded machine translation and show that this approach can perform comparably to natively multilingual models while being easier to scale to a large number of languages.

Tasks: Machine Translation, Retrieval

Dropout Strikes Back: Improved Uncertainty Estimation via Diversity Sampling

1 code implementation • 6 Mar 2020 • Kirill Fedyanin, Evgenii Tsymbalov, Maxim Panov

Uncertainty estimation for machine learning models is of high importance in many scenarios such as constructing the confidence intervals for model predictions and detection of out-of-distribution or adversarially generated points.

Tasks: Point Processes
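As background for the entry above, a minimal Monte-Carlo dropout sketch: run several stochastic forward passes with random dropout masks and read off the mean and spread of the predictions. This is the standard i.i.d.-mask baseline, which the paper improves on by sampling diverse masks (via point processes) rather than independent ones; the tiny one-hidden-layer network and its random weights are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-hidden-layer regression net with fixed random weights.
W1 = rng.normal(size=(8, 1))
b1 = rng.normal(size=8)
W2 = rng.normal(size=(8, 1))

def forward(x: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Forward pass with a dropout mask applied to the hidden units."""
    h = np.maximum(x @ W1.T + b1, 0.0) * mask  # ReLU, then dropout
    return h @ W2

def mc_dropout_predict(x: np.ndarray, n_samples: int = 100, p_keep: float = 0.8):
    """Mean and std of predictions over i.i.d. stochastic dropout masks."""
    preds = []
    for _ in range(n_samples):
        mask = rng.binomial(1, p_keep, size=8) / p_keep  # inverted dropout
        preds.append(forward(x, mask))
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.std(axis=0)

x = np.array([[0.5]])
mean, std = mc_dropout_predict(x)  # std serves as the uncertainty estimate
```

The std across passes gives a per-input uncertainty that can feed confidence intervals or out-of-distribution scores, as the abstract describes.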

Deeper Connections between Neural Networks and Gaussian Processes Speed-up Active Learning

1 code implementation • 27 Feb 2019 • Evgenii Tsymbalov, Sergei Makarychev, Alexander Shapeev, Maxim Panov

Active learning methods for neural networks are usually based on greedy criteria, which ultimately yield a single new design point per evaluation.

Tasks: Active Learning, Gaussian Processes
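The greedy criterion mentioned in the abstract can be illustrated with a zero-mean Gaussian-process surrogate: query the single pool point with maximal posterior predictive variance. This is a generic sketch of the greedy baseline, not the paper's method; the RBF kernel, noise level, and toy data are assumptions for illustration.

```python
import numpy as np

def rbf(A: np.ndarray, B: np.ndarray, length: float = 1.0) -> np.ndarray:
    """RBF kernel matrix between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

def gp_posterior_var(X_train: np.ndarray, X_pool: np.ndarray,
                     noise: float = 1e-2) -> np.ndarray:
    """Posterior predictive variance of a zero-mean GP at the pool points."""
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf(X_train, X_pool)
    Kss = rbf(X_pool, X_pool)
    v = np.linalg.solve(K, Ks)
    return np.diag(Kss - Ks.T @ v)

def greedy_query(X_train: np.ndarray, X_pool: np.ndarray) -> int:
    """Greedy criterion: index of the pool point of maximal variance."""
    return int(np.argmax(gp_posterior_var(X_train, X_pool)))

X_train = np.array([[0.0], [1.0]])
X_pool = np.array([[0.1], [0.5], [3.0]])
idx = greedy_query(X_train, X_pool)  # the far-away point wins
```

Selecting one point at a time like this forces a retrain per query; the paper's point is to exploit the NN-GP connection to pick batches instead.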

Dropout-based Active Learning for Regression

no code implementations • 26 Jun 2018 • Evgenii Tsymbalov, Maxim Panov, Alexander Shapeev

Active learning is relevant and challenging for high-dimensional regression models when the annotation of the samples is expensive.

Tasks: Active Learning, Regression
