no code implementations • 25 Dec 2023 • Vincent Plassier, Nikita Kotelevskii, Aleksandr Rubashevskii, Fedor Noskov, Maksim Velikanov, Alexander Fishkov, Samuel Horvath, Martin Takac, Eric Moulines, Maxim Panov
Conformal Prediction (CP) stands out as a robust framework for uncertainty quantification, which is crucial for ensuring the reliability of predictions.
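The paper itself is not summarized here beyond this sentence, but the core split conformal recipe the abstract refers to can be sketched generically: hold out a calibration set, compute residuals of any point predictor on it, and take a finite-sample-corrected quantile as the interval half-width. All names below (`split_conformal_interval`, the synthetic residuals) are illustrative, not from the paper.

```python
import numpy as np

def split_conformal_interval(cal_residuals, alpha=0.1):
    """Split conformal prediction: turn calibration residuals into an
    interval half-width with (1 - alpha) marginal coverage."""
    n = len(cal_residuals)
    # Finite-sample correction: use the ceil((n+1)(1-alpha))/n empirical quantile.
    q_level = np.ceil((n + 1) * (1 - alpha)) / n
    return np.quantile(np.abs(cal_residuals), q_level, method="higher")

# Usage with synthetic calibration residuals (stand-in for y - f(x)).
rng = np.random.default_rng(0)
residuals = rng.normal(0.0, 1.0, size=1000)
half_width = split_conformal_interval(residuals, alpha=0.1)
# Interval for a new input x: [f(x) - half_width, f(x) + half_width]
```

The guarantee is distribution-free: as long as calibration and test points are exchangeable, the interval covers the true value with probability at least `1 - alpha`, regardless of the underlying model.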
no code implementations • 28 Sep 2023 • Fedor Noskov, Alexander Fishkov, Maxim Panov
Prediction with the possibility of abstention (or selective prediction) is an important problem for error-critical machine learning applications.
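The abstract's notion of selective prediction can be illustrated with the simplest baseline: predict only when the model's top class probability clears a confidence threshold, and abstain otherwise. This is a generic sketch, not the method proposed in the paper; the function name and threshold are illustrative.

```python
import numpy as np

def predict_with_abstention(probs, threshold=0.8):
    """Selective prediction: return the argmax class when the top
    probability is at least `threshold`, otherwise -1 (abstain)."""
    probs = np.asarray(probs)
    top = probs.max(axis=-1)       # confidence of the most likely class
    preds = probs.argmax(axis=-1)  # point prediction
    return np.where(top >= threshold, preds, -1)

# Usage on a small batch of class-probability vectors.
batch = np.array([[0.90, 0.05, 0.05],   # confident -> predict class 0
                  [0.40, 0.35, 0.25]])  # uncertain -> abstain
print(predict_with_abstention(batch, threshold=0.8))
```

Raising the threshold trades coverage (fraction of inputs answered) for selective risk (error rate on the answered inputs), which is the central trade-off in error-critical applications.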
no code implementations • 6 May 2022 • Alexander Fishkov, Maxim Panov
Accounting for the uncertainty in the predictions of modern neural networks is a challenging and important task in many domains.
1 code implementation • 7 Feb 2022 • Nikita Kotelevskii, Aleksandr Artemenkov, Kirill Fedyanin, Fedor Noskov, Alexander Fishkov, Artem Shelmanov, Artem Vazhentsev, Aleksandr Petiushko, Maxim Panov
This paper proposes a fast and scalable method for uncertainty quantification of machine learning models' predictions.