Search Results for author: Alexander Shekhovtsov

Found 21 papers, 2 papers with code

Bias-Variance Tradeoffs in Single-Sample Binary Gradient Estimators

no code implementations 7 Oct 2021 Alexander Shekhovtsov

Discrete and especially binary random variables occur in many machine learning models, notably in variational autoencoders with binary latent states and in stochastic binary networks.
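
As a hedged illustration of the setting this abstract describes (not the paper's own estimator), the sketch below computes the exact gradient of an expectation over a Bernoulli variable and compares it with a single-sample REINFORCE (score-function) estimate, which is unbiased but high-variance; the objective `f` and all numbers are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(b):
    # toy objective evaluated on a binary sample b in {0, 1}
    return (b - 0.3) ** 2

p = 0.7  # Bernoulli success probability

# exact gradient of E_b[f(b)] = p*f(1) + (1-p)*f(0) w.r.t. p
exact = f(1.0) - f(0.0)

def reinforce_sample(p):
    # single-sample score-function estimator: f(b) * d log p(b) / dp
    b = rng.random() < p
    score = (1.0 / p) if b else (-1.0 / (1.0 - p))
    return f(float(b)) * score

# unbiased, so the sample mean converges to the exact gradient
est = np.mean([reinforce_sample(p) for _ in range(200_000)])
```

Averaging many such single-sample estimates recovers the exact gradient; the bias-variance tradeoffs studied in the paper concern what happens when only one sample is available.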

VAE Approximation Error: ELBO and Exponential Families

no code implementations ICLR 2022 Alexander Shekhovtsov, Dmitrij Schlesinger, Boris Flach

The importance of Variational Autoencoders reaches far beyond standalone generative models -- the approach is also used for learning latent representations and can be generalized to semi-supervised learning.

Reintroducing Straight-Through Estimators as Principled Methods for Stochastic Binary Networks

no code implementations 11 Jun 2020 Alexander Shekhovtsov, Viktor Yanush

Training neural networks with binary weights and activations is a challenging problem due to the lack of gradients and difficulty of optimization over discrete weights.

Tasks: Variational Inference
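
A minimal NumPy sketch of the straight-through idea that this line of work revisits (an illustration, not the paper's principled derivation): the forward pass binarizes with a hard sign, while the backward pass pretends the binarization was the identity, optionally clipping the gradient outside a trust region. The clip value and example weights are assumptions for the demo.

```python
import numpy as np

def binarize_forward(w):
    # forward pass: hard sign, outputs in {-1, +1}
    return np.where(w >= 0, 1.0, -1.0)

def binarize_backward_st(grad_out, w, clip=1.0):
    # straight-through backward pass: treat binarization as identity,
    # zeroing the gradient where |w| exceeds the clip range
    return grad_out * (np.abs(w) <= clip)

w = np.array([-1.5, -0.2, 0.3, 2.0])
b = binarize_forward(w)
g = binarize_backward_st(np.ones_like(w), w)
```

The forward output is discrete, yet the surrogate gradient `g` lets ordinary gradient descent update the real-valued weights `w` behind the binarization.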

Path Sample-Analytic Gradient Estimators for Stochastic Binary Networks

no code implementations NeurIPS 2020 Alexander Shekhovtsov, Viktor Yanush, Boris Flach

In neural networks with binary activations and/or binary weights, training by gradient descent is complicated because the model has a piecewise constant response.

Taxonomy of Dual Block-Coordinate Ascent Methods for Discrete Energy Minimization

1 code implementation 16 Apr 2020 Siddharth Tourani, Alexander Shekhovtsov, Carsten Rother, Bogdan Savchynskyy

We consider the maximum-a-posteriori inference problem in discrete graphical models and study solvers based on the dual block-coordinate ascent rule.

MPLP++: Fast, Parallel Dual Block-Coordinate Ascent for Dense Graphical Models

no code implementations ECCV 2018 Siddharth Tourani, Alexander Shekhovtsov, Carsten Rother, Bogdan Savchynskyy

Dense, discrete Graphical Models with pairwise potentials are a powerful class of models which are employed in state-of-the-art computer vision and bio-imaging applications.

Tasks: 6D Pose Estimation using RGB

Belief Propagation Reloaded: Learning BP-Layers for Labeling Problems

1 code implementation 13 Mar 2020 Patrick Knöbelreiter, Christian Sormann, Alexander Shekhovtsov, Friedrich Fraundorfer, Thomas Pock

It has been proposed by many researchers that combining deep neural networks with graphical models can create more efficient and better regularized composite models.

Tasks: Optical Flow Estimation, Semantic Segmentation

Feed-forward Propagation in Probabilistic Neural Networks with Categorical and Max Layers

no code implementations ICLR 2019 Alexander Shekhovtsov, Boris Flach

Probabilistic Neural Networks deal with various sources of stochasticity: input noise, dropout, stochastic neurons, parameter uncertainties modeled as random variables, etc.

Stochastic Normalizations as Bayesian Learning

no code implementations 1 Nov 2018 Alexander Shekhovtsov, Boris Flach

In this work we investigate the reasons why Batch Normalization (BN) improves the generalization performance of deep networks.

Normalization of Neural Networks using Analytic Variance Propagation

no code implementations 28 Mar 2018 Alexander Shekhovtsov, Boris Flach

We address the problem of estimating statistics of hidden units in a neural network using a method of analytic moment propagation.
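As a hedged sketch of what analytic moment propagation means in practice (the formulas below are standard Gaussian moment identities, not necessarily the paper's exact derivation): means and variances are pushed through a linear layer assuming independent inputs, and through a ReLU using the closed-form moments of a rectified Gaussian. The network sizes and input statistics are made up.

```python
import numpy as np
from math import erf, exp, pi, sqrt

def relu_moments(mu, var):
    # closed-form mean/variance of max(0, X) for X ~ N(mu, var), var > 0
    s = sqrt(var)
    a = mu / s
    Phi = 0.5 * (1.0 + erf(a / sqrt(2.0)))   # standard normal CDF at a
    phi = exp(-0.5 * a * a) / sqrt(2.0 * pi)  # standard normal PDF at a
    m = mu * Phi + s * phi
    e2 = (mu * mu + var) * Phi + mu * s * phi
    return m, e2 - m * m

def linear_moments(W, b, mu, var):
    # moments through y = W x + b, assuming independent inputs
    return W @ mu + b, (W ** 2) @ var

W = np.array([[1.0, -1.0], [0.5, 2.0]])
b = np.zeros(2)
mu, var = linear_moments(W, b, np.array([0.0, 1.0]), np.array([1.0, 0.5]))
out = [relu_moments(m, v) for m, v in zip(mu, var)]
```

A quick sanity check: for a standard normal input, the ReLU output should have mean 1/sqrt(2*pi) and variance 1/2 - 1/(2*pi).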

Feed-forward Uncertainty Propagation in Belief and Neural Networks

no code implementations 28 Mar 2018 Alexander Shekhovtsov, Boris Flach, Michal Busta

We propose a feed-forward inference method applicable to belief and neural networks.

Generative learning for deep networks

no code implementations 25 Sep 2017 Boris Flach, Alexander Shekhovtsov, Ondrej Fikar

Learning that takes the full distribution of the data into account, referred to as generative learning, is not directly feasible with deep neural networks (DNNs) because they model only the conditional distribution of the outputs given the inputs.

Tasks: Bayesian Inference

Scalable Full Flow with Learned Binary Descriptors

no code implementations 20 Jul 2017 Gottfried Munda, Alexander Shekhovtsov, Patrick Knöbelreiter, Thomas Pock

We tackle the computation- and memory-intensive operations on the 4D cost volume with a min-projection, which reduces memory complexity from quadratic to linear, and with binary descriptors for efficient matching.

Tasks: Optical Flow Estimation
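
A hedged toy version of the min-projection mentioned above (the shapes and random costs are invented for illustration): a full 4D cost volume over 2D displacements (u, v) needs O(H*W*D*D) memory, while storing only its minima over each displacement axis gives two 3D volumes of size O(H*W*D) that still preserve the joint minimum.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, D = 4, 5, 7  # image size and per-axis displacement range

# full 4D cost volume over 2D displacements: quadratic in D
cost4d = rng.random((H, W, D, D))

# min-projections: two 3D volumes, each linear in D
cost_u = cost4d.min(axis=3)  # for each u, minimize over v
cost_v = cost4d.min(axis=2)  # for each v, minimize over u

# the minimum over the joint displacement is preserved
joint_min = cost4d.min(axis=(2, 3))
```

Minimizing `cost_u` over its remaining displacement axis reproduces the joint minimum per pixel, which is what makes the projected volumes usable in place of the full 4D one.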

Complexity of Discrete Energy Minimization Problems

no code implementations 29 Jul 2016 Mengtian Li, Alexander Shekhovtsov, Daniel Huber

Specifically, we show that general energy minimization, even in the 2-label pairwise case, and planar energy minimization with three or more labels are exp-APX-complete.

Joint M-Best-Diverse Labelings as a Parametric Submodular Minimization

no code implementations NeurIPS 2016 Alexander Kirillov, Alexander Shekhovtsov, Carsten Rother, Bogdan Savchynskyy

In particular, for certain diversity measures, the joint M-best diverse labelings can be obtained by running a non-parametric submodular minimization solver (max-flow in the special case) for M different values of $\gamma$ in parallel.

Solving Dense Image Matching in Real-Time using Discrete-Continuous Optimization

no code implementations 23 Jan 2016 Alexander Shekhovtsov, Christian Reinbacher, Gottfried Graber, Thomas Pock

Dense image matching is a fundamental low-level problem in Computer Vision, which has received tremendous attention from both discrete and continuous optimization communities.

Tasks: Optical Flow Estimation, Stereo Matching, +1

Maximum Persistency via Iterative Relaxed Inference with Graphical Models

no code implementations CVPR 2015 Alexander Shekhovtsov, Paul Swoboda, Bogdan Savchynskyy

We propose an efficient implementation, which runs in time comparable to a single run of a suboptimal dual solver.

Higher Order Maximum Persistency and Comparison Theorems

no code implementations 4 May 2015 Alexander Shekhovtsov

For polyhedral relaxations of such problems, it is generally not true that variables that are integer in the relaxed solution will retain the same values in the optimal discrete solution.

Maximum Persistency in Energy Minimization

no code implementations CVPR 2014 Alexander Shekhovtsov

We propose a new sufficient condition for partial optimality which is: (1) verifiable in polynomial time (2) invariant to reparametrization of the problem and permutation of labels and (3) includes many existing sufficient conditions as special cases.
