Search Results for author: Alexander Terenin

Found 24 papers, 17 papers with code

A Unifying Variational Framework for Gaussian Process Motion Planning

1 code implementation 2 Sep 2023 Lucas Cosier, Rares Iordan, Sicelukwanda Zwane, Giovanni Franzese, James T. Wilson, Marc Peter Deisenroth, Alexander Terenin, Yasemin Bekiroglu

To control how a robot moves, motion planning algorithms must compute paths in high-dimensional state spaces while accounting for physical constraints related to motors and joints, generating smooth and stable motions, avoiding obstacles, and preventing collisions.

Gaussian Processes · Motion Planning

Sampling from Gaussian Process Posteriors using Stochastic Gradient Descent

1 code implementation NeurIPS 2023 Jihao Andreas Lin, Javier Antorán, Shreyas Padhy, David Janz, José Miguel Hernández-Lobato, Alexander Terenin

Gaussian processes are a powerful framework for quantifying uncertainty and for sequential decision-making but are limited by the requirement of solving linear systems.
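The linear systems in question have the form (K + σ²I)v = y, and the paper replaces direct solvers with stochastic gradient methods. A minimal sketch of the core idea, using full-batch gradient descent on the quadratic whose minimizer is the usual GP solve — the kernel, data, and step size here are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0):
    # Squared-exponential kernel matrix between 1-D point sets a and b.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(50)

noise = 1e-2
K = rbf_kernel(X, X) + noise * np.eye(50)

# The representer weights v = K^{-1} y minimize the quadratic
# 0.5 * v' K v - v' y, so gradient descent on this objective
# converges to the same solution as a direct Cholesky solve.
v = np.zeros(50)
step = 0.03
for _ in range(20000):
    v -= step * (K @ v - y)
```

The posterior mean at new inputs is then `rbf_kernel(Z, X) @ v`, without ever factorizing K.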

Bayesian Optimization · Decision Making · +1

Stationary Kernels and Gaussian Processes on Lie Groups and their Homogeneous Spaces II: non-compact symmetric spaces

1 code implementation 30 Jan 2023 Iskander Azangulov, Andrei Smolensky, Alexander Terenin, Viacheslav Borovitskiy

The invariance of a Gaussian process' covariance to such symmetries gives rise to the most natural generalization of the concept of stationarity to such spaces.

Gaussian Processes

Numerically Stable Sparse Gaussian Processes via Minimum Separation using Cover Trees

1 code implementation 14 Oct 2022 Alexander Terenin, David R. Burt, Artem Artemev, Seth Flaxman, Mark van der Wilk, Carl Edward Rasmussen, Hong Ge

For low-dimensional tasks such as geospatial modeling, we propose an automated method for computing inducing points satisfying these conditions.
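The conditions referred to are a minimum pairwise separation between inducing points, which the paper enforces efficiently with cover trees. A naive O(n²) greedy stand-in that enforces the same separation property (but not the same speed) might look like this; the function name and parameters are illustrative:

```python
import numpy as np

def select_separated(points, min_dist):
    # Keep a candidate point only if it is at least min_dist from every
    # point already kept, so the result is pairwise well-separated.
    # This is a simple stand-in for the paper's cover-tree construction.
    chosen = []
    for p in points:
        if all(np.linalg.norm(p - q) >= min_dist for q in chosen):
            chosen.append(p)
    return np.array(chosen)

rng = np.random.default_rng(1)
candidates = rng.uniform(size=(200, 2))
inducing = select_separated(candidates, min_dist=0.2)
```

Well-separated inducing points keep the inducing-point kernel matrix away from singularity, which is the numerical-stability point of the paper.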

Bayesian Optimization · Decision Making · +1

Stationary Kernels and Gaussian Processes on Lie Groups and their Homogeneous Spaces I: the compact case

1 code implementation 31 Aug 2022 Iskander Azangulov, Andrei Smolensky, Alexander Terenin, Viacheslav Borovitskiy

The invariance of a Gaussian process' covariance to such symmetries gives rise to the most natural generalization of the concept of stationarity to such spaces.

Bayesian Inference · Gaussian Processes

Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces

no code implementations 22 Feb 2022 Alexander Terenin

In this dissertation, we develop techniques for broadening the applicability of Gaussian processes.

Decision Making · Gaussian Processes

Geometry-aware Bayesian Optimization in Robotics using Riemannian Matérn Kernels

1 code implementation 2 Nov 2021 Noémie Jaquier, Viacheslav Borovitskiy, Andrei Smolensky, Alexander Terenin, Tamim Asfour, Leonel Rozo

Bayesian optimization is a data-efficient technique which can be used for control parameter tuning, parametric policy adaptation, and structure design in robotics.

Bayesian Optimization · Motion Planning

Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge Independent Projected Kernels

no code implementations NeurIPS 2021 Michael Hutchinson, Alexander Terenin, Viacheslav Borovitskiy, So Takao, Yee Whye Teh, Marc Peter Deisenroth

Gaussian processes are machine learning models capable of learning unknown functions in a way that represents uncertainty, thereby facilitating construction of optimal decision-making systems.

BIG-bench Machine Learning · Decision Making · +2

Learning Contact Dynamics using Physically Structured Neural Networks

1 code implementation 22 Feb 2021 Andreas Hochlehnert, Alexander Terenin, Steindór Sæmundsson, Marc Peter Deisenroth

Learning physically structured representations of dynamical systems that include contact between different objects is an important problem for learning-based approaches in robotics.

Sliced Multi-Marginal Optimal Transport

no code implementations 14 Feb 2021 Samuel Cohen, Alexander Terenin, Yannik Pitcan, Brandon Amos, Marc Peter Deisenroth, K S Sesh Kumar

To construct this distance, we introduce a characterization of the one-dimensional multi-marginal Kantorovich problem and use it to highlight a number of properties of the sliced multi-marginal Wasserstein distance.
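The one-dimensional problem is what makes slicing tractable: in 1-D, optimal transport between equal-size empirical measures reduces to sorting. The sketch below shows the classical two-marginal sliced Wasserstein distance, not the paper's multi-marginal construction; function names and defaults are illustrative:

```python
import numpy as np

def wasserstein_1d(x, y, p=2):
    # In one dimension the optimal coupling between equal-size empirical
    # measures is the monotone one: match sorted samples.
    xs, ys = np.sort(x), np.sort(y)
    return np.mean(np.abs(xs - ys) ** p) ** (1.0 / p)

def sliced_wasserstein(X, Y, n_proj=100, seed=0):
    # Average squared 1-D distances over random projection directions.
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_proj):
        theta = rng.standard_normal(X.shape[1])
        theta /= np.linalg.norm(theta)
        total += wasserstein_1d(X @ theta, Y @ theta) ** 2
    return (total / n_proj) ** 0.5
```

The multi-marginal version in the paper replaces the two-marginal 1-D problem in the inner loop with its multi-marginal Kantorovich counterpart, which the paper characterizes in closed form.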

Density Estimation · Multi-Task Learning

Pathwise Conditioning of Gaussian Processes

2 code implementations 8 Nov 2020 James T. Wilson, Viacheslav Borovitskiy, Alexander Terenin, Peter Mostowsky, Marc Peter Deisenroth

As Gaussian processes are used to answer increasingly complex questions, analytic solutions become scarcer and scarcer.
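Pathwise conditioning rewrites a posterior draw as a prior draw plus a data-dependent correction (Matheron's rule). A minimal finite-dimensional numpy sketch, with kernel, data, and noise level chosen purely for illustration:

```python
import numpy as np

def rbf(a, b, lengthscale=0.1):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 20)    # training inputs
Z = np.linspace(0.0, 1.0, 50)    # test inputs
y = np.sin(4.0 * X)
noise = 1e-2

# Draw one prior sample jointly at the training and test locations.
A = np.concatenate([X, Z])
L = np.linalg.cholesky(rbf(A, A) + 1e-6 * np.eye(len(A)))
f = L @ rng.standard_normal(len(A))
fX, fZ = f[:len(X)], f[len(X):]

# Matheron's rule: a posterior sample is the prior sample plus a
# kernel-weighted correction driven by the residual on the data.
eps = np.sqrt(noise) * rng.standard_normal(len(X))
alpha = np.linalg.solve(rbf(X, X) + noise * np.eye(len(X)), y - fX - eps)
f_post = fZ + rbf(Z, X) @ alpha
```

The update never forms the posterior covariance explicitly, which is what makes the pathwise view useful when analytic solutions are unavailable.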

Gaussian Processes

Matérn Gaussian Processes on Graphs

no code implementations 29 Oct 2020 Viacheslav Borovitskiy, Iskander Azangulov, Alexander Terenin, Peter Mostowsky, Marc Peter Deisenroth, Nicolas Durrande

Gaussian processes are a versatile framework for learning unknown functions in a manner that permits one to utilize prior information about their properties.

Gaussian Processes

Aligning Time Series on Incomparable Spaces

1 code implementation 22 Jun 2020 Samuel Cohen, Giulia Luise, Alexander Terenin, Brandon Amos, Marc Peter Deisenroth

Dynamic time warping (DTW) is a useful method for aligning, comparing and combining time series, but it requires them to live in comparable spaces.
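For reference, classic DTW is a small dynamic program over all monotone alignments of the two series; this sketch shows the standard textbook algorithm, not the paper's extension to incomparable spaces:

```python
import numpy as np

def dtw(a, b):
    # D[i, j] is the cost of the cheapest warping path aligning
    # the first i elements of a with the first j elements of b.
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Because the recursion only ever compares `a[i-1]` with `b[j-1]`, the whole construction breaks down when the two series live in incomparable spaces, which is the gap the paper addresses.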

Dynamic Time Warping · Imitation Learning · +2

Matérn Gaussian processes on Riemannian manifolds

1 code implementation NeurIPS 2020 Viacheslav Borovitskiy, Alexander Terenin, Peter Mostowsky, Marc Peter Deisenroth

Gaussian processes are an effective model class for learning unknown functions, particularly in settings where accurately representing predictive uncertainty is of key importance.

Gaussian Processes

Efficiently Sampling Functions from Gaussian Process Posteriors

5 code implementations ICML 2020 James T. Wilson, Viacheslav Borovitskiy, Alexander Terenin, Peter Mostowsky, Marc Peter Deisenroth

Gaussian processes are the gold standard for many real-world modeling problems, especially in cases where a model's success hinges upon its ability to faithfully represent predictive uncertainty.
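The paper's decoupled sampling combines an approximate prior, e.g. from random Fourier features, with a pathwise data update. A sketch of just the prior-function part for the squared-exponential kernel, which yields a sample that is a genuine function evaluable anywhere; all names and defaults here are illustrative assumptions:

```python
import numpy as np

def sample_rff_prior(n_features=500, lengthscale=0.5, seed=0):
    # Random Fourier features: frequencies drawn from the kernel's
    # spectral density (Gaussian for the squared-exponential kernel)
    # give f(x) = sqrt(2/n) * sum_i w_i cos(omega_i x + phi_i),
    # an approximate GP prior sample in functional form.
    rng = np.random.default_rng(seed)
    omega = rng.standard_normal(n_features) / lengthscale
    phi = rng.uniform(0.0, 2.0 * np.pi, n_features)
    w = rng.standard_normal(n_features)

    def f(x):
        x = np.asarray(x, dtype=float)
        feats = np.sqrt(2.0 / n_features) * np.cos(np.outer(x, omega) + phi)
        return feats @ w

    return f
```

Evaluating such a sample costs O(n_features) per point, rather than requiring a joint Gaussian draw over every query location.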

Gaussian Processes

Variational Integrator Networks for Physically Structured Embeddings

1 code implementation 21 Oct 2019 Steindor Saemundsson, Alexander Terenin, Katja Hofmann, Marc Peter Deisenroth

Learning workable representations of dynamical systems is becoming an increasingly important problem in a number of application areas.

Sparse Parallel Training of Hierarchical Dirichlet Process Topic Models

1 code implementation EMNLP 2020 Alexander Terenin, Måns Magnusson, Leif Jonsson

To scale non-parametric extensions of probabilistic topic models such as Latent Dirichlet allocation to larger data sets, practitioners rely increasingly on parallel and distributed systems.

Topic Models

Techniques for proving Asynchronous Convergence results for Markov Chain Monte Carlo methods

no code implementations 17 Nov 2017 Alexander Terenin, Eric P. Xing

Markov Chain Monte Carlo (MCMC) methods such as Gibbs sampling are finding widespread use in applied statistics and machine learning.

Pólya Urn Latent Dirichlet Allocation: a doubly sparse massively parallel sampler

1 code implementation 12 Apr 2017 Alexander Terenin, Måns Magnusson, Leif Jonsson, David Draper

We conclude by comparing the performance of our algorithm with that of other approaches on well-known corpora.

Topic Models

GPU-accelerated Gibbs sampling: a case study of the Horseshoe Probit model

1 code implementation 15 Aug 2016 Alexander Terenin, Shawfeng Dong, David Draper

Gibbs sampling is a widely used Markov chain Monte Carlo (MCMC) method for numerically approximating integrals of interest in Bayesian statistics and other mathematical sciences.
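Gibbs sampling draws from a joint distribution by cycling through each variable's full conditional. A textbook example for a zero-mean bivariate normal with correlation ρ, where both conditionals are univariate Gaussians (this illustrates plain Gibbs sampling generally, not the paper's GPU-specific Horseshoe Probit sampler):

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_iter=20000, seed=0):
    # Alternate the exact full conditionals of a standard bivariate
    # normal with correlation rho:
    #   x | y ~ N(rho * y, 1 - rho^2)
    #   y | x ~ N(rho * x, 1 - rho^2)
    rng = np.random.default_rng(seed)
    x = y = 0.0
    sd = np.sqrt(1.0 - rho**2)
    samples = np.empty((n_iter, 2))
    for t in range(n_iter):
        x = rho * y + sd * rng.standard_normal()
        y = rho * x + sd * rng.standard_normal()
        samples[t] = x, y
    return samples
```

In this toy case every conditional update depends on the latest value of the other variable, which is exactly the sequential dependence that GPU parallelization of Gibbs sampling must work around.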

Computation · Distributed, Parallel, and Cluster Computing

Asynchronous Gibbs Sampling

no code implementations 30 Sep 2015 Alexander Terenin, Daniel Simpson, David Draper

We introduce a theoretical framework for analyzing asynchronous Gibbs sampling and other extensions of MCMC that do not possess the Markov property.

Computation
