Search Results for author: Paul K. Rubenstein

Found 13 papers, 2 papers with code

Learning Translation Quality Evaluation on Low Resource Languages from Large Language Models

no code implementations • 7 Feb 2023 • Amirkeivan Mohtashami, Mauro Verzetti, Paul K. Rubenstein

Learned metrics such as BLEURT have in recent years become widely employed to evaluate the quality of machine translation systems.

Machine Translation • Translation

Spatial Consistency Loss for Training Multi-Label Classifiers from Single-Label Annotations

no code implementations • 11 Mar 2022 • Thomas Verelst, Paul K. Rubenstein, Marcin Eichner, Tinne Tuytelaars, Maxim Berman

We show that adding a consistency loss, ensuring that the predictions of the network are consistent over consecutive training epochs, is a simple yet effective method to train multi-label classifiers in a weakly supervised setting.

Data Augmentation • Multi-Label Classification • +1
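The abstract above describes a consistency loss that penalizes disagreement between a network's predictions in consecutive training epochs. A minimal NumPy sketch of that idea (not the paper's exact spatial formulation; the mean-squared penalty here is an illustrative assumption):

```python
import numpy as np

def consistency_loss(p_current, p_previous):
    """Illustrative sketch: penalize disagreement between the multi-label
    predictions of the current epoch and those stored from the previous
    epoch, via the mean squared difference over all labels."""
    p_current = np.asarray(p_current, dtype=float)
    p_previous = np.asarray(p_previous, dtype=float)
    return float(np.mean((p_current - p_previous) ** 2))

# Toy usage: predictions for 2 images over 3 labels in two epochs.
prev = [[0.9, 0.1, 0.2], [0.3, 0.8, 0.5]]
curr = [[0.8, 0.2, 0.2], [0.3, 0.7, 0.4]]
print(consistency_loss(curr, prev))
```

During training this term would be added to the usual (partial) classification loss, so labels that were never annotated are still constrained to stable predictions.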

On Mutual Information Maximization for Representation Learning

2 code implementations • ICLR 2020 • Michael Tschannen, Josip Djolonga, Paul K. Rubenstein, Sylvain Gelly, Mario Lucic

Many recent methods for unsupervised or self-supervised representation learning train feature extractors by maximizing an estimate of the mutual information (MI) between different views of the data.

Inductive Bias • Representation Learning • +1
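A common estimate of the mutual information between two views, of the kind the abstract refers to, is the InfoNCE lower bound: score all pairs of views with a critic and take a softmax over the positives. A small NumPy sketch (the critic scores here are assumed given, not learned):

```python
import numpy as np

def infonce_lower_bound(scores):
    """InfoNCE-style MI lower bound from an (n, n) matrix of critic scores,
    where scores[i, j] scores the view pair (x_i, y_j) and the diagonal
    holds the positive (jointly drawn) pairs. The bound cannot exceed
    log n."""
    scores = np.asarray(scores, dtype=float)
    n = scores.shape[0]
    # log-normalizer of an n-way softmax along each row
    log_norm = np.log(np.exp(scores).sum(axis=1))
    return float(np.mean(np.diag(scores) - log_norm) + np.log(n))

# Toy usage: a strong diagonal gives a bound near log n; flat scores give 0.
print(infonce_lower_bound(np.eye(4) * 10.0))  # close to log 4
print(infonce_lower_bound(np.zeros((4, 4))))  # 0
```

The log n cap on this estimator is one reason the paper's question (does a larger MI estimate imply better representations?) is subtle in practice.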

Practical and Consistent Estimation of f-Divergences

1 code implementation • NeurIPS 2019 • Paul K. Rubenstein, Olivier Bousquet, Josip Djolonga, Carlos Riquelme, Ilya Tolstikhin

The estimation of an f-divergence between two probability distributions based on samples is a fundamental problem in statistics and machine learning.

BIG-bench Machine Learning • Mutual Information Estimation • +1
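An f-divergence has the form D_f(P‖Q) = E_Q[f(dP/dQ)], so given the density ratio it reduces to a Monte Carlo average. A sketch of that reduction for the KL divergence (this is the textbook estimator, not the paper's method; in practice the ratio must itself be estimated from samples, which is the hard part the paper addresses):

```python
import numpy as np

def f_divergence_mc(f, ratio_values):
    """Monte Carlo sketch of D_f(P || Q) = E_Q[f(dP/dQ)], assuming the
    density ratio dP/dQ has already been evaluated at samples from Q."""
    ratio_values = np.asarray(ratio_values, dtype=float)
    return float(np.mean(f(ratio_values)))

# KL divergence corresponds to f(t) = t * log(t).
kl_f = lambda t: t * np.log(t)

rng = np.random.default_rng(0)
# P = N(1, 1), Q = N(0, 1): the true KL(P || Q) is 0.5.
x = rng.normal(0.0, 1.0, size=200_000)                 # samples from Q
ratio = np.exp(-0.5 * (x - 1.0) ** 2 + 0.5 * x ** 2)   # dP/dQ at x
print(f_divergence_mc(kl_f, ratio))  # ≈ 0.5
```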

The Incomplete Rosetta Stone Problem: Identifiability Results for Multi-View Nonlinear ICA

no code implementations • 16 May 2019 • Luigi Gresele, Paul K. Rubenstein, Arash Mehrjou, Francesco Locatello, Bernhard Schölkopf

In contrast to known identifiability results for nonlinear ICA, we prove that independent latent sources with arbitrary mixing can be recovered as long as multiple, sufficiently different noisy views are available.

An Empirical Study of Generative Models with Encoders

no code implementations • 19 Dec 2018 • Paul K. Rubenstein, Yunpeng Li, Dominik Roblek

Generative adversarial networks (GANs) are capable of producing high quality image samples.

On the Latent Space of Wasserstein Auto-Encoders

no code implementations • 11 Feb 2018 • Paul K. Rubenstein, Bernhard Schoelkopf, Ilya Tolstikhin

We study the role of latent space dimensionality in Wasserstein auto-encoders (WAEs).


Probabilistic Active Learning of Functions in Structural Causal Models

no code implementations • 30 Jun 2017 • Paul K. Rubenstein, Ilya Tolstikhin, Philipp Hennig, Bernhard Schoelkopf

We consider the problem of learning the functions computing children from parents in a Structural Causal Model once the underlying causal graph has been identified.

Active Learning • Causal Discovery • +1
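The setting in the abstract, learning the function that computes a child from its parents once the graph is known, can be pictured with a two-variable SCM. A hypothetical sketch (the linear mechanism, the intervention grid, and the least-squares fit are all illustrative assumptions, not the paper's Bayesian active-learning procedure):

```python
import numpy as np

# Hypothetical SCM X -> Y with Y = f(X) + N, and a least-squares
# estimate of f from observations under interventions do(X = x).
rng = np.random.default_rng(1)
f = lambda x: 2.0 * x + 1.0             # unknown ground-truth mechanism
x = np.linspace(-3, 3, 40)              # intervention values do(X = x)
y = f(x) + rng.normal(0, 0.1, x.size)   # observed child under intervention
coef = np.polyfit(x, y, 1)              # estimate of f within a linear class
print(coef)  # ≈ [2.0, 1.0]
```

The active-learning question the paper studies is which intervention values to query next so that the posterior over f shrinks fastest.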

From Deterministic ODEs to Dynamic Structural Causal Models

no code implementations • 29 Aug 2016 • Paul K. Rubenstein, Stephan Bongers, Bernhard Schoelkopf, Joris M. Mooij

Structural Causal Models are widely used in causal modelling, but how they relate to other modelling tools is poorly understood.

A Kernel Test for Three-Variable Interactions with Random Processes

no code implementations • 2 Mar 2016 • Paul K. Rubenstein, Kacper P. Chwialkowski, Arthur Gretton

The main contributions of this paper are twofold: first, we prove that the Lancaster statistic satisfies the conditions required to estimate the quantiles of the null distribution using the wild bootstrap; second, the manner in which this is proved is novel, simpler than existing methods, and can further be applied to other statistics.
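The wild bootstrap mentioned above estimates null quantiles for test statistics computed on dependent data by reweighting the statistic with an autocorrelated random process. A generic sketch for a quadratic V-statistic (this is not the paper's Lancaster statistic; the AR(1) process, block length, and V-statistic form are illustrative assumptions):

```python
import numpy as np

def wild_bootstrap_quantile(h, n_boot=2000, block=10, level=0.95, seed=0):
    """Sketch: estimate the null quantile of the V-statistic
    sum_{i,j} h[i, j] / n^2 by the wild bootstrap. Each replicate
    reweights h with a stationary AR(1) process W_t, which mimics the
    temporal dependence of the underlying random process."""
    rng = np.random.default_rng(seed)
    n = h.shape[0]
    rho = np.exp(-1.0 / block)          # AR(1) coefficient from block length
    stats = np.empty(n_boot)
    for b in range(n_boot):
        eps = rng.normal(0.0, np.sqrt(1.0 - rho ** 2), size=n)
        w = np.empty(n)
        w[0] = rng.normal()             # stationary start, unit variance
        for t in range(1, n):
            w[t] = rho * w[t - 1] + eps[t]
        stats[b] = w @ h @ w / n ** 2   # reweighted V-statistic
    return float(np.quantile(stats, level))
```

A test then rejects the null hypothesis when the observed statistic exceeds the returned quantile.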

