NeurIPS 2013

Learning to Prune in Metric and Non-Metric Spaces

NeurIPS 2013 searchivarius/NonMetricSpaceLib

Our focus is on approximate nearest neighbor retrieval in metric and non-metric spaces.
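As a rough illustration of pruned search in metric spaces (a plain vantage-point tree with triangle-inequality pruning, not the paper's *learned* pruning rules), a minimal sketch:

```python
def build_vp_tree(points, dist):
    """Recursively build a vantage-point tree over `points` using metric `dist`."""
    if not points:
        return None
    vp, rest = points[0], points[1:]
    if not rest:
        return {"vp": vp, "mu": 0.0, "inside": None, "outside": None}
    dists = sorted(dist(vp, p) for p in rest)
    mu = dists[len(dists) // 2]  # median distance splits the vantage ball
    return {"vp": vp, "mu": mu,
            "inside": build_vp_tree([p for p in rest if dist(vp, p) <= mu], dist),
            "outside": build_vp_tree([p for p in rest if dist(vp, p) > mu], dist)}

def nn_search(node, q, dist, best=None):
    """1-NN search; the triangle inequality prunes subtrees that cannot help."""
    if node is None:
        return best
    d = dist(node["vp"], q)
    if best is None or d < best[0]:
        best = (d, node["vp"])
    near, far = ("inside", "outside") if d <= node["mu"] else ("outside", "inside")
    best = nn_search(node[near], q, dist, best)
    # Visit the far side only if it could still contain a closer point.
    if abs(d - node["mu"]) < best[0]:
        best = nn_search(node[far], q, dist, best)
    return best
```

In non-metric spaces the triangle inequality no longer licenses this pruning step, which is exactly the gap the paper's learned pruning addresses.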


Multi-Task Bayesian Optimization

NeurIPS 2013 HIPS/Spearmint

We demonstrate the utility of this new acquisition function by using a small dataset to cheaply explore hyperparameter settings for a larger one.
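The paper's multi-task GP and entropy-based acquisition function are beyond a snippet, but the underlying intuition (screen cheaply on the small task, spend the expensive budget only on promising configurations) can be caricatured as follows; both objective functions and all constants here are hypothetical stand-ins:

```python
import random

def cheap_score(cfg):
    # Stand-in for validation accuracy on the SMALL dataset (fast, noisy).
    # Hypothetical objective: best learning rate near 0.1.
    return -(cfg["lr"] - 0.1) ** 2 + random.gauss(0, 0.0004)

def expensive_score(cfg):
    # Stand-in for the LARGE dataset (slow; correlated with the cheap task,
    # with the optimum shifted slightly).
    return -(cfg["lr"] - 0.12) ** 2

random.seed(0)
candidates = [{"lr": lr / 100} for lr in range(1, 51)]
# Screen every candidate on the cheap task, then evaluate only a shortlist
# on the expensive task.
ranked = sorted(candidates, key=cheap_score, reverse=True)
shortlist = ranked[:5]
best = max(shortlist, key=expensive_score)
```

The paper replaces this crude two-stage screen with a principled acquisition function that trades off information gained per unit cost across tasks.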


Translating Embeddings for Modeling Multi-relational Data

NeurIPS 2013 Accenture/AmpliGraph

We consider the problem of embedding entities and relationships of multi-relational data in low-dimensional vector spaces.
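The paper's TransE model scores a triple (head, relation, tail) by how close head + relation lands to tail. A minimal NumPy sketch of the energy and one margin-ranking SGD step (toy dimensions; not the AmpliGraph API):

```python
import numpy as np

rng = np.random.default_rng(0)
n_ent, n_rel, dim = 5, 2, 8
E = rng.normal(size=(n_ent, dim))   # entity embeddings
R = rng.normal(size=(n_rel, dim))   # relation embeddings

def score(h, r, t):
    """TransE energy: low when E[h] + R[r] is close to E[t]."""
    return np.linalg.norm(E[h] + R[r] - E[t])

def sgd_step(h, r, t, t_neg, lr=0.01, margin=1.0):
    """One margin-ranking step on a (positive, corrupted) triple pair."""
    if score(h, r, t) + margin <= score(h, r, t_neg):
        return  # hinge inactive, zero gradient
    g_pos = E[h] + R[r] - E[t]
    g_pos /= np.linalg.norm(g_pos) + 1e-12
    g_neg = E[h] + R[r] - E[t_neg]
    g_neg /= np.linalg.norm(g_neg) + 1e-12
    E[h] -= lr * (g_pos - g_neg)
    R[r] -= lr * (g_pos - g_neg)
    E[t] += lr * g_pos       # pull the true tail closer
    E[t_neg] -= lr * g_neg   # push the corrupted tail away
```

Each step lowers the energy of the observed triple relative to a corrupted one, which is the whole training signal in the paper.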


Distributed Representations of Words and Phrases and their Compositionality

NeurIPS 2013 RaRe-Technologies/gensim-data

We present a simple method for finding phrases in text, and show that learning good vector representations for millions of phrases is possible.
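The phrase-finding method is a count-based score on adjacent word pairs: pairs that co-occur far more often than chance are merged into single tokens. A minimal sketch (the `delta` discount and threshold values are illustrative, not the paper's tuned settings):

```python
from collections import Counter

def find_phrases(tokens, delta=1, threshold=0.01):
    """Score adjacent word pairs; high scores mark collocations worth
    merging into single phrase tokens. `delta` discounts rare pairs."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    phrases = {}
    for (a, b), n_ab in bigrams.items():
        s = (n_ab - delta) / (unigrams[a] * unigrams[b])
        if s > threshold:
            phrases[(a, b)] = s
    return phrases
```

In the paper this pass is applied repeatedly, so already-merged bigrams can combine into longer phrases on the next pass.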

Zero-Shot Learning Through Cross-Modal Transfer

NeurIPS 2013 mganjoo/zslearning

This work introduces a model that can recognize objects in images even if no training data is available for the object class.
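The core mechanism is cross-modal: map image features into a word-vector space learned from text, then classify by nearest class word vector, so classes with no training images remain reachable. A toy sketch (the 3-d "word vectors" and the identity image-to-semantic map are placeholders for the paper's learned components):

```python
import numpy as np

# Toy semantic vectors for class names; "truck" has NO labeled images,
# so it plays the unseen class.
word_vec = {
    "cat":   np.array([1.0, 0.0, 0.0]),
    "dog":   np.array([0.9, 0.3, 0.0]),
    "truck": np.array([0.0, 0.0, 1.0]),
}

def image_to_semantic(x):
    # Stand-in for a regressor trained only on SEEN classes that maps
    # image features into the word-vector space.
    return x

def classify(x):
    """Nearest word vector wins, so unseen classes can be predicted too."""
    z = image_to_semantic(x)
    return min(word_vec, key=lambda c: np.linalg.norm(z - word_vec[c]))
```

The paper additionally detects whether an input looks like a seen or unseen class (via outlier probability) before deciding which classifier to trust; that gating is omitted here.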


Dynamic Clustering via Asymptotics of the Dependent Dirichlet Process Mixture

NeurIPS 2013 trevorcampbell/dynamic-means

This paper presents a novel algorithm, based upon the dependent Dirichlet process mixture model (DDPMM), for clustering batch-sequential data containing an unknown number of evolving clusters.
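The small-variance asymptotics behind the paper reduce Dirichlet-process clustering to a k-means-like procedure where far-away points open new clusters. A batch DP-means sketch of that limit (the paper's dynamic-means algorithm adds cluster birth/death/motion across time steps, which this omits):

```python
import numpy as np

def dp_means(X, lam, n_iter=10):
    """DP-means hard clustering: like k-means, but a point whose squared
    distance to every center exceeds `lam` spawns a new cluster, so the
    number of clusters is inferred from the data rather than fixed."""
    centers = [X[0].copy()]
    assign = [0] * len(X)
    for _ in range(n_iter):
        for i, x in enumerate(X):
            d2 = [float(np.sum((x - c) ** 2)) for c in centers]
            j = int(np.argmin(d2))
            if d2[j] > lam:
                centers.append(x.copy())
                j = len(centers) - 1
            assign[i] = j
        for j in range(len(centers)):
            members = [x for x, a in zip(X, assign) if a == j]
            if members:  # keep the old center if a cluster emptied out
                centers[j] = np.mean(members, axis=0)
    return np.array(centers), assign
```

The penalty `lam` plays the role the DP concentration parameter plays in the full Bayesian model.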

Bayesian entropy estimation for binary spike train data using parametric prior knowledge

NeurIPS 2013 pillowlab/CDMentropy

Shannon's entropy is a basic quantity in information theory, and a fundamental building block for the analysis of neural codes.
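The difficulty the paper addresses is that the maximum-likelihood ("plugin") entropy estimate is badly biased when the number of observed spike words is small relative to the 2^n possible binary words; a prior over word distributions regularizes it. A crude illustration of the smoothing idea (a symmetric Dirichlet prior, much simpler than the paper's CDM estimator built from parametric spike-train knowledge):

```python
import numpy as np

def plugin_entropy(counts):
    """Maximum-likelihood entropy in bits; biased downward for small samples."""
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def dirichlet_smoothed_entropy(counts, alpha=0.5):
    """Entropy of the posterior-mean distribution under a symmetric
    Dirichlet(alpha) prior over all possible spike words."""
    post = counts + alpha
    p = post / post.sum()
    return float(-(p * np.log2(p)).sum())
```

Unseen words get nonzero posterior mass, so the smoothed estimate counteracts the plugin estimator's downward bias.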

Stochastic blockmodel approximation of a graphon: Theory and consistent estimation

NeurIPS 2013 airoldilab/SBA

Given a convergent sequence of graphs, there exists a limit object called the graphon from which random graphs are generated.
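A stochastic blockmodel approximates the graphon by a piecewise-constant function: within each pair of blocks, edges are modeled as i.i.d. with one probability. Given block labels (the paper's SBA algorithm estimates these by clustering vertices on neighborhood distances, which is omitted here), the connectivity matrix is just a block-wise edge average:

```python
import numpy as np

def estimate_block_probs(A, labels, k):
    """Estimate the k x k stochastic-blockmodel connectivity matrix from an
    adjacency matrix by averaging edges within each pair of blocks."""
    B = np.zeros((k, k))
    for a in range(k):
        for b in range(k):
            rows = labels == a
            cols = labels == b
            block = A[np.ix_(rows, cols)]
            if a == b:
                n = int(rows.sum())
                # exclude the diagonal: no self-loops
                B[a, b] = block.sum() / max(n * (n - 1), 1)
            else:
                B[a, b] = block.mean()
    return B
```

The resulting step function over [0,1]^2 is the piecewise-constant graphon estimate whose consistency the paper analyzes.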

Generalized Denoising Auto-Encoders as Generative Models

NeurIPS 2013 cycentum/bert-based-text-generation

Recent work has shown how denoising and contractive autoencoders implicitly capture the structure of the data-generating density, in the case where the corruption noise is Gaussian, the reconstruction error is the squared error, and the data is continuous-valued.
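The paper's generative procedure is a Markov chain: corrupt the current sample, then apply the learned reconstruction, and repeat; its stationary distribution approximates the data distribution. A toy 1-D sketch where a kernel-regression posterior mean stands in for a trained denoiser (the paper samples from a full reconstruction distribution; the deterministic denoiser and bimodal toy data here are simplifying assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy bimodal "dataset": two tight clusters near -2 and +2.
data = np.concatenate([rng.normal(-2, 0.1, 200), rng.normal(2, 0.1, 200)])
sigma = 0.5  # corruption noise level

def denoise(x_tilde):
    """Kernel-regression stand-in for a trained denoising autoencoder:
    roughly E[X | corrupted value] under the empirical data distribution."""
    w = np.exp(-0.5 * ((x_tilde - data) / sigma) ** 2)
    return float((w * data).sum() / w.sum())

def sample_chain(n_steps=200, x0=0.0):
    """Generative chain: corrupt, denoise, repeat."""
    x = x0
    out = []
    for _ in range(n_steps):
        x_tilde = x + rng.normal(0, sigma)  # corruption step C(x~ | x)
        x = denoise(x_tilde)                # learned reconstruction step
        out.append(x)
    return np.array(out)
```

After a short transient the chain spends its time near the two data modes, which is the sense in which the denoiser implicitly captures the data-generating density.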


Phase Retrieval using Alternating Minimization

NeurIPS 2013 Jay-Lewis/phase_retrieval

Empirically, we demonstrate that alternating minimization performs similarly to recently proposed convex techniques for this problem (which are based on "lifting" to a convex matrix problem) in sample complexity and robustness to noise.
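The algorithm alternates between two easy subproblems: fix the current iterate and read off measurement phases, then solve the resulting linear least-squares problem for the signal, starting from a spectral initialization. A NumPy sketch under assumed toy dimensions (complex Gaussian measurements, no noise):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 20, 120                                 # signal dim, measurement count
A = rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n))
x_true = rng.normal(size=n) + 1j * rng.normal(size=n)
y = np.abs(A @ x_true)                         # phaseless magnitudes |Ax|

# Spectral initialization: top eigenvector of (1/m) * A^H diag(y^2) A.
S = (A.conj().T * y**2) @ A / m
x = np.linalg.eigh(S)[1][:, -1]

# Alternating minimization: estimate phases from the current iterate, then
# solve the now-linear least-squares problem for x.
for _ in range(100):
    z = A @ x
    phase = z / np.maximum(np.abs(z), 1e-12)
    x = np.linalg.lstsq(A, y * phase, rcond=None)[0]

rel_err = np.linalg.norm(np.abs(A @ x) - y) / np.linalg.norm(y)
```

Each half-step minimizes the residual over one block of variables, so the magnitude mismatch is nonincreasing; the recovered signal matches the truth only up to a global phase, which the magnitude residual is invariant to.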