NeurIPS 2016

Deep Exploration via Bootstrapped DQN

NeurIPS 2016 tensorflow/models

Efficient exploration in complex environments remains a major challenge for reinforcement learning.

ATARI GAMES EFFICIENT EXPLORATION
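
The paper's core idea is an ensemble of bootstrapped Q-heads on a shared network: one head is sampled per episode and followed greedily, giving temporally extended exploration. Below is a minimal numpy sketch of that idea with tabular Q-values standing in for the network heads; all names and hyperparameters are illustrative, not the released implementation.

```python
import numpy as np

# Bootstrapped-heads sketch: K Q-estimates share experience; one head is sampled
# per episode and acted on greedily, which yields "deep" (temporally consistent)
# exploration. Tabular Q-values stand in for neural-network heads.
n_states, n_actions, n_heads = 10, 4, 5
q_heads = np.zeros((n_heads, n_states, n_actions))   # one Q-table per head

def act(state, head):
    """Act greedily with respect to the head chosen for this episode."""
    return int(np.argmax(q_heads[head, state]))

def update(state, action, reward, next_state, done,
           alpha=0.1, gamma=0.99, p_mask=0.5):
    """Q-learning update applied to a random (bootstrap-masked) subset of heads."""
    mask = np.random.rand(n_heads) < p_mask
    for k in np.flatnonzero(mask):
        target = reward + (0.0 if done else gamma * q_heads[k, next_state].max())
        q_heads[k, state, action] += alpha * (target - q_heads[k, state, action])

episode_head = np.random.randint(n_heads)   # resampled at the start of each episode
action = act(0, episode_head)
```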

Unsupervised Learning for Physical Interaction through Video Prediction

NeurIPS 2016 tensorflow/models

A core challenge for an agent learning to interact with the world is to predict how its actions affect objects in its environment.

VIDEO PREDICTION

Visual Dynamics: Probabilistic Future Frame Synthesis via Cross Convolutional Networks

NeurIPS 2016 tensorflow/models

We study the problem of synthesizing a number of likely future frames from a single input image.

Can Active Memory Replace Attention?

NeurIPS 2016 tensorflow/models

Several mechanisms to focus the attention of a neural network on selected parts of its input or memory have been used successfully in deep learning models in recent years.

IMAGE CAPTIONING MACHINE TRANSLATION
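
For reference, the "attention" the paper contrasts with active memory is a soft, content-based read of a memory: a query is compared to every slot and the result is a weighted combination. The sketch below is a minimal numpy version of that mechanism, not the paper's active-memory model; shapes and names are illustrative.

```python
import numpy as np

# Soft content-based attention: score each memory slot against a query,
# normalize with softmax, and return the weighted read.
def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, memory):
    """query: (d,), memory: (n, d) -> convex combination of memory rows."""
    scores = memory @ query        # similarity of the query to each slot
    weights = softmax(scores)      # attention distribution over slots
    return weights @ memory        # weighted read of the memory

memory = np.random.randn(6, 8)
query = np.random.randn(8)
context = attend(query, memory)    # (8,) read vector
```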

Domain Separation Networks

NeurIPS 2016 tensorflow/models

Existing domain adaptation approaches, by focusing only on creating a mapping or shared representation between the two domains, ignore the individual characteristics of each domain.

UNSUPERVISED DOMAIN ADAPTATION
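
The separation in the title comes from splitting each domain's representation into a shared part and a private part, with a "difference" loss that keeps the two subspaces orthogonal. The snippet below is a hedged numpy sketch of that loss with stand-in linear encoders; names and sizes are illustrative, not the released model.

```python
import numpy as np

# Shared/private split: a shared encoder is used by both domains, each domain also
# has a private encoder, and the difference loss penalizes overlap between the two.
d_in, d_hid = 32, 16
W_shared  = np.random.randn(d_in, d_hid) * 0.1   # shared across source and target
W_private = np.random.randn(d_in, d_hid) * 0.1   # one per domain in the full model

def difference_loss(x):
    """Squared Frobenius norm of the cross-correlation between shared and private codes."""
    h_s = x @ W_shared                  # shared representation   (batch, d_hid)
    h_p = x @ W_private                 # private representation  (batch, d_hid)
    return np.sum((h_s.T @ h_p) ** 2)

x = np.random.randn(8, d_in)
print(difference_loss(x))
```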

Memory-Efficient Backpropagation Through Time

NeurIPS 2016 openai/gradient-checkpointing

We propose a novel approach to reduce memory consumption of the backpropagation through time (BPTT) algorithm when training recurrent neural networks (RNNs).
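
The memory saving comes from the same trade-off used in gradient checkpointing: store only a subset of hidden states on the forward pass and recompute the rest when the backward pass needs them. The sketch below shows the fixed-interval variant of that idea on a toy numpy RNN (the paper instead derives an optimal checkpointing schedule via dynamic programming); all names are illustrative.

```python
import numpy as np

# Checkpointed BPTT sketch: keep hidden states only every `stride` steps,
# then recompute a segment from its nearest checkpoint during backprop.
T, d = 64, 8
W = np.random.randn(d, d) * 0.1
xs = np.random.randn(T, d)

def step(h, x):
    return np.tanh(h @ W + x)

def forward_with_checkpoints(h0, stride=8):
    """Run the RNN forward, storing only every `stride`-th hidden state."""
    checkpoints = {0: h0}
    h = h0
    for t in range(T):
        h = step(h, xs[t])
        if (t + 1) % stride == 0:
            checkpoints[t + 1] = h
    return checkpoints

def recompute_segment(checkpoints, start, end):
    """Recompute the hidden states in [start, end) from the checkpoint at `start`."""
    h = checkpoints[start]
    hs = []
    for t in range(start, end):
        h = step(h, xs[t])
        hs.append(h)
    return hs   # the backward pass consumes these, then discards them

ckpts = forward_with_checkpoints(np.zeros(d))
segment = recompute_segment(ckpts, 8, 16)
```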

Dynamic Network Surgery for Efficient DNNs

NeurIPS 2016 NervanaSystems/distiller

In this paper, we propose a novel network compression method called dynamic network surgery, which can remarkably reduce network complexity through on-the-fly connection pruning.
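
The "surgery" part is that pruning is reversible: weak connections are masked out by magnitude, but the dense weights keep training, and a pruned connection whose magnitude recovers is spliced back in. The numpy sketch below illustrates that prune-and-splice mask update under assumed thresholds and a placeholder gradient; it is not the paper's released training code.

```python
import numpy as np

# Prune-and-splice sketch: a binary mask zeroes weak weights, with hysteresis so a
# pruned connection can be re-enabled if its magnitude grows back.
def update_mask(weights, mask, prune_thresh=0.05, splice_thresh=0.10):
    """Prune below one threshold, splice back above the other, keep the rest as-is."""
    mask = mask.copy()
    mask[np.abs(weights) < prune_thresh] = 0.0
    mask[np.abs(weights) > splice_thresh] = 1.0
    return mask

weights = np.random.randn(4, 4) * 0.1
mask = np.ones_like(weights)
for _ in range(100):                               # stand-in training loop
    mask = update_mask(weights, mask)
    grad = np.random.randn(4, 4) * 0.01            # placeholder for dL/dW through weights * mask
    weights -= 0.1 * grad                          # the dense weights are updated, not the masked copy
print("fraction of connections kept:", mask.mean())
```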

InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets

NeurIPS 2016 openai/InfoGAN

This paper describes InfoGAN, an information-theoretic extension to the Generative Adversarial Network that is able to learn disentangled representations in a completely unsupervised manner.

IMAGE GENERATION REPRESENTATION LEARNING
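
The disentanglement comes from maximizing a variational lower bound on the mutual information between a latent code c and the generated sample, implemented with an auxiliary head Q(c | x) that tries to recover the code. Below is a hedged numpy sketch of that extra loss term for a categorical code; the "networks" are stand-in arrays and all names are illustrative.

```python
import numpy as np

# InfoGAN's mutual-information term for a categorical code: cross-entropy of Q's
# prediction against the code used to generate the sample. Minimizing it w.r.t.
# both the generator and Q maximizes the MI lower bound.
n_cat = 10

def sample_code(batch):
    return np.random.randint(n_cat, size=batch)    # c ~ uniform categorical

def mutual_info_loss(q_logits, code):
    """Negative log-likelihood of the true code under Q's categorical prediction."""
    q_logits = q_logits - q_logits.max(axis=1, keepdims=True)
    log_probs = q_logits - np.log(np.exp(q_logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(code)), code].mean()

code = sample_code(8)
q_logits = np.random.randn(8, n_cat)   # stand-in for Q(G(z, c)); in practice a network head
print(mutual_info_loss(q_logits, code))
```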

Higher-Order Factorization Machines

NeurIPS 2016 geffy/tffm

Factorization machines (FMs) are a supervised learning approach that can use second-order feature combinations even when the data is very high-dimensional.

LINK PREDICTION
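
The second-order FM the paper generalizes predicts y(x) = w0 + ⟨w, x⟩ + Σ_{i<j} ⟨v_i, v_j⟩ x_i x_j, and the pairwise term can be computed in O(nk) with the square-of-sums identity. The snippet below is a small numpy sketch of that second-order prediction (higher-order FMs extend it to interactions beyond pairs); names are illustrative.

```python
import numpy as np

# Second-order FM prediction with the O(nk) pairwise trick:
# sum_{i<j} <v_i, v_j> x_i x_j = 0.5 * sum_f [ (sum_i v_if x_i)^2 - sum_i v_if^2 x_i^2 ]
def fm_predict(x, w0, w, V):
    """x: (n,) features, w0: bias, w: (n,) linear weights, V: (n, k) factor matrix."""
    linear = w0 + w @ x
    s = V.T @ x                              # (k,) per-factor sums
    s_sq = (V ** 2).T @ (x ** 2)             # (k,) per-factor sums of squares
    pairwise = 0.5 * np.sum(s ** 2 - s_sq)
    return linear + pairwise

n, k = 100, 8
x = np.random.rand(n)
print(fm_predict(x, 0.0, np.random.randn(n) * 0.01, np.random.randn(n, k) * 0.01))
```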

Improved Variational Inference with Inverse Autoregressive Flow

NeurIPS 2016 openai/iaf

The framework of normalizing flows provides a general strategy for flexible variational inference of posteriors over latent variables.

IMAGE GENERATION
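
A single IAF step rescales and shifts z with parameters produced autoregressively from z itself, so the Jacobian is triangular and the density update is just the sum of log scales. The numpy sketch below uses strictly lower-triangular linear maps as a stand-in for the paper's MADE-style autoregressive network; it is an illustration of one flow step, not the released model.

```python
import numpy as np

# One inverse autoregressive flow step: z_new = sigma(z) * z + mu(z), where mu and
# sigma for dimension i depend only on z_{<i}, giving a triangular Jacobian with
# log-determinant sum(log sigma).
d = 6
W_mu = np.tril(np.random.randn(d, d) * 0.1, k=-1)   # strictly lower-triangular masks
W_s  = np.tril(np.random.randn(d, d) * 0.1, k=-1)   # enforce the autoregressive structure

def iaf_step(z, log_q):
    mu = W_mu @ z
    sigma = np.exp(W_s @ z) + 1e-3                   # keep the scale positive
    z_new = sigma * z + mu
    log_q_new = log_q - np.sum(np.log(sigma))        # density change from the triangular Jacobian
    return z_new, log_q_new

eps = np.random.randn(d)
log_q0 = -0.5 * np.sum(eps ** 2 + np.log(2 * np.pi))   # log N(eps; 0, I)
z, log_q = iaf_step(eps, log_q0)
```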