Filtering Variational Objectives

NeurIPS 2017 tensorflow/models

When used as a surrogate objective for maximum likelihood estimation in latent variable models, the evidence lower bound (ELBO) produces state-of-the-art results.
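The ELBO the abstract refers to is the bound log p(x) ≥ E_q[log p(x, z) − log q(z)]. A minimal Monte Carlo sketch on an illustrative conjugate Gaussian toy model (the model and function names here are assumptions, not the paper's setup):

```python
import math
import random

def elbo_estimate(x, n_samples=100, seed=0):
    """Monte Carlo estimate of the ELBO, log p(x) >= E_q[log p(x, z) - log q(z)],
    for an illustrative conjugate toy model:
    prior z ~ N(0, 1), likelihood x | z ~ N(z, 1).
    Here q(z) = N(x/2, 1/2) is the exact posterior, so the bound is tight
    and the estimate equals log p(x) = log N(x; 0, 2)."""
    rng = random.Random(seed)
    mu_q, var_q = x / 2.0, 0.5

    def log_normal(v, mean, var):
        return -0.5 * (math.log(2.0 * math.pi * var) + (v - mean) ** 2 / var)

    total = 0.0
    for _ in range(n_samples):
        z = rng.gauss(mu_q, math.sqrt(var_q))
        log_joint = log_normal(z, 0.0, 1.0) + log_normal(x, z, 1.0)
        total += log_joint - log_normal(z, mu_q, var_q)
    return total / n_samples
```

Because q is the exact posterior in this toy case, every sample of log p(x, z) − log q(z) equals log p(x), so the estimator has zero variance; with an approximate q the same estimator gives a strict lower bound.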

REBAR: Low-variance, unbiased gradient estimates for discrete latent variable models

NeurIPS 2017 tensorflow/models

Learning in models with discrete latent variables is challenging due to high variance gradient estimators.
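The high variance in question comes from score-function (REINFORCE) estimators. A minimal sketch for a single Bernoulli latent, with a constant baseline as the simplest variance-reduction device (REBAR itself builds a control variate from a continuous relaxation, which is not shown; all names are illustrative):

```python
import random

def reinforce_grad(theta, f, n_samples=100000, baseline=0.0, seed=0):
    """Score-function (REINFORCE) estimate of d/dtheta E_{b ~ Bern(theta)}[f(b)].
    Uses d/dtheta log p(b) = (b - theta) / (theta * (1 - theta)).
    Subtracting a constant baseline keeps the estimator unbiased while
    changing its variance -- the effect control variates like REBAR exploit."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        b = 1.0 if rng.random() < theta else 0.0
        score = (b - theta) / (theta * (1.0 - theta))
        total += (f(b) - baseline) * score
    return total / n_samples
```

For f(b) = (b − 0.45)², the exact gradient is f(1) − f(0) = 0.1; even with 200,000 samples the raw estimator only matches it to a couple of decimal places, which is the variance problem the paper addresses.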

LightGBM: A Highly Efficient Gradient Boosting Decision Tree

NeurIPS 2017 Microsoft/LightGBM

We prove that, because data instances with larger gradients play a more important role in computing the information gain, GOSS (Gradient-based One-Side Sampling) can obtain an accurate estimate of the information gain from a much smaller data sample.
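The GOSS idea can be sketched in a few lines: keep every instance in the top fraction by gradient magnitude, uniformly sample the rest, and re-weight the sampled part so gain estimates stay approximately unbiased (parameter names here are illustrative, not LightGBM's API):

```python
import random

def goss_sample(gradients, top_rate=0.2, other_rate=0.1, seed=0):
    """Sketch of Gradient-based One-Side Sampling (GOSS): keep every instance
    among the top_rate fraction with the largest |gradient|, uniformly sample
    an other_rate fraction of the remainder, and up-weight the sampled
    small-gradient instances by (1 - top_rate) / other_rate so split-gain
    estimates stay approximately unbiased."""
    rng = random.Random(seed)
    n = len(gradients)
    order = sorted(range(n), key=lambda i: abs(gradients[i]), reverse=True)
    n_top = int(n * top_rate)
    n_other = int(n * other_rate)
    sampled = rng.sample(order[n_top:], n_other)
    weight = (1.0 - top_rate) / other_rate
    # (instance index, weight) pairs used when accumulating gradient statistics
    return [(i, 1.0) for i in order[:n_top]] + [(i, weight) for i in sampled]
```

With top_rate = 0.2 and other_rate = 0.1, only 30% of the data is touched per split, and each sampled small-gradient instance counts 8× when gains are accumulated.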

Attention Is All You Need

NeurIPS 2017 tensorflow/tensor2tensor

The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration.

CONSTITUENCY PARSING · MACHINE TRANSLATION
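The Transformer replaces recurrence and convolution with attention; its core operation is scaled dot-product attention, softmax(QKᵀ/√d_k)V. A minimal single-head sketch (batching and masking omitted):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V -- the core
    operation of the Transformer's attention layers."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_q, n_k) similarity logits
    scores -= scores.max(axis=-1, keepdims=True)    # stabilise the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
    return weights @ V
```

The 1/√d_k scaling keeps the logits from growing with key dimension, which would otherwise push the softmax into regions with vanishing gradients.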

GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium

NeurIPS 2017 jantic/DeOldify

Generative Adversarial Networks (GANs) excel at creating realistic images with complex models for which maximum likelihood is infeasible.

IMAGE GENERATION
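Alongside the two time-scale update rule, this paper introduced the Fréchet Inception Distance (FID) for evaluating GAN samples. A sketch of the underlying Fréchet distance between two Gaussians, given means and covariances of feature activations (the Inception feature extraction itself is omitted, and function names are illustrative):

```python
import numpy as np

def _psd_sqrt(M):
    """Matrix square root of a symmetric positive semi-definite matrix."""
    vals, vecs = np.linalg.eigh(M)
    return vecs @ np.diag(np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T

def frechet_distance(mu1, cov1, mu2, cov2):
    """Frechet distance between N(mu1, cov1) and N(mu2, cov2):
    ||mu1 - mu2||^2 + Tr(cov1 + cov2 - 2 (cov1 cov2)^{1/2}).
    The trace term is computed via the equivalent symmetric PSD form
    (cov1^{1/2} cov2 cov1^{1/2})^{1/2}."""
    s1 = _psd_sqrt(cov1)
    covmean = _psd_sqrt(s1 @ cov2 @ s1)
    diff = mu1 - mu2
    return float(diff @ diff + np.trace(cov1 + cov2 - 2.0 * covmean))
```

The distance is zero only when the two Gaussians coincide, so lower FID indicates generated-feature statistics closer to the real data's.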

A Unified Approach to Interpreting Model Predictions

NeurIPS 2017 slundberg/shap

Understanding why a model makes a certain prediction can be as crucial as the prediction's accuracy in many applications.

FEATURE IMPORTANCE · INTERPRETABLE MACHINE LEARNING
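SHAP attributes a prediction to features via Shapley values from cooperative game theory. A brute-force sketch that enumerates all coalitions (the value function and names are illustrative; the paper's contribution is computing these efficiently, not this enumeration):

```python
from itertools import combinations
from math import factorial

def shapley_values(value_fn, n_features):
    """Exact Shapley values by enumerating every feature coalition.
    value_fn(S) is the model's output with only the features in frozenset S
    'present'. Exponential in n_features, which is why SHAP relies on
    efficient approximations in practice."""
    phi = [0.0] * n_features
    for i in range(n_features):
        others = [j for j in range(n_features) if j != i]
        for r in range(len(others) + 1):
            for combo in combinations(others, r):
                S = frozenset(combo)
                w = (factorial(len(S)) * factorial(n_features - len(S) - 1)
                     / factorial(n_features))
                phi[i] += w * (value_fn(S | {i}) - value_fn(S))
    return phi
```

For an additive model the attribution of each feature is exactly its own contribution, and the values always sum to the gap between the full prediction and the empty-coalition baseline (the efficiency property).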

Improved Training of Wasserstein GANs

NeurIPS 2017 tensorpack/tensorpack

Generative Adversarial Networks (GANs) are powerful generative models, but suffer from training instability.

CONDITIONAL IMAGE GENERATION
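The paper's fix for that instability is a gradient penalty on the critic in place of weight clipping. A sketch of the penalty term, λ·E[(‖∇_x̂ D(x̂)‖₂ − 1)²] on random interpolates x̂; note the input gradient is estimated here by finite differences purely for illustration, where a real implementation would use automatic differentiation:

```python
import numpy as np

def gradient_penalty(critic, real, fake, lam=10.0, eps=1e-5, seed=0):
    """WGAN-GP penalty term: lam * E[(||grad_xhat critic(xhat)||_2 - 1)^2],
    with xhat sampled uniformly on straight lines between real and fake
    samples. The input gradient is estimated by central finite differences
    here; a real implementation would use automatic differentiation."""
    rng = np.random.default_rng(seed)
    alpha = rng.uniform(size=(real.shape[0], 1))
    xhat = alpha * real + (1.0 - alpha) * fake   # random interpolates
    penalties = []
    for x in xhat:
        grad = np.zeros_like(x)
        for j in range(x.size):
            e = np.zeros_like(x)
            e[j] = eps
            grad[j] = (critic(x + e) - critic(x - e)) / (2.0 * eps)
        penalties.append((np.linalg.norm(grad) - 1.0) ** 2)
    return lam * float(np.mean(penalties))
```

A critic whose input gradient has unit norm everywhere, such as a linear critic with a unit-norm weight vector, incurs zero penalty, which is exactly the 1-Lipschitz behaviour the penalty encourages.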

Dynamic Routing Between Capsules

NeurIPS 2017 naturomics/CapsNet-Tensorflow

We use the length of a capsule's activity vector to represent the probability that the entity it detects exists, and its orientation to represent the instantiation parameters.

IMAGE CLASSIFICATION
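Interpreting vector length as a probability requires keeping it below 1; the paper's "squashing" nonlinearity does this while preserving orientation. A minimal sketch:

```python
import numpy as np

def squash(s, eps=1e-9):
    """Capsule 'squashing' nonlinearity:
    v = (||s||^2 / (1 + ||s||^2)) * (s / ||s||).
    Short vectors shrink toward zero length and long vectors approach
    length 1, so the output length reads as an existence probability while
    the direction encodes the instantiation parameters."""
    norm_sq = float(np.dot(s, s))
    return (norm_sq / (1.0 + norm_sq)) * s / (np.sqrt(norm_sq) + eps)
```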

ELF: An Extensive, Lightweight and Flexible Research Platform for Real-time Strategy Games

NeurIPS 2017 facebookresearch/ELF

In addition, our platform is flexible in terms of environment-agent communication topologies, choices of RL methods, and changes in game parameters, and can host existing C/C++-based game environments such as the Arcade Learning Environment.

ATARI GAMES · STARCRAFT

Self-Normalizing Neural Networks

NeurIPS 2017 bioinf-jku/SNNs

We introduce self-normalizing neural networks (SNNs) to enable high-level abstract representations.

DRUG DISCOVERY · PULSAR PREDICTION
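The mechanism behind SNNs is the SELU activation, whose two constants are derived so that zero mean and unit variance are a stable fixed point of the activations across layers. A scalar sketch:

```python
import math

# SELU constants from the SNN paper, chosen so that zero mean and unit
# variance propagate as a stable fixed point through the network.
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x):
    """Scaled exponential linear unit:
    selu(x) = SCALE * x                   for x > 0
            = SCALE * ALPHA * (e^x - 1)   otherwise."""
    return SCALE * x if x > 0 else SCALE * ALPHA * (math.exp(x) - 1.0)
```

Unlike ReLU, SELU saturates at −SCALE·ALPHA ≈ −1.76 for large negative inputs, which is what lets it push mean activations back toward zero without explicit batch normalization.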