LightGBM: A Highly Efficient Gradient Boosting Decision Tree

NeurIPS 2017 Microsoft/LightGBM

We prove that, because data instances with larger gradients play a more important role in the computation of information gain, GOSS can obtain an accurate estimate of the information gain from a much smaller sample of the data.
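
The sampling scheme described above keeps the top a×100% of instances by gradient magnitude, uniformly samples b×100% of the remainder, and reweights the sampled small-gradient instances by (1−a)/b so the information-gain estimate stays approximately unbiased. A minimal numpy sketch, where `goss_sample` is a hypothetical helper and not the actual LightGBM API:

```python
import numpy as np

def goss_sample(gradients, a=0.2, b=0.1, rng=None):
    # Hypothetical helper illustrating Gradient-based One-Side Sampling (GOSS).
    if rng is None:
        rng = np.random.default_rng(0)
    n = len(gradients)
    top_n = int(a * n)
    rand_n = int(b * n)
    order = np.argsort(np.abs(gradients))[::-1]
    top_idx = order[:top_n]            # keep all large-gradient instances
    rest = order[top_n:]
    rand_idx = rng.choice(rest, size=rand_n, replace=False)
    weights = np.ones(n)
    weights[rand_idx] = (1 - a) / b    # amplify sampled small-gradient instances
    used = np.concatenate([top_idx, rand_idx])
    return used, weights[used]

g = np.linspace(0.0, 1.0, 100)
used, w = goss_sample(g, a=0.2, b=0.1)  # 20 kept + 10 sampled = 30 instances
```

With a=0.2 and b=0.1, each sampled small-gradient instance carries weight (1−0.2)/0.1 = 8 when computing information gain, compensating for the 90% of small-gradient data that was dropped.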

A Unified Approach to Interpreting Model Predictions

NeurIPS 2017 slundberg/shap

Understanding why a model makes a certain prediction can be as crucial as the prediction's accuracy in many applications.

FEATURE IMPORTANCE INTERPRETABLE MACHINE LEARNING

ELF: An Extensive, Lightweight and Flexible Research Platform for Real-time Strategy Games

NeurIPS 2017 facebookresearch/ELF

In addition, our platform is flexible in terms of environment-agent communication topologies, choices of RL methods, and changes in game parameters, and it can host existing C/C++-based game environments such as the Arcade Learning Environment.

ATARI GAMES STARCRAFT

Self-Normalizing Neural Networks

NeurIPS 2017 bioinf-jku/SNNs

We introduce self-normalizing neural networks (SNNs) to enable high-level abstract representations.

DRUG DISCOVERY PULSAR PREDICTION
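
The self-normalizing property of SNNs comes from the scaled exponential linear unit (SELU) activation, whose constants α and λ are derived in the paper so that zero mean and unit variance form a stable fixed point of layer-to-layer propagation. A minimal numpy sketch (the function name `selu` mirrors the paper's terminology):

```python
import numpy as np

# Constants from the paper, chosen so that mean 0 / variance 1 is a fixed point.
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    # Positive inputs are scaled linearly; negative inputs saturate at -LAMBDA*ALPHA.
    return LAMBDA * np.where(x > 0, x, ALPHA * (np.exp(x) - 1))

# For standard-normal pre-activations, the output stays near mean 0, variance 1.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = selu(x)
```

Unlike batch normalization, this keeps activations normalized without any extra learned parameters or batch statistics, which is what makes very deep fully-connected networks trainable in the paper's experiments.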

Bayesian GAN

NeurIPS 2017 andrewgordonwilson/bayesgan

Generative adversarial networks (GANs) can implicitly learn rich distributions over images, audio, and other data that are hard to model with an explicit likelihood.

Toward Multimodal Image-to-Image Translation

NeurIPS 2017 junyanz/BicycleGAN

Our proposed method encourages bijective consistency between the latent encoding and output modes.

IMAGE-TO-IMAGE TRANSLATION

Inductive Representation Learning on Large Graphs

NeurIPS 2017 williamleif/GraphSAGE

Low-dimensional embeddings of nodes in large graphs have proved extremely useful in a variety of prediction tasks, from content recommendation to identifying protein functions.

LINK PREDICTION NODE CLASSIFICATION REPRESENTATION LEARNING
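
GraphSAGE produces such embeddings inductively: instead of training one embedding per node, each layer aggregates feature information from a node's neighborhood, so unseen nodes can be embedded at test time. A rough sketch of one layer with the mean aggregator, assuming dense numpy features; the names `sage_mean_layer`, `W_self`, and `W_neigh` are illustrative, not the GraphSAGE API:

```python
import numpy as np

def sage_mean_layer(h, adj, W_self, W_neigh):
    # One GraphSAGE-style layer with a mean aggregator (illustrative sketch).
    # h: (n, d) node features; adj: dict node id -> list of neighbor ids.
    out = []
    for v in range(h.shape[0]):
        neigh = adj.get(v, [])
        # Average the neighbors' features (zero vector if the node is isolated).
        agg = h[neigh].mean(axis=0) if neigh else np.zeros(h.shape[1])
        # Combine self and neighborhood information, then apply ReLU.
        z = np.maximum(0.0, h[v] @ W_self + agg @ W_neigh)
        norm = np.linalg.norm(z)
        out.append(z / norm if norm > 0 else z)  # L2-normalize, as in the paper
    return np.stack(out)

rng = np.random.default_rng(1)
h = rng.standard_normal((4, 3))
adj = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}
W_self = rng.standard_normal((3, 3))
W_neigh = rng.standard_normal((3, 3))
z = sage_mean_layer(h, adj, W_self, W_neigh)
```

Stacking K such layers lets each node's embedding depend on its K-hop neighborhood; in practice GraphSAGE also samples a fixed number of neighbors per hop to bound the cost on large graphs.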

Learning Disentangled Representations with Semi-Supervised Deep Generative Models

NeurIPS 2017 probtorch/probtorch

We propose to learn such representations using model architectures that generalise from standard VAEs, employing a general graphical model structure in the encoder and decoder.

Dynamic Routing Between Capsules

NeurIPS 2017 timomernick/pytorch-capsule

We use the length of the activity vector to represent the probability that the entity exists and its orientation to represent the instantiation parameters.

IMAGE CLASSIFICATION
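
The vector length can act as a probability because of the paper's "squashing" non-linearity, which shrinks short vectors toward zero and pushes long vectors toward unit length while preserving orientation. A minimal numpy sketch:

```python
import numpy as np

def squash(s, axis=-1, eps=1e-9):
    # Squashing non-linearity from the paper:
    #   v = (||s||^2 / (1 + ||s||^2)) * (s / ||s||)
    # Short vectors map near 0; long vectors approach unit length.
    sq_norm = np.sum(s * s, axis=axis, keepdims=True)
    scale = sq_norm / (1.0 + sq_norm) / np.sqrt(sq_norm + eps)
    return scale * s

long_vec = np.array([10.0, 0.0, 0.0])
short_vec = np.array([1e-3, 0.0, 0.0])
v_long = squash(long_vec)    # length close to 1
v_short = squash(short_vec)  # length close to 0
```

During dynamic routing, this squashed length is what lower-level capsules use to decide how strongly to route their output to each higher-level capsule.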

Prototypical Networks for Few-shot Learning

NeurIPS 2017 orobix/Prototypical-Networks-for-Few-shot-Learning-PyTorch

We propose prototypical networks for the problem of few-shot classification, where a classifier must generalize to new classes not seen in the training set, given only a small number of examples of each new class.

FEW-SHOT IMAGE CLASSIFICATION ONE-SHOT LEARNING ZERO-SHOT LEARNING
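
Concretely, a prototypical network embeds the few labeled support examples, averages each class's embeddings into a prototype, and classifies a query by its nearest prototype under squared Euclidean distance. A minimal numpy sketch operating on already-embedded points (the helper names are illustrative):

```python
import numpy as np

def prototypes(support, labels):
    # Each class prototype is the mean embedding of its support examples.
    classes = np.unique(labels)
    protos = np.stack([support[labels == c].mean(axis=0) for c in classes])
    return classes, protos

def classify(queries, protos, classes):
    # Assign each query to the class of the nearest prototype
    # (squared Euclidean distance, as in the paper).
    d = ((queries[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return classes[d.argmin(axis=1)]

support = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
labels = np.array([0, 0, 1, 1])
classes, protos = prototypes(support, labels)
pred = classify(np.array([[0.05, 0.1], [4.9, 5.2]]), protos, classes)
```

Because new classes only require computing a new mean, the same embedding network handles classes never seen during training, which is what makes the method suit few-shot and zero-shot settings.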