NeurIPS 2018 • nbansal90/Can-we-Gain-More-from-Orthogonality

This paper seeks to answer the question: as the (near-)orthogonality of weights is found to be a favorable property for training deep convolutional neural networks, how can we enforce it in more effective and easy-to-use ways?
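As a rough illustration of the idea (a minimal sketch, not the paper's specific regularizers), near-orthogonality of a weight matrix can be encouraged with a soft penalty on the deviation of its Gram matrix from the identity, ||WᵀW − I||²_F; the function name `soft_orthogonality_penalty` is our own:

```python
import numpy as np

def soft_orthogonality_penalty(W):
    """Frobenius-norm penalty ||W^T W - I||_F^2, which is zero exactly
    when the columns of W are orthonormal. Added to the training loss,
    it nudges weights toward (near-)orthogonality."""
    gram = W.T @ W
    identity = np.eye(W.shape[1])
    return float(np.sum((gram - identity) ** 2))
```

In practice a weighted version of such a term is added to the task loss and minimized jointly with it.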

NeurIPS 2018 • ginn24/Pelee-TensorRT

In this study, we propose an efficient architecture named PeleeNet, which is built with conventional convolution instead of the depthwise separable convolution that most lightweight architectures rely on.

NeurIPS 2018 • uds-lsv/evaluating-logit-pairing-methods

In this paper, we develop improved techniques for defending against adversarial examples at scale.

NeurIPS 2018 • treforevans/direct

We explore a new research direction in Bayesian variational inference with discrete latent variable priors where we exploit Kronecker matrix algebra for efficient and exact computations of the evidence lower bound (ELBO).

NeurIPS 2018 • alan-turing-institute/bocpdms

The resulting inference procedure is doubly robust for both the parameter and the changepoint (CP) posterior, with linear time and constant space complexity.

NeurIPS 2018 • serre-lab/hgru_share

As a prime example, convolutional neural networks, a type of feedforward neural network, are now approaching -- and sometimes even surpassing -- human accuracy on a variety of visual recognition tasks.

NeurIPS 2018 • eminorhan/simple-cache

We propose to extract this extra class-relevant information using a simple key-value cache memory to improve the classification performance of the model at test time.
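To illustrate the flavor of a key-value cache classifier (a hedged sketch under our own assumptions, not the paper's exact formulation, and `cache_predict` is a hypothetical name): cached key embeddings are compared against a query embedding, and the cached items' labels are combined with similarity-based weights:

```python
import numpy as np

def cache_predict(query, keys, values, theta=5.0):
    """Soft key-value cache lookup: weight each cached item's one-hot
    label (rows of `values`) by a softmax over the dot-product
    similarity between `query` and the item's key embedding."""
    sims = keys @ query                        # similarity to each cached key
    weights = np.exp(theta * (sims - sims.max()))
    weights /= weights.sum()                   # normalize to a distribution
    return weights @ values                    # blended class distribution
```

The cache output can then be combined with the base model's prediction at test time; `theta` controls how sharply the lookup concentrates on the nearest keys.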

NeurIPS 2018 • srmcc/deterministic-ridge-leverage-sampling

We also show that under the assumption of power-law decay of ridge leverage scores, this deterministic algorithm is provably as accurate as randomized algorithms.

NeurIPS 2018 • stegua/dpartion-nips2018

This paper presents a novel method to compute the exact Kantorovich-Wasserstein distance between a pair of $d$-dimensional histograms having $n$ bins each.
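For reference, in the one-dimensional special case the Kantorovich-Wasserstein distance between two histograms has a closed form as the integrated absolute difference of their CDFs (a hedged baseline sketch; the paper's contribution is the general $d$-dimensional exact computation, and `wasserstein_1d` is our own name):

```python
import numpy as np

def wasserstein_1d(p, q, bin_positions):
    """Exact 1-Wasserstein distance between two 1-D histograms p and q
    supported on the same sorted bin positions: the sum of |CDF_p - CDF_q|
    weighted by the gaps between consecutive bins."""
    p = np.asarray(p, float) / np.sum(p)
    q = np.asarray(q, float) / np.sum(q)
    cdf_diff = np.cumsum(p - q)              # pointwise CDF difference
    gaps = np.diff(bin_positions)            # widths between bin centers
    return float(np.sum(np.abs(cdf_diff[:-1]) * gaps))
```

Moving all mass by one unit of distance yields a distance of exactly one, matching the "work = mass x distance" interpretation of optimal transport.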

NeurIPS 2018 • lelouedec/3DNetworksPytorch

The proposed method is a generalization of typical CNNs to feature learning from point clouds; thus we call it PointCNN.