Search Results for author: Kilian Q. Weinberger

Found 69 papers, 38 papers with code

Online Adaptation to Label Distribution Shift

no code implementations NeurIPS 2021 Ruihan Wu, Chuan Guo, Yi Su, Kilian Q. Weinberger

Machine learning models often encounter distribution shifts when deployed in the real world.

Towards Deeper Deep Reinforcement Learning with Spectral Normalization

no code implementations NeurIPS 2021 Johan Bjorck, Carla P. Gomes, Kilian Q. Weinberger

In this paper we investigate how RL agents are affected by exchanging the small MLPs with larger modern networks with skip connections and normalization, focusing specifically on actor-critic algorithms.

Low-Precision Reinforcement Learning: Running Soft Actor-Critic in Half Precision

no code implementations 26 Feb 2021 Johan Bjorck, Xiangyu Chen, Christopher De Sa, Carla P. Gomes, Kilian Q. Weinberger

Low-precision training has become a popular approach to reduce compute requirements, memory footprint, and energy consumption in supervised learning.

Continuous Control

Making Paper Reviewing Robust to Bid Manipulation Attacks

1 code implementation 9 Feb 2021 Ruihan Wu, Chuan Guo, Felix Wu, Rahul Kidambi, Laurens van der Maaten, Kilian Q. Weinberger

We develop a novel approach for paper bidding and assignment that is much more robust against such attacks.

Correlator Convolutional Neural Networks: An Interpretable Architecture for Image-like Quantum Matter Data

1 code implementation 6 Nov 2020 Cole Miles, Annabelle Bohrdt, Ruihan Wu, Christie Chiu, Muqing Xu, Geoffrey Ji, Markus Greiner, Kilian Q. Weinberger, Eugene Demler, Eun-Ah Kim

Machine learning models are a powerful theoretical tool for analyzing data from quantum simulators, in which results of experiments are sets of snapshots of many-body states.

Deep Co-Training with Task Decomposition for Semi-Supervised Domain Adaptation

1 code implementation ICCV 2021 Luyu Yang, Yan Wang, Mingfei Gao, Abhinav Shrivastava, Kilian Q. Weinberger, Wei-Lun Chao, Ser-Nam Lim

To integrate the strengths of the two classifiers, we apply the well-established co-training framework, in which the two classifiers exchange their high confident predictions to iteratively "teach each other" so that both classifiers can excel in the target domain.

Unsupervised Domain Adaptation
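
The co-training exchange described in the entry above can be sketched as follows; the classifiers, their confidence scores, and the 0.9 threshold are hypothetical placeholders, not the paper's actual models:

```python
def co_training_round(clf_a, clf_b, unlabeled, threshold=0.9):
    """Co-training sketch: each classifier labels the unlabeled points it is
    most confident about and hands them to the other as extra training data."""
    new_for_a, new_for_b, rest = [], [], []
    for x in unlabeled:
        label_a, conf_a = clf_a(x)
        label_b, conf_b = clf_b(x)
        if conf_a >= threshold:
            new_for_b.append((x, label_a))   # A "teaches" B
        elif conf_b >= threshold:
            new_for_a.append((x, label_b))   # B "teaches" A
        else:
            rest.append(x)                   # neither is confident yet
    return new_for_a, new_for_b, rest

# Hypothetical classifiers returning (label, confidence).
clf_a = lambda x: (int(x > 0), 0.95 if abs(x) > 1 else 0.6)
clf_b = lambda x: (int(x > 0), 0.95 if abs(x) > 2 else 0.6)
a_extra, b_extra, remaining = co_training_round(clf_a, clf_b, [0.5, 1.5, 3.0])
```

In practice the exchanged pseudo-labeled points are added to the other classifier's training set and both models are retrained, iterating until the unlabeled pool is exhausted.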

Revisiting Few-sample BERT Fine-tuning

1 code implementation ICLR 2021 Tianyi Zhang, Felix Wu, Arzoo Katiyar, Kilian Q. Weinberger, Yoav Artzi

We empirically test the impact of these factors, and identify alternative practices that resolve the commonly observed instability of the process.

Train in Germany, Test in The USA: Making 3D Object Detectors Generalize

1 code implementation CVPR 2020 Yan Wang, Xiangyu Chen, Yurong You, Li Erran Li, Bharath Hariharan, Mark Campbell, Kilian Q. Weinberger, Wei-Lun Chao

In the domain of autonomous driving, deep learning has substantially improved the 3D object detection accuracy for LiDAR and stereo camera data alike.

3D Object Detection Autonomous Driving

On Feature Normalization and Data Augmentation

1 code implementation CVPR 2021 Boyi Li, Felix Wu, Ser-Nam Lim, Serge Belongie, Kilian Q. Weinberger

The moments (a.k.a. mean and standard deviation) of latent features are often removed as noise when training image recognition models, to increase stability and reduce training time.

Data Augmentation Domain Generalization +2

On Hiding Neural Networks Inside Neural Networks

no code implementations 24 Feb 2020 Chuan Guo, Ruihan Wu, Kilian Q. Weinberger

Modern neural networks often contain significantly more parameters than the size of their training data.

Identifying Mislabeled Data using the Area Under the Margin Ranking

2 code implementations NeurIPS 2020 Geoff Pleiss, Tianyi Zhang, Ethan R. Elenberg, Kilian Q. Weinberger

Not all data in a typical training set help with generalization; some samples can be overly ambiguous or outright mislabeled.
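
The paper's Area Under the Margin (AUM) statistic averages, over training epochs, the margin between a sample's assigned-label logit and its largest other logit; samples with low or negative AUM are flagged as likely mislabeled. A minimal sketch, with invented per-epoch logits:

```python
def area_under_margin(logit_history, label):
    """AUM sketch: average over epochs of (assigned-label logit minus the
    largest other logit). Persistently negative values suggest a bad label."""
    margins = []
    for logits in logit_history:
        assigned = logits[label]
        largest_other = max(z for i, z in enumerate(logits) if i != label)
        margins.append(assigned - largest_other)
    return sum(margins) / len(margins)

# Hypothetical logits for one 2-class sample, recorded at three epochs,
# whose assigned label (0) the network increasingly disagrees with.
history = [[0.2, 1.0], [0.1, 1.5], [0.0, 2.0]]
aum = area_under_margin(history, label=0)  # negative => suspicious label
```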

Convolutional Networks with Dense Connectivity

no code implementations 8 Jan 2020 Gao Huang, Zhuang Liu, Geoff Pleiss, Laurens van der Maaten, Kilian Q. Weinberger

Recent work has shown that convolutional networks can be substantially deeper, more accurate, and efficient to train if they contain shorter connections between layers close to the input and those close to the output.

Object Recognition

TrojanNet: Exposing the Danger of Trojan Horse Attack on Neural Networks

no code implementations ICLR 2020 Chuan Guo, Ruihan Wu, Kilian Q. Weinberger

The complexity of large-scale neural networks can lead to poor understanding of their internal details.

LDLS: 3-D Object Segmentation Through Label Diffusion From 2-D Images

1 code implementation 30 Oct 2019 Brian H. Wang, Wei-Lun Chao, Yan Wang, Bharath Hariharan, Kilian Q. Weinberger, Mark Campbell

We obtain 2-D segmentation predictions by applying Mask-RCNN to the RGB image, and then link this image to a 3-D lidar point cloud by building a graph of connections among 3-D points and 2-D pixels.

Point Cloud Segmentation Semantic Segmentation

A New Defense Against Adversarial Images: Turning a Weakness into a Strength

1 code implementation NeurIPS 2019 Tao Yu, Shengyuan Hu, Chuan Guo, Wei-Lun Chao, Kilian Q. Weinberger

Natural images are virtually surrounded by low-density misclassified regions that can be efficiently discovered by gradient-guided search, enabling the generation of adversarial images.

Adversarial Defense

Integrated Triaging for Fast Reading Comprehension

no code implementations 28 Sep 2019 Felix Wu, Boyi Li, Lequn Wang, Ni Lao, John Blitzer, Kilian Q. Weinberger

This paper introduces Integrated Triaging, a framework that prunes almost all context in early layers of a network, leaving the remaining (deep) layers to scan only a tiny fraction of the full corpus.

Machine Reading Comprehension

Neural Network Out-of-Distribution Detection for Regression Tasks

no code implementations 25 Sep 2019 Geoff Pleiss, Amauri Souza, Joseph Kim, Boyi Li, Kilian Q. Weinberger

Neural network out-of-distribution (OOD) detection aims to identify when a model is unable to generalize to new inputs, either due to covariate shift or anomalous data.

Out-of-Distribution Detection

Detecting Noisy Training Data with Loss Curves

no code implementations 25 Sep 2019 Geoff Pleiss, Tianyi Zhang, Ethan R. Elenberg, Kilian Q. Weinberger

This paper introduces a new method to discover mislabeled training samples and to mitigate their impact on the training process of deep networks.

Positional Normalization

2 code implementations NeurIPS 2019 Boyi Li, Felix Wu, Kilian Q. Weinberger, Serge Belongie

A popular method to reduce the training time of deep neural networks is to normalize activations at each layer.
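
Positional Normalization (PONO) normalizes across the channel dimension independently at each spatial position, rather than across the batch. A minimal sketch on a flattened feature map (the toy 2-position, 2-channel input is invented):

```python
import math

def pono(feature_map, eps=1e-5):
    """PONO sketch: at each spatial position, standardize the vector of
    channel activations to zero mean and unit variance."""
    normed = []
    for channels in feature_map:  # one list of channel values per position
        mu = sum(channels) / len(channels)
        sigma = math.sqrt(sum((c - mu) ** 2 for c in channels) / len(channels) + eps)
        normed.append([(c - mu) / sigma for c in channels])
    return normed

fmap = [[1.0, 3.0], [2.0, 6.0]]  # 2 positions x 2 channels
out = pono(fmap)
```

The extracted moments (mu, sigma) are not merely discarded: the paper re-injects them later in the network as structural information.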

Simple Black-box Adversarial Attacks

3 code implementations ICLR 2019 Chuan Guo, Jacob R. Gardner, Yurong You, Andrew Gordon Wilson, Kilian Q. Weinberger

We propose an intriguingly simple method for the construction of adversarial images in the black-box setting.
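
The method (SimBA) repeatedly picks a search direction, tries adding and subtracting a small step, and keeps whichever change lowers the model's confidence in the true class. A rough sketch, with a hypothetical sigmoid "model"; the real algorithm samples directions without replacement from an orthonormal basis (pixel or DCT), whereas this sketch samples coordinates with replacement:

```python
import math
import random

def simba(x, true_class_prob, eps=0.2, steps=100, seed=0):
    """SimBA-style sketch: perturb one randomly chosen coordinate at a time
    by +/-eps, keeping the change whenever the true-class probability drops."""
    rng = random.Random(seed)
    x = list(x)
    for _ in range(steps):
        i = rng.randrange(len(x))
        base = true_class_prob(x)
        for sign in (eps, -eps):
            x[i] += sign
            if true_class_prob(x) < base:
                break        # keep this perturbation
            x[i] -= sign     # revert and try the other direction
    return x

# Hypothetical "model": true-class probability is a sigmoid of the input sum.
prob = lambda v: 1.0 / (1.0 + math.exp(-sum(v)))
x0 = [0.5, 0.5, 0.5]
adv = simba(x0, prob)
```

Because each step needs only model output probabilities, the attack is fully black-box.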

FastFusionNet: New State-of-the-Art for DAWNBench SQuAD

2 code implementations 28 Feb 2019 Felix Wu, Boyi Li, Lequn Wang, Ni Lao, John Blitzer, Kilian Q. Weinberger

In this technical report, we introduce FastFusionNet, an efficient variant of FusionNet [12].

Reading Comprehension

Simplifying Graph Convolutional Networks

3 code implementations 19 Feb 2019 Felix Wu, Tianyi Zhang, Amauri Holanda de Souza Jr., Christopher Fifty, Tao Yu, Kilian Q. Weinberger

Graph Convolutional Networks (GCNs) and their variants have experienced significant attention and have become the de facto methods for learning graph representations.

Ranked #2 on Text Classification on 20NEWS (using extra training data)

Graph Regression Image Classification +5
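
The resulting SGC model drops the nonlinearities between GCN layers, collapsing K rounds of propagation into a single precomputed product S^K X followed by a linear classifier. A toy sketch; the 3-node graph and row-stochastic S below are invented for illustration (the paper uses a symmetrically normalized adjacency with self-loops):

```python
def matmul(A, B):
    """Plain dense matrix product for small lists-of-lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def sgc_features(S, X, k=2):
    """SGC sketch: apply the propagation matrix S to the features k times
    (equivalent to S^k X); only a linear classifier is trained on the result."""
    for _ in range(k):
        X = matmul(S, X)
    return X

# Toy propagation matrix and one-dimensional node features.
S = [[0.5, 0.5, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 0.5, 0.5]]
X = [[1.0], [0.0], [0.0]]
smoothed = sgc_features(S, X, k=2)
```

Since S^K X does not depend on any learned parameters, it is computed once up front, which is where the method's large speedups come from.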

Gradient Regularized Budgeted Boosting

no code implementations 13 Jan 2019 Zhixiang Eddie Xu, Matt J. Kusner, Kilian Q. Weinberger, Alice X. Zheng

As machine learning transitions increasingly towards real-world applications, controlling the test-time cost of algorithms becomes more and more crucial.

Gradient Boosted Feature Selection

no code implementations 13 Jan 2019 Zhixiang Eddie Xu, Gao Huang, Kilian Q. Weinberger, Alice X. Zheng

A feature selection algorithm should ideally satisfy four conditions: reliably extract relevant features; be able to identify non-linear feature interactions; scale linearly with the number of features and dimensions; allow the incorporation of known sparsity structure.

Anytime Stereo Image Depth Estimation on Mobile Devices

1 code implementation 26 Oct 2018 Yan Wang, Zihang Lai, Gao Huang, Brian H. Wang, Laurens van der Maaten, Mark Campbell, Kilian Q. Weinberger

Many applications of stereo depth estimation in robotics require the generation of accurate disparity maps in real time under significant computational constraints.

Stereo Depth Estimation

Deep Person Re-identification for Probabilistic Data Association in Multiple Pedestrian Tracking

no code implementations 19 Oct 2018 Brian H. Wang, Yan Wang, Kilian Q. Weinberger, Mark Campbell

We present a data association method for vision-based multiple pedestrian tracking, using deep convolutional features to distinguish between different people based on their appearances.

Person Re-Identification Translation

GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration

2 code implementations NeurIPS 2018 Jacob R. Gardner, Geoff Pleiss, David Bindel, Kilian Q. Weinberger, Andrew Gordon Wilson

Despite advances in scalable models, the inference tools used for Gaussian processes (GPs) have yet to fully capitalize on developments in computing hardware.

Gaussian Processes

Low Frequency Adversarial Perturbation

1 code implementation 24 Sep 2018 Chuan Guo, Jared S. Frank, Kilian Q. Weinberger

In this paper we propose to restrict the search for adversarial images to a low frequency domain.

Denoising Speech Recognition

Understanding Batch Normalization

no code implementations NeurIPS 2018 Johan Bjorck, Carla Gomes, Bart Selman, Kilian Q. Weinberger

Batch normalization (BN) is a technique to normalize activations in intermediate layers of deep neural networks.
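
The BN transform standardizes each feature across the mini-batch and then applies a learned scale and shift. A minimal sketch for a single feature (gamma and beta are the learned parameters; the batch values are illustrative):

```python
import math

def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    """BN sketch: standardize one feature across the batch dimension,
    then apply the learned affine transform gamma * x_hat + beta."""
    mean = sum(batch) / len(batch)
    var = sum((x - mean) ** 2 for x in batch) / len(batch)
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in batch]

activations = [2.0, 4.0, 6.0, 8.0]
normed = batch_norm(activations)
```

At test time the batch statistics are replaced by running averages accumulated during training.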

Resource Aware Person Re-identification across Multiple Resolutions

1 code implementation CVPR 2018 Yan Wang, Lequn Wang, Yurong You, Xu Zou, Vincent Chen, Serena Li, Gao Huang, Bharath Hariharan, Kilian Q. Weinberger

Not all people are equally easy to identify: color statistics might be enough for some cases while others might require careful reasoning about high- and low-level details.

Person Re-Identification

Constant-Time Predictive Distributions for Gaussian Processes

1 code implementation ICML 2018 Geoff Pleiss, Jacob R. Gardner, Kilian Q. Weinberger, Andrew Gordon Wilson

One of the most compelling features of Gaussian process (GP) regression is its ability to provide well-calibrated posterior distributions.

Gaussian Processes

Product Kernel Interpolation for Scalable Gaussian Processes

1 code implementation 24 Feb 2018 Jacob R. Gardner, Geoff Pleiss, Ruihan Wu, Kilian Q. Weinberger, Andrew Gordon Wilson

Recent work shows that inference for Gaussian processes can be performed efficiently using iterative methods that rely only on matrix-vector multiplications (MVMs).

Gaussian Processes
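
A sketch of the kind of MVM-only solver these methods build on: conjugate gradients solves K x = b while touching K only through matrix-vector products (the actual papers use more elaborate batched and preconditioned variants; the 2x2 "kernel matrix" below is a toy stand-in):

```python
def conjugate_gradients(matvec, b, iters=50, tol=1e-10):
    """CG sketch: solve K x = b for symmetric positive-definite K using only
    matrix-vector products with K, the access pattern MVM-based GP inference
    relies on."""
    x = [0.0] * len(b)
    r = list(b)          # residual b - K x (x starts at zero)
    p = list(r)          # search direction
    rs = sum(v * v for v in r)
    for _ in range(iters):
        Kp = matvec(p)
        alpha = rs / sum(pi * kpi for pi, kpi in zip(p, Kp))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * kpi for ri, kpi in zip(r, Kp)]
        rs_new = sum(v * v for v in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Tiny SPD "kernel matrix", accessed only through its matvec.
K = [[4.0, 1.0], [1.0, 3.0]]
matvec = lambda v: [sum(k * vi for k, vi in zip(row, v)) for row in K]
x = conjugate_gradients(matvec, [1.0, 2.0])
```

Because only matvecs are needed, structured kernels (e.g., products of interpolated kernels) can be applied without ever materializing the full matrix.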

On Fairness and Calibration

1 code implementation NeurIPS 2017 Geoff Pleiss, Manish Raghavan, Felix Wu, Jon Kleinberg, Kilian Q. Weinberger

The machine learning community has become increasingly concerned with the potential for bias and discrimination in predictive models.

Fairness General Classification

Memory-Efficient Implementation of DenseNets

5 code implementations 21 Jul 2017 Geoff Pleiss, Danlu Chen, Gao Huang, Tongcheng Li, Laurens van der Maaten, Kilian Q. Weinberger

A 264-layer DenseNet (73M parameters), which previously would have been infeasible to train, can now be trained on a single workstation with 8 NVIDIA Tesla M40 GPUs.

On Calibration of Modern Neural Networks

16 code implementations ICML 2017 Chuan Guo, Geoff Pleiss, Yu Sun, Kilian Q. Weinberger

Confidence calibration -- the problem of predicting probability estimates representative of the true correctness likelihood -- is important for classification models in many applications.

Document Classification General Classification
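
The paper's simplest remedy, temperature scaling, divides the logits by a single scalar T > 1 fitted on a validation set, softening overconfident probabilities without changing the predicted class. A sketch with invented logits:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: T > 1 softens (de-confidences) the
    distribution; T = 1 recovers the ordinary softmax."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                       # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [4.0, 1.0, 0.5]
p_raw = softmax(logits)                   # typically overconfident
p_cal = softmax(logits, temperature=2.0)  # calibrated (T fit on held-out data)
```

Since dividing by a positive constant preserves the argmax, accuracy is untouched; only the confidence estimates change.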

Snapshot Ensembles: Train 1, get M for free

8 code implementations 1 Apr 2017 Gao Huang, Yixuan Li, Geoff Pleiss, Zhuang Liu, John E. Hopcroft, Kilian Q. Weinberger

In this paper, we propose a method to obtain the seemingly contradictory goal of ensembling multiple neural networks at no additional training cost.
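
The method cosine-anneals the learning rate to near zero within each of M training cycles, saves a model snapshot at each minimum, and then restarts at the maximum rate; the M snapshots are ensembled at test time. A sketch of the cyclic schedule (lr_max and the cycle length are illustrative values):

```python
import math

def cyclic_cosine_lr(step, steps_per_cycle, lr_max=0.1):
    """Snapshot Ensembles sketch: within each cycle, anneal the learning rate
    from lr_max down to ~0 along a cosine curve, then restart at lr_max."""
    t = step % steps_per_cycle
    return (lr_max / 2.0) * (math.cos(math.pi * t / steps_per_cycle) + 1.0)

schedule = [cyclic_cosine_lr(s, steps_per_cycle=10) for s in range(20)]
# Snapshots would be saved at steps 9, 19, ... just before each warm restart.
```

The restarts kick the optimizer out of one local minimum and into another, so the snapshots are diverse enough to ensemble, yet total training cost equals that of a single model.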

Supervised Word Mover's Distance

1 code implementation NeurIPS 2016 Gao Huang, Chuan Guo, Matt J. Kusner, Yu Sun, Fei Sha, Kilian Q. Weinberger

Accurately measuring the similarity between text documents lies at the core of many real world applications of machine learning.

Document Classification General Classification +1

Densely Connected Convolutional Networks

127 code implementations CVPR 2017 Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger

Recent work has shown that convolutional networks can be substantially deeper, more accurate, and efficient to train if they contain shorter connections between layers close to the input and those close to the output.

Breast Tumour Classification Crowd Counting +5
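
In a DenseNet dense block, each layer receives the channel-wise concatenation of all preceding feature maps and appends its own output to the running collection. A shape-level sketch with a toy "layer" that emits a single feature (its input mean); real layers are BN-ReLU-Conv stacks:

```python
def dense_block(x, num_layers, layer_fn):
    """DenseNet sketch: layer l consumes the concatenation of the input and
    the outputs of layers 1..l-1, then contributes its own feature map."""
    features = [x]
    for _ in range(num_layers):
        concatenated = [v for f in features for v in f]  # channel concat
        features.append(layer_fn(concatenated))
    return [v for f in features for v in f]  # block output: everything so far

# Hypothetical "layer": reduce its input to one feature (its mean).
layer = lambda feats: [sum(feats) / len(feats)]
out = dense_block([1.0, 3.0], num_layers=3, layer_fn=layer)
```

The output width grows by one per layer here, mirroring how a real dense block grows by its growth rate k feature maps per layer.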

Private Causal Inference

no code implementations 17 Dec 2015 Matt J. Kusner, Yu Sun, Karthik Sridharan, Kilian Q. Weinberger

Causal inference has the potential to have significant impact on medical research, prevention and control of diseases, and identifying factors that impact economic changes to name just a few.

Causal Inference

Bayesian Active Model Selection with an Application to Automated Audiometry

no code implementations NeurIPS 2015 Jacob Gardner, Gustavo Malkomes, Roman Garnett, Kilian Q. Weinberger, Dennis Barbour, John P. Cunningham

Using this and a previously published model for healthy responses, the proposed method is shown to be capable of diagnosing the presence or absence of NIHL with drastically fewer samples than existing approaches.

Model Selection

Deep Manifold Traversal: Changing Labels with Convolutional Features

no code implementations 19 Nov 2015 Jacob R. Gardner, Paul Upchurch, Matt J. Kusner, Yixuan Li, Kilian Q. Weinberger, Kavita Bala, John E. Hopcroft

Many tasks in computer vision can be cast as a "label changing" problem, where the goal is to make a semantic change to the appearance of an image or some subject in an image in order to alter the class membership.

Compressing Convolutional Neural Networks

no code implementations 14 Jun 2015 Wenlin Chen, James T. Wilson, Stephen Tyree, Kilian Q. Weinberger, Yixin Chen

Convolutional neural networks (CNN) are increasingly used in many areas of computer vision.

Compressing Neural Networks with the Hashing Trick

2 code implementations 19 Apr 2015 Wenlin Chen, James T. Wilson, Stephen Tyree, Kilian Q. Weinberger, Yixin Chen

As deep nets are increasingly used in applications suited for mobile devices, a fundamental dilemma becomes apparent: the trend in deep learning is to grow models to absorb ever-increasing data set sizes; however, mobile devices are designed with very little memory and cannot store such large models.
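
The hashing trick resolves this dilemma by randomly hashing each "virtual" weight of a large matrix into a small vector of shared parameters, so memory scales with the number of buckets rather than the number of weights. A sketch; the bucket values and the CRC32 hash function are illustrative choices:

```python
import zlib

def hashed_weight(layer_id, i, j, num_buckets):
    """HashedNets sketch: deterministically hash virtual weight (i, j) of a
    given layer into one of num_buckets shared parameter slots."""
    key = f"{layer_id}:{i}:{j}".encode()
    return zlib.crc32(key) % num_buckets

buckets = [0.1, -0.2, 0.3, 0.05]  # the only parameters actually stored
# A "virtual" 3x3 weight matrix materialized from just 4 shared parameters:
W = [[buckets[hashed_weight(0, i, j, len(buckets))] for j in range(3)]
     for i in range(3)]
```

Gradients are accumulated per bucket during training, so weights sharing a slot are updated jointly; the hash itself costs no storage because it is recomputed on the fly.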

Compressed Support Vector Machines

no code implementations 26 Jan 2015 Zhixiang Xu, Jacob R. Gardner, Stephen Tyree, Kilian Q. Weinberger

For most of the time during which we conducted this research, we were unaware of this prior work.

Differentially Private Bayesian Optimization

no code implementations 16 Jan 2015 Matt J. Kusner, Jacob R. Gardner, Roman Garnett, Kilian Q. Weinberger

The success of machine learning has led practitioners in diverse real-world settings to learn classifiers for practical problems.

Image Data Compression for Covariance and Histogram Descriptors

no code implementations 4 Dec 2014 Matt J. Kusner, Nicholas I. Kolkin, Stephen Tyree, Kilian Q. Weinberger

Specifically, we show that we can reduce data sets to 16% and in some cases as little as 2% of their original size, while approximately matching the test error of kNN classification on the full training set.

Data Compression General Classification

Parallel Support Vector Machines in Practice

no code implementations 3 Apr 2014 Stephen Tyree, Jacob R. Gardner, Kilian Q. Weinberger, Kunal Agrawal, John Tran

In particular, we provide the first comparison of algorithms with explicit and implicit parallelization.

Non-linear Metric Learning

no code implementations NeurIPS 2012 Dor Kedem, Stephen Tyree, Fei Sha, Gert R. Lanckriet, Kilian Q. Weinberger

On various benchmark data sets, we demonstrate these methods not only match the current state-of-the-art in terms of kNN classification error, but in the case of χ2-LMNN, obtain best results in 19 out of 20 learning settings.

Metric Learning

Cost-Sensitive Tree of Classifiers

no code implementations 9 Oct 2012 Zhixiang Xu, Matt J. Kusner, Kilian Q. Weinberger, Minmin Chen

Recently, machine learning algorithms have successfully entered large-scale real-world industrial applications (e.g., search engines and email spam filters).

Co-Training for Domain Adaptation

no code implementations NeurIPS 2011 Minmin Chen, Kilian Q. Weinberger, John Blitzer

Our algorithm is a variant of co-training, and we name it CODA (Co-training for domain adaptation).

Domain Adaptation

Large Margin Multi-Task Metric Learning

no code implementations NeurIPS 2010 Shibin Parameswaran, Kilian Q. Weinberger

Multi-task learning (MTL) improves the prediction performance on multiple, different but related, learning problems through shared parameters or representations.

Metric Learning Multi-Task Learning

Large Margin Taxonomy Embedding for Document Categorization

no code implementations NeurIPS 2008 Kilian Q. Weinberger, Olivier Chapelle

The optimization of the semantic space incorporates large margin constraints that ensure that for each instance the correct class prototype is closer than any other.

General Classification Multi-class Classification
