no code implementations • ICML 2020 • Corinna Cortes, Giulia Desalvo, Claudio Gentile, Mehryar Mohri, Ningshan Zhang
Feedback graphs provide a general framework for online learning with partial information: the graph specifies which losses the learner gets to observe after each action.
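As a rough illustration of the feedback-graph setting, the sketch below runs an exponential-weights learner where playing an arm reveals the losses of all its out-neighbors in the graph, with importance-weighted loss estimates. This is a generic sketch of the setting, not the algorithm from the paper; all names and the toy data are illustrative.

```python
import numpy as np

def exp_weights_feedback_graph(losses, graph, eta=0.1, rng=None):
    """Exponential-weights sketch for online learning with a feedback graph.

    losses: (T, K) array of per-round arm losses in [0, 1] (toy data).
    graph:  (K, K) boolean adjacency; graph[i, j] == True means that playing
            arm i reveals the loss of arm j (self-loops assumed present).
    """
    rng = rng or np.random.default_rng(0)
    T, K = losses.shape
    adj = graph.astype(float)
    weights = np.ones(K)
    total_loss = 0.0
    for t in range(T):
        probs = weights / weights.sum()
        arm = rng.choice(K, p=probs)
        total_loss += losses[t, arm]
        # P[observe arm j] = sum over in-neighbors i of P[play i].
        observe_prob = adj.T @ probs
        for j in range(K):
            if graph[arm, j]:  # loss of arm j is revealed this round
                est = losses[t, j] / observe_prob[j]  # importance weighting
                weights[j] *= np.exp(-eta * est)
    return total_loss / T
```

With a complete graph this reduces to full-information Hedge; with only self-loops it recovers the bandit feedback model.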
no code implementations • 25 Aug 2020 • Corinna Cortes, Mehryar Mohri, Ananda Theertha Suresh, Ningshan Zhang
We present a new discriminative technique for the multiple-source adaptation (MSA) problem.
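A standard building block in the MSA literature is a distribution-weighted combination of the source predictors, where each source's prediction at a point is weighted by how likely that point is under the source's distribution. The sketch below is an illustrative version of that idea, not the discriminative technique of this paper; the density estimators, predictors, and weights are all hypothetical inputs.

```python
import numpy as np

def distribution_weighted_combination(x, densities, predictors, lam):
    """Distribution-weighted combination of source predictors (MSA sketch).

    densities:  list of callables, densities[k](x) -> estimated density of
                source k at x (hypothetical density estimators).
    predictors: list of callables, predictors[k](x) -> prediction of source k.
    lam:        nonnegative mixture weights over the sources.
    """
    scores = np.array([lam[k] * densities[k](x) for k in range(len(predictors))])
    w = scores / scores.sum()  # relative weight of each source at x
    preds = np.array([predictors[k](x) for k in range(len(predictors))])
    return float(w @ preds)
```

At a point where one source's density dominates, the combined prediction follows that source's predictor, which is the intuition behind the robustness guarantees in this line of work.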
no code implementations • ICML 2020 • Corinna Cortes, Giulia Desalvo, Claudio Gentile, Mehryar Mohri, Ningshan Zhang
We present a new active learning algorithm that adaptively partitions the input space into a finite number of regions and then seeks a distinct predictor for each region, actively requesting labels in both phases.
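The two-phase structure described above can be sketched on 1-D data: partition the inputs into regions, then spend a small label budget inside each region to fit a per-region predictor. This is a simplified illustration of region-based active learning, not the paper's adaptive algorithm; the partition rule, budget, and oracle interface are all assumptions.

```python
import numpy as np

def region_based_active_learner(X, query_label, n_regions=4,
                                budget_per_region=10, rng=None):
    """Toy region-based active learner on 1-D inputs.

    Phase 1: partition the input space into equal-mass regions via quantiles.
    Phase 2: in each region, actively request a few labels from the oracle
             and fit a distinct (constant) predictor for that region.
    query_label: oracle callable, query_label(i) -> label of X[i].
    """
    rng = rng or np.random.default_rng(0)
    edges = np.quantile(X, np.linspace(0, 1, n_regions + 1))
    edges[-1] += 1e-9  # make the last region half-open on the right
    region = np.searchsorted(edges, X, side="right") - 1
    predictors = np.zeros(n_regions)
    for r in range(n_regions):
        idx = np.flatnonzero(region == r)
        chosen = rng.choice(idx, size=min(budget_per_region, idx.size),
                            replace=False)
        labels = np.array([query_label(i) for i in chosen])  # label requests
        predictors[r] = labels.mean()  # distinct predictor per region
    return edges, predictors
```

On data whose label changes across the input space, the per-region predictors pick up the local label behavior while the total number of queried labels stays small.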
no code implementations • NeurIPS 2019 • Ben Adlam, Corinna Cortes, Mehryar Mohri, Ningshan Zhang
Generative adversarial networks (GANs) generate data based on minimizing a divergence between two distributions.
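For concreteness, a standard fact about the original GAN objective is that, at the discriminator's optimum, the generator's objective reduces to (up to constants) the Jensen-Shannon divergence between the data and model distributions. The sketch below computes that divergence for discrete distributions; it illustrates the divergence-minimization view, not this paper's contribution.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions.

    At the optimal discriminator, the original GAN value function equals
    2 * JSD(p_data, p_model) - log 4, so training the generator amounts to
    minimizing this divergence.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)  # mixture midpoint

    def kl(a, b):
        return float(np.sum(a * np.log(a / b)))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

JSD is symmetric, vanishes only when the two distributions coincide, and is bounded above by log 2, which is why it is a convenient divergence to reason about in the GAN setting.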
no code implementations • 1 Jun 2018 • Ningshan Zhang, Kyle Schmaus, Patrick O. Perry
The main challenge in deploying this model is computational: the data sizes are large, and fitting the model at scale using off-the-shelf maximum likelihood procedures is prohibitive.
no code implementations • NeurIPS 2018 • Judy Hoffman, Mehryar Mohri, Ningshan Zhang
This work includes a number of novel contributions for the multiple-source adaptation problem.
no code implementations • 14 Nov 2017 • Judy Hoffman, Mehryar Mohri, Ningshan Zhang
We present a detailed theoretical analysis of the problem of multiple-source adaptation in the general stochastic scenario, extending known results that assume a single target labeling function.