no code implementations • 8 Nov 2021 • Matthew Nokleby, Ahmad Beirami
For models that are (roughly) lower-Lipschitz in their parameters, we bound the rate-distortion function from below; for VC classes, the mutual information is bounded above by $d_\mathrm{vc}\log(n)$.
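In symbols, writing $W$ for the learned model and $Z^n$ for the $n$ training samples (notation assumed for illustration, not quoted from the paper), the VC-class statement reads:

```latex
% Mutual information between the learned model and the sample
% grows at most logarithmically in n for VC classes:
I(W; Z^n) \le d_{\mathrm{vc}} \log(n)
```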
no code implementations • ICLR 2019 • Nuwan Ferdinand, Haider Al-Lawati, Stark C. Draper, Matthew Nokleby
Anytime Minibatch prevents stragglers from holding up the system without wasting the work that stragglers can complete.
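The fixed-time epoch idea can be sketched as a single-machine simulation (the names `worker_speeds`, `t_fix`, and the quadratic test objective below are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def anytime_minibatch_step(w, sample_grad, worker_speeds, t_fix, lr=0.1, rng=None):
    """One sketch of an Anytime Minibatch epoch: every worker computes
    gradients for a fixed time budget t_fix, so fast workers contribute
    large minibatches while stragglers still contribute whatever they
    finished; no work is discarded and nobody waits on the slowest node."""
    rng = rng if rng is not None else np.random.default_rng(0)
    total_grad, total_count = 0.0, 0
    for speed in worker_speeds:
        n_done = max(1, int(speed * t_fix))   # gradients finished in the window
        for _ in range(n_done):
            total_grad = total_grad + sample_grad(w, rng)
        total_count += n_done
    return w - lr * total_grad / total_count  # sample-weighted average step
```

Fast workers dominate the average in proportion to the samples they processed, which is the sense in which straggler work is not wasted.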
no code implementations • 18 May 2020 • Matthew Nokleby, Haroon Raja, Waheed U. Bajwa
This paper reviews recently developed methods that focus on large-scale distributed stochastic optimization in the compute- and bandwidth-limited regime, with an emphasis on convergence analysis that explicitly accounts for the mismatch between computation, communication and streaming rates.
no code implementations • 15 Apr 2020 • Luisa F. Polania, Mauricio Flores, Yiran Li, Matthew Nokleby
We present two GNN models, both of which comprise a deep CNN that extracts a feature representation for each image, a gated recurrent unit (GRU) network that models interactions between the furniture items in a set, and an aggregation function that calculates the compatibility score.
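A toy version of that three-stage pipeline can be sketched in numpy; a random projection stands in for the deep CNN, and all shapes and weights here are illustrative assumptions rather than the paper's architecture:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell standing in for the paper's GRU network."""
    def __init__(self, d_in, d_h, rng):
        s = 0.1
        self.Wz, self.Uz = s * rng.standard_normal((d_h, d_in)), s * rng.standard_normal((d_h, d_h))
        self.Wr, self.Ur = s * rng.standard_normal((d_h, d_in)), s * rng.standard_normal((d_h, d_h))
        self.Wh, self.Uh = s * rng.standard_normal((d_h, d_in)), s * rng.standard_normal((d_h, d_h))

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)          # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)          # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))
        return (1 - z) * h + z * h_tilde

def compatibility_score(images, cnn_proj, gru, w_out):
    """Per-item features -> GRU over the item set -> mean-pool -> score."""
    feats = [cnn_proj @ img.ravel() for img in images]  # stand-in for a deep CNN
    h = np.zeros(gru.Wz.shape[0])
    states = []
    for f in feats:
        h = gru.step(f, h)                              # model item interactions
        states.append(h)
    pooled = np.mean(states, axis=0)                    # aggregation over the set
    return sigmoid(w_out @ pooled)                      # compatibility in (0, 1)
```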
no code implementations • NAACL 2019 • Ishan Jindal, Daniel Pressel, Brian Lester, Matthew Nokleby
In this paper, we propose an approach to training deep networks that is robust to label noise.
no code implementations • 11 Nov 2018 • Ishan Jindal, Zhiwei Qin, Xue-wen Chen, Matthew Nokleby, Jieping Ye
In this paper, we develop a reinforcement learning (RL) based system to learn an effective policy for carpooling that maximizes transportation efficiency so that fewer cars are required to fulfill the given amount of trip demand.
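At the core of such an RL dispatch system is an ordinary value update; a tabular sketch follows (the state/action encoding for carpooling, e.g. driver zone and time versus trip assignment, is an assumption for illustration, and the paper's actual system is more elaborate):

```python
def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.9):
    """Standard tabular Q-learning update. Q is a dict of dicts,
    Q[state][action] -> estimated long-run value; in a carpooling
    setting the reward r could be trip revenue net of detour cost."""
    best_next = max(Q[s_next].values()) if Q.get(s_next) else 0.0
    Q[s][a] += alpha * (r + gamma * best_next - Q[s][a])
    return Q[s][a]
```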
no code implementations • 26 Oct 2018 • Parinaz Farajiparvar, Ahmad Beirami, Matthew Nokleby
We consider this problem for unsupervised learning for batch and sequential data.
no code implementations • 25 Oct 2018 • Ishan Jindal, Matthew Nokleby
We consider the problem of detecting whether a tensor signal having many missing entries lies within a given low-dimensional Kronecker-Structured (KS) subspace.
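A minimal numpy sketch of such a detector is below; the test statistic used here, least-squares residual energy on the observed entries, is one natural choice and not necessarily the paper's:

```python
import numpy as np

def ks_residual(y_obs, omega, A, B):
    """Residual energy of a partially observed (vectorized) tensor
    against the Kronecker-structured subspace span(kron(A, B)).
    y_obs holds the observed entries; omega their indices."""
    U = np.kron(A, B)                                   # KS subspace basis
    U_obs = U[omega]                                    # rows at observed entries
    coef, *_ = np.linalg.lstsq(U_obs, y_obs, rcond=None)
    return float(np.linalg.norm(y_obs - U_obs @ coef) ** 2)
```

A small residual suggests the signal lies in the KS subspace; thresholding it gives a detector.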
no code implementations • 12 Oct 2017 • Ishan Jindal, Tony Qin, Xue-wen Chen, Matthew Nokleby, Jieping Ye
In building intelligent transportation systems such as taxi or rideshare services, accurate prediction of travel time and distance is crucial for customer experience and resource management.
no code implementations • 9 May 2017 • Ishan Jindal, Matthew Nokleby, Xue-wen Chen
Large datasets often have unreliable labels, such as those obtained from Amazon's Mechanical Turk or social media platforms, and classifiers trained on mislabeled datasets often exhibit poor performance.
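One common way to make training robust to such noise, and roughly the flavor of approach taken in this line of work, is to append a label-noise transition layer to the classifier; whether this matches the paper's exact architecture is an assumption:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def noisy_forward(logits, T):
    """Pass clean class posteriors through a row-stochastic noise
    transition matrix T, where T[i, j] = P(observed label j | true label i).
    Training against noisy labels then fits T alongside the classifier,
    so the underlying classifier learns the clean label distribution."""
    return softmax(logits) @ T
```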
no code implementations • 7 May 2017 • Ishan Jindal, Matthew Nokleby
We study the classification performance of Kronecker-structured models in two asymptotic regimes and develop an algorithm for separable, fast, and compact K-S dictionary learning that exploits the structure in multidimensional signals for better classification and representation.
no code implementations • 25 Apr 2017 • Matthew Nokleby, Waheed U. Bajwa
Motivated by machine learning applications in networks of sensors, internet-of-things (IoT) devices, and autonomous agents, we propose techniques for distributed stochastic convex learning from high-rate data streams.
no code implementations • 8 May 2016 • Matthew Nokleby, Ahmad Beirami, Robert Calderbank
We provide lower and upper bounds on the rate-distortion function, using $L_p$ loss as the distortion measure, of a maximum a priori classifier in terms of the differential entropy of the posterior distribution and a quantity called the interpolation dimension, which characterizes the complexity of the parametric distribution family.