no code implementations • 8 Apr 2024 • Kui Qian, Litao Qiao, Beth Friedman, Edward O'Donnell, David Kleinfeld, Yoav Freund
It is based on the automated recognition of interpretable features obtained by analyzing the shapes of cells.
no code implementations • 27 Jun 2023 • Robi Bhattacharjee, Alexander Cloninger, Yoav Freund, Andreas Oslandsbotn
One attractive application of ER is to point clouds, i.e., graphs whose vertices correspond to IID samples from a distribution over a metric space.
no code implementations • 5 Mar 2023 • Sanjoy Dasgupta, Yoav Freund
We present a general-purpose active learning scheme for data in metric spaces.
no code implementations • 28 Feb 2022 • Robi Bhattacharjee, Alex Cloninger, Yoav Freund, Andreas Oslandsbotn
Effective resistance (ER) is an attractive way to interrogate the structure of graphs.
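For reference, the standard definition of effective resistance that this line of work builds on can be computed from the pseudoinverse of the graph Laplacian via $R(u,v) = (e_u - e_v)^\top L^+ (e_u - e_v)$. A minimal sketch of that identity (not code from the paper, which is concerned with approximating ER efficiently):

```python
import numpy as np

def effective_resistance(adj, u, v):
    """Effective resistance between nodes u and v of a weighted graph.

    Uses the identity R(u, v) = (e_u - e_v)^T L^+ (e_u - e_v), where
    L^+ is the Moore-Penrose pseudoinverse of the graph Laplacian.
    """
    adj = np.asarray(adj, dtype=float)
    degree = np.diag(adj.sum(axis=1))
    laplacian = degree - adj
    l_pinv = np.linalg.pinv(laplacian)
    e = np.zeros(len(adj))
    e[u], e[v] = 1.0, -1.0
    return e @ l_pinv @ e

# Path graph 0-1-2 with unit edges: resistance between the endpoints
# is 2 (two unit resistors in series).
path = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
print(effective_resistance(path, 0, 2))
```

The dense pseudoinverse costs $O(n^3)$, which is exactly why approximation schemes for large graphs are of interest.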
no code implementations • 5 Oct 2021 • Yoav Freund, Yi-An Ma, Tong Zhang
There has been a surge of works bridging MCMC sampling and optimization, with a specific focus on translating non-asymptotic convergence guarantees for optimization problems into the analysis of Langevin algorithms in MCMC sampling.
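The Langevin algorithms referred to here are, in their simplest textbook form, gradient descent plus injected Gaussian noise: $x_{k+1} = x_k - \eta\,\nabla U(x_k) + \sqrt{2\eta}\,\xi_k$ samples approximately from $\propto e^{-U(x)}$. A generic sketch of that scheme (an illustration of the setting, not the paper's analysis):

```python
import numpy as np

def unadjusted_langevin(grad_u, x0, step=0.01, n_steps=5000, rng=None):
    """Unadjusted Langevin algorithm (ULA) for sampling from exp(-U(x)):
    x_{k+1} = x_k - step * grad_u(x_k) + sqrt(2*step) * xi_k, xi_k ~ N(0, I).
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps,) + x.shape)
    for k in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_u(x) + np.sqrt(2 * step) * noise
        samples[k] = x
    return samples

# Target: standard Gaussian, U(x) = x^2 / 2, so grad_u(x) = x.
# After burn-in the empirical mean and std should be close to 0 and 1.
s = unadjusted_langevin(lambda x: x, np.zeros(1), step=0.05, n_steps=20000)
print(s[2000:].mean(), s[2000:].std())
```

The discretization bias of ULA (the empirical variance is slightly above 1 for a finite step size) is one of the quantities that the optimization-to-sampling translations mentioned above make precise.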
no code implementations • 23 Aug 2021 • Andreas Oslandsbotn, Zeljko Kereta, Valeriya Naumova, Yoav Freund, Alexander Cloninger
With a novel sub-sampling scheme, StreaMRAK reduces memory and computational complexities by creating a sketch of the original data, where the sub-sampling density is adapted to the bandwidth of the kernel and the local dimensionality of the data.
no code implementations • 20 Jun 2021 • Yoav Freund
We study a family of potential functions for online learning.
no code implementations • 2 Mar 2021 • Peter Gerstoft, Yihan Hu, Michael J. Bianco, Chaitanya Patil, Ardel Alegre, Yoav Freund, Francois Grondin
The DOAs are fed to a fusion center, concatenated, and used to perform localization based on two proposed methods, which require only a few labeled source locations (anchor points) for training.
no code implementations • 15 Jul 2020 • Julaiti Alafate, Yoav Freund, David T. Sandwell, Brook Tozer
We describe an application of machine learning to a real-world computer assisted labeling task.
1 code implementation • NeurIPS 2019 • Akshay Balsubramani, Sanjoy Dasgupta, Yoav Freund, Shay Moran
We introduce a variant of the $k$-nearest neighbor classifier in which $k$ is chosen adaptively for each query, rather than supplied as a parameter.
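One simple way to realize a query-adaptive choice of $k$ is to grow $k$ until the neighbors' majority is large relative to its sampling noise. The sketch below illustrates that idea with a stopping rule of the form $|\sum_{i \le k} y_i| \ge a\sqrt{k}$; it is a hypothetical simplification, not the paper's exact rule:

```python
import numpy as np

def adaptive_knn_predict(X, y, query, a=2.0):
    """Toy sketch of query-adaptive k-NN for labels in {-1, +1}.

    Grows k and stops at the first k where the neighbor labels show a
    majority that is large relative to its sampling noise:
    |sum of k nearest labels| >= a * sqrt(k).
    """
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    order = np.argsort(np.linalg.norm(X - query, axis=1))
    for k in range(1, len(y) + 1):
        total = y[order[:k]].sum()
        if abs(total) >= a * np.sqrt(k):
            return (1 if total > 0 else -1), k
    total = y.sum()
    return (1 if total >= 0 else -1), len(y)  # fall back to global majority
```

Near a dense, pure region the rule stops at a small k; near the decision boundary the running sum stays close to zero and k keeps growing, which is the adaptivity the abstract describes.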
2 code implementations • NeurIPS 2019 • Julaiti Alafate, Yoav Freund
State-of-the-art implementations of boosting, such as XGBoost and LightGBM, can process large training sets extremely fast.
no code implementations • 19 May 2018 • Julaiti Alafate, Yoav Freund
We present a novel approach for parallel computation in the context of machine learning that we call "Tell Me Something New" (TMSN).
no code implementations • 9 Mar 2018 • Yuncong Chen, David Kleinfeld, Martyn Goulding, Yoav Freund
In this work we describe the first steps in developing a semi-automated system to construct a histology atlas of the mouse brainstem that combines atlas-guided annotation, landmark-based registration, and atlas generation in an iterative framework.
no code implementations • 28 Feb 2017 • Yuncong Chen, Lauren McElvain, Alex Tolpygo, Daniel Ferrante, Harvey Karten, Partha Mitra, David Kleinfeld, Yoav Freund
We have developed a digital atlas methodology that combines information about the 3D organization of the brain and the detailed texture of neurons in different structures.
1 code implementation • 28 May 2016 • Akshay Balsubramani, Yoav Freund
We explore a novel approach to semi-supervised learning.
no code implementations • 12 Apr 2016 • Raef Bassily, Yoav Freund
We show that typical stability can control generalization error in adaptive data analysis even when the samples in the dataset are not necessarily independent and the queries to be computed are not necessarily of bounded sensitivity, as long as the results of the queries over the dataset (i.e., the computed statistics) follow a distribution with a "light" tail.
1 code implementation • NeurIPS 2016 • Akshay Balsubramani, Yoav Freund
We address the problem of aggregating an ensemble of predictors with known loss bounds in a semi-supervised binary classification setting, to minimize prediction loss incurred on the unlabeled data.
1 code implementation • NeurIPS 2015 • Akshay Balsubramani, Yoav Freund
We present and empirically evaluate an efficient algorithm that learns to aggregate the predictions of an ensemble of binary classifiers.
1 code implementation • 5 Mar 2015 • Akshay Balsubramani, Yoav Freund
We develop a worst-case analysis of aggregation of classifier ensembles for binary classification.
no code implementations • NeurIPS 2013 • Akshay Balsubramani, Sanjoy Dasgupta, Yoav Freund
We consider a situation in which we see samples in $\mathbb{R}^d$ drawn i.i.d.
no code implementations • 15 Jan 2015 • Akshay Balsubramani, Yoav Freund
We consider using an ensemble of binary classifiers for transductive prediction, when unlabeled test data are known in advance.
no code implementations • 9 Sep 2014 • Sunsern Cheamanunkul, Evan Ettinger, Yoav Freund
The sensitivity of Adaboost to random label noise is a well-studied problem.
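The mechanism behind this sensitivity is visible in AdaBoost's weight update $D_{t+1}(i) \propto D_t(i)\, e^{-\alpha_t y_i h_t(x_i)}$: a point the current stump misclassifies, including a mislabeled one, has its weight multiplied by $e^{\alpha_t}$, so persistent label noise accumulates exponential weight. A minimal AdaBoost with 1-D threshold stumps showing this (a generic textbook sketch, not the paper's algorithm):

```python
import numpy as np

def adaboost_1d(x, y, rounds=10):
    """Minimal AdaBoost with 1-D threshold stumps over labels in {-1, +1}.

    Each round picks the weighted-error-minimizing stump, then reweights:
    misclassified points gain weight by exp(alpha), correct ones lose it.
    """
    n = len(x)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        best = None
        for thr in x:                # candidate thresholds at data points
            for sign in (1, -1):
                pred = sign * np.where(x <= thr, 1, -1)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, thr, sign)
        err, thr, sign = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = sign * np.where(x <= thr, 1, -1)
        w = w * np.exp(-alpha * y * pred)       # exponential reweighting
        w = w / w.sum()
        ensemble.append((alpha, thr, sign))
    return ensemble, w
```

On six points with one flipped label, a single round already pushes half of the total weight onto the noisy point, illustrating why random label noise is so damaging to the exponential loss.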
no code implementations • 9 Sep 2014 • Sunsern Cheamanunkul, Yoav Freund
We found that both machine adaptation and human adaptation have significant impact on the input rate and must be considered together in order to improve the efficiency of the system as a whole.
no code implementations • NeurIPS 2009 • Kamalika Chaudhuri, Yoav Freund, Daniel J. Hsu
Previous algorithms for learning in this framework have a tunable learning rate parameter, and a major barrier to using online learning in practical applications is that it is not understood how to set this parameter optimally, particularly when the number of actions is large.
no code implementations • NeurIPS 2007 • Yoav Freund, Sanjoy Dasgupta, Mayank Kabra, Nakul Verma
We present a simple variant of the k-d tree which automatically adapts to intrinsic low dimensional structure in data.
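The core construction here is a random-projection tree: instead of cycling through coordinate axes as a k-d tree does, each cell is split at the median along a random direction. A simplified sketch of that recursion (the paper's full rule also perturbs the split point and mixes in a distance-based split, both omitted here):

```python
import numpy as np

def build_rp_tree(points, min_leaf=5, rng=None):
    """Simplified random-projection tree: split each cell along a
    random unit direction at the median projection; recurse until
    cells hold at most min_leaf points.
    """
    rng = rng or np.random.default_rng(0)
    points = np.asarray(points, dtype=float)
    if len(points) <= min_leaf:
        return {"leaf": True, "points": points}
    direction = rng.normal(size=points.shape[1])
    direction /= np.linalg.norm(direction)
    proj = points @ direction
    threshold = np.median(proj)
    left, right = points[proj <= threshold], points[proj > threshold]
    if len(left) == 0 or len(right) == 0:   # degenerate split: stop
        return {"leaf": True, "points": points}
    return {"leaf": False, "direction": direction, "threshold": threshold,
            "left": build_rp_tree(left, min_leaf, rng),
            "right": build_rp_tree(right, min_leaf, rng)}
```

Because the split directions are random rather than axis-aligned, cell diameters shrink at a rate governed by the data's intrinsic dimension rather than the ambient dimension, which is the adaptivity the abstract claims.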