1 code implementation • 15 Jan 2024 • Jeff Calder, Nadejda Drenska
In this paper we give a broad overview of the intersection of partial differential equations (PDEs) and graph-based semi-supervised learning.
1 code implementation • 19 Jul 2023 • James Chapman, Bohan Chen, Zheng Tan, Jeff Calder, Kevin Miller, Andrea L. Bertozzi
Active learning improves the performance of machine learning methods by judiciously selecting a limited number of unlabeled data points to query for labels, with the aim of maximally improving the underlying classifier's performance.
1 code implementation • 27 Oct 2022 • Kevin Miller, Jeff Calder
We show that uncertainty sampling is sufficient to achieve exploration versus exploitation in graph-based active learning, as long as the measure of uncertainty properly aligns with the underlying model and the model properly reflects uncertainty in unexplored regions.
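A minimal sketch of uncertainty sampling in this setting, using a smallest-margin criterion (the margin rule, variable names, and toy probabilities below are illustrative assumptions, not the paper's specific acquisition function):

```python
# Margin-based uncertainty sampling: query the unlabeled node whose
# top-two class probabilities are closest (smallest margin = most uncertain).

def uncertainty_sample(probs, unlabeled):
    """probs: node -> list of class probabilities; returns the node to query."""
    def margin(i):
        p = sorted(probs[i], reverse=True)
        return p[0] - p[1]
    return min(unlabeled, key=margin)

probs = {0: [0.9, 0.1], 1: [0.55, 0.45], 2: [0.7, 0.3]}
query = uncertainty_sample(probs, [0, 1, 2])
print(query)  # node 1 has the smallest margin (0.1)
```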
no code implementations • 6 Sep 2022 • Jeff Calder, Reed Coil, Annie Melton, Peter J. Olver, Gilbert Tostevin, Katrina Yezzi-Woodley
Machine learning (ML), now widely accessible to the research community at large, has fostered a proliferation of new and striking applications of these emergent mathematical techniques across a wide range of disciplines.
1 code implementation • 20 May 2022 • Katrina Yezzi-Woodley, Alexander Terwilliger, Jiafeng Li, Eric Chen, Martha Tappen, Jeff Calder, Peter J. Olver
Distinguishing agents of bone modification at paleoanthropological sites is at the root of much of the research directed at understanding early hominin exploitation of large animal resources and the effects those subsistence behaviors had on early hominin evolution.
1 code implementation • 5 May 2022 • Katrina Yezzi-Woodley, Jeff Calder, Mckenzie Sweno, Chloe Siewert, Peter J. Olver
Within anthropology, the use of three-dimensional (3D) imaging has become increasingly standard and widespread since it broadens the available avenues for addressing a wide range of key issues.
1 code implementation • 31 Mar 2022 • Kevin Miller, John Mauro, Jason Setiadi, Xoaquin Baca, Zhan Shi, Jeff Calder, Andrea L. Bertozzi
We use a Convolutional Neural Network Variational Autoencoder (CNNVAE) to embed SAR data into a feature space, and then construct a similarity graph from the embedded data and apply graph-based semi-supervised learning techniques.
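A sketch of the similarity-graph construction step (the CNNVAE embedding itself is omitted; `features` stands in for its output, and the choices of `k` and `sigma` are illustrative assumptions):

```python
import numpy as np

# Build a Gaussian-weighted k-NN similarity graph from embedded feature vectors.

def knn_similarity_graph(features, k=2, sigma=1.0):
    n = len(features)
    # pairwise squared distances between feature vectors
    d2 = np.sum((features[:, None, :] - features[None, :, :]) ** 2, axis=-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]          # skip self at index 0
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2 * sigma**2))
    return np.maximum(W, W.T)                       # symmetrize

features = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
W = knn_similarity_graph(features)
print(W.shape)  # (4, 4); nearby points get much larger weights
```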
1 code implementation • 17 Feb 2022 • Jeff Calder, Mahmood Ettehad
We show that the $p$-eikonal equation with $p=1$ is a provably robust distance-type function on a graph, and the $p\to \infty$ limit recovers shortest path distances.
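For reference, the limiting object in the $p\to\infty$ statement is the ordinary shortest-path (geodesic) distance on the graph; a plain Dijkstra computation of it might look as follows (the finite-$p$ equation itself is not implemented here):

```python
import heapq

# Single-source shortest-path distances on a weighted graph (Dijkstra),
# i.e. the distance recovered in the p -> infinity limit.

def shortest_path_distances(adj, source):
    dist = {v: float("inf") for v in adj}
    dist[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                       # stale queue entry
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

adj = {"a": [("b", 1.0), ("c", 4.0)], "b": [("c", 1.0)], "c": []}
dist = shortest_path_distances(adj, "a")
print(dist)  # c is reached via b at distance 2.0, not 4.0 directly
```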
1 code implementation • 24 Nov 2021 • Leon Bungert, Jeff Calder, Tim Roith
In this work we prove uniform convergence rates for solutions of the graph infinity Laplace equation as the number of vertices grows to infinity.
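The graph infinity-Laplace equation underlying this result can be sketched with a simple midpoint iteration: at each unlabeled node, replace the value by the average of the largest and smallest neighboring values, keeping labels fixed (the unweighted update below is a simplification of the weighted operator analyzed in the paper):

```python
# Lipschitz learning via the unweighted graph infinity-Laplacian:
# iterate u(x) <- (max over neighbors + min over neighbors) / 2.

def lipschitz_learning(nbrs, labels, iters=500):
    u = {v: labels.get(v, 0.0) for v in nbrs}
    for _ in range(iters):
        for v in nbrs:
            if v not in labels:
                vals = [u[w] for w in nbrs[v]]
                u[v] = 0.5 * (max(vals) + min(vals))
    return u

# Path graph 0-1-2-3 with u(0)=0, u(3)=1: the solution interpolates linearly.
nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
u = lipschitz_learning(nbrs, {0: 0.0, 3: 1.0})
print(round(u[1], 3), round(u[2], 3))  # approx 0.333, 0.667
```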
1 code implementation • 5 Nov 2021 • Jeff Calder, Sangmin Park, Dejan Slepčev
We introduce new estimators for the normal vector to the boundary, distance of a point to the boundary, and a test for whether a point lies within a boundary strip.
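One natural estimator in this spirit averages the offsets from a point to its neighbors in a small ball: deep inside the domain the average is near zero, while near the boundary it points inward, so its negation estimates the outward normal. (The paper's estimators differ in their precise normalization; the radius and sample size below are illustrative.)

```python
import numpy as np

# Neighbor-mean-offset boundary test on a uniform point cloud in the unit square.

rng = np.random.default_rng(1)
pts = rng.uniform(size=(4000, 2))

def neighbor_mean_offset(x, pts, r=0.1):
    d = np.linalg.norm(pts - x, axis=1)
    nbrs = pts[(d < r) & (d > 0)]
    return (nbrs - x).mean(axis=0)          # small inside; inward near boundary

interior = neighbor_mean_offset(np.array([0.5, 0.5]), pts)
edge = neighbor_mean_offset(np.array([0.0, 0.5]), pts)
print(np.round(interior, 2), np.round(edge, 2))
# the interior offset is near zero; the edge offset has a clear +x (inward)
# component, so -edge/||edge|| estimates the outward normal
```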
no code implementations • 10 Nov 2020 • Katrina Yezzi-Woodley, Jeff Calder, Peter J. Olver, Annie Melton, Paige Cody, Thomas Huffstutler, Alexander Terwilliger, Martha Tappen, Reed Coil, Gilbert Tostevin
The contact goniometer is a commonly used tool in lithic and zooarchaeological analysis, despite suffering from a number of shortcomings due to the physical interaction between the measuring implement, the object being measured, and the individual taking the measurements.
no code implementations • 31 Aug 2020 • Jeff Calder, Nadejda Drenska
The prediction problem is played (in part) over a discrete graph called the $d$-dimensional de Bruijn graph, where $d$ is the number of days of history used by the experts.
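The de Bruijn graph is easy to construct explicitly: each node is a length-$d$ string over the alphabet of daily outcomes, and each edge shifts the history window forward by one day (the binary alphabet below is an illustrative choice):

```python
from itertools import product

# The d-dimensional de Bruijn graph over a binary alphabet.

def de_bruijn_graph(d, alphabet="01"):
    nodes = ["".join(s) for s in product(alphabet, repeat=d)]
    edges = {s: [s[1:] + b for b in alphabet] for s in nodes}
    return nodes, edges

nodes, edges = de_bruijn_graph(2)
print(len(nodes), edges["01"])  # 4 nodes; "01" -> ["10", "11"]
```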
no code implementations • 31 Jul 2020 • Nadejda Drenska, Jeff Calder
We consider the problem with history-dependent experts, in which each expert uses the previous $d$ days of history of the market in making their predictions.
no code implementations • 13 Jul 2020 • Jeff Calder, Nicolas Garcia Trillos, Marta Lewicka
As a byproduct of our general regularity results, we obtain high probability $L^\infty$ and approximate $\mathcal{C}^{0, 1}$ convergence rates for the convergence of graph Laplacian eigenvectors towards eigenfunctions of the corresponding weighted Laplace-Beltrami operators.
1 code implementation • ICML 2020 • Jeff Calder, Brendan Cook, Matthew Thorpe, Dejan Slepcev
We propose a new framework, called Poisson learning, for graph-based semi-supervised learning at very low label rates.
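The core idea can be sketched as follows: place centered point sources at the labeled nodes and solve the graph Poisson equation $Lu = b$ by the fixed-point iteration $u \leftarrow u + D^{-1}(b - Lu)$, then classify by the largest component. The tiny two-cluster graph and its weights are illustrative, not from the paper's experiments:

```python
import numpy as np

# Poisson learning sketch on a 4-node graph: two clusters joined by a weak edge.
W = np.array([[0.0, 1.0, 0.0, 0.0],
              [1.0, 0.0, 0.1, 0.0],
              [0.0, 0.1, 0.0, 1.0],
              [0.0, 0.0, 1.0, 0.0]])
D = np.diag(W.sum(axis=1))
L = D - W

labels = {0: 0, 3: 1}                      # node -> class
Y = np.zeros((4, 2))
for i, c in labels.items():
    Y[i, c] = 1.0
ybar = Y[list(labels)].mean(axis=0)        # mean label vector
b = np.zeros((4, 2))
for i in labels:
    b[i] = Y[i] - ybar                     # centered point sources

u = np.zeros((4, 2))
Dinv = np.linalg.inv(D)
for _ in range(300):
    u = u + Dinv @ (b - L @ u)             # fixed-point iteration for Lu = b

print(u.argmax(axis=1))  # recovers the two clusters: [0 0 1 1]
```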
no code implementations • 4 Jun 2020 • Jeff Calder, Dejan Slepčev, Matthew Thorpe
The proofs of our well-posedness results use the random walk interpretation of Laplacian learning and PDE arguments, while the proofs of the ill-posedness results use $\Gamma$-convergence tools from the calculus of variations.
1 code implementation • 24 Jan 2020 • Amber Yuan, Jeff Calder, Braxton Osting
In this paper, we propose a new framework for rigorously studying continuum limits of learning algorithms on directed graphs.
no code implementations • 29 Oct 2019 • Jeff Calder, Nicolas Garcia Trillos
In this paper we improve the spectral convergence rates for graph-based approximations of Laplace-Beltrami operators constructed from random data.
1 code implementation • 6 May 2019 • Riley O'Neill, Pedro Angulo-Umana, Jeff Calder, Bo Hessburg, Peter J. Olver, Chehrzad Shakiban, Katrina Yezzi-Woodley
We show how to compute the circular area invariant of planar curves, and the spherical volume invariant of surfaces, in terms of line and surface integrals, respectively.
no code implementations • ICLR 2019 • Adam M. Oberman, Jeff Calder
We show that if the usual training loss is augmented by a Lipschitz regularization term, then the networks generalize.
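A minimal sketch of such an augmented loss, for the special case of a linear model where the Lipschitz constant (in the 2-norm) is exactly the spectral norm of the weight matrix (the regularization weight `lam` and the data are illustrative assumptions, not values from the paper):

```python
import numpy as np

# Training loss augmented by a Lipschitz regularization term, linear-model case.

def regularized_loss(W, X, Y, lam=0.1):
    residual = X @ W.T - Y
    data_loss = np.mean(residual ** 2)
    lipschitz = np.linalg.norm(W, ord=2)   # spectral norm = Lipschitz constant
    return data_loss + lam * lipschitz

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))
W = rng.normal(size=(2, 3))
Y = X @ W.T                                # perfect fit: data term is zero
loss = regularized_loss(W, X, Y)
print(loss)                                # only the Lipschitz penalty remains
```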
no code implementations • 15 Jan 2019 • Mauricio Flores, Jeff Calder, Gilad Lerman
In the first part of the paper we prove new discrete to continuum convergence results for $p$-Laplace problems on $k$-nearest neighbor ($k$-NN) graphs, which are more commonly used in practice than random geometric graphs.
no code implementations • 10 Oct 2018 • Jeff Calder, Dejan Slepcev
The performance of traditional graph Laplacian methods for semi-supervised learning degrades substantially as the ratio of labeled to unlabeled data decreases, due to a degeneracy in the graph Laplacian.
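The traditional method in question can be sketched directly: fix the labels and solve the harmonic equation $L_{uu} u_u = -L_{ul} y_l$ for the unlabeled nodes. On the illustrative 4-node graph below this works well; the degeneracy appears when one label sits among many unlabeled nodes and the solution flattens toward a constant:

```python
import numpy as np

# Standard Laplacian learning: harmonic extension of the labels.
W = np.array([[0.0, 1.0, 0.2, 0.0],
              [1.0, 0.0, 1.0, 0.2],
              [0.2, 1.0, 0.0, 1.0],
              [0.0, 0.2, 1.0, 0.0]])
L = np.diag(W.sum(axis=1)) - W

labeled, unlabeled = [0, 3], [1, 2]
y = np.array([0.0, 1.0])                      # labels at nodes 0 and 3
L_uu = L[np.ix_(unlabeled, unlabeled)]
L_ul = L[np.ix_(unlabeled, labeled)]
u = np.linalg.solve(L_uu, -L_ul @ y)          # harmonic extension
print(np.round(u, 3))  # [0.375 0.625]: interpolates between the two labels
```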
1 code implementation • 2 Oct 2018 • Jeff Calder, Anthony Yezzi
This paper provides a rigorous convergence rate and complexity analysis for a recently introduced framework, called PDE acceleration, for solving problems in the calculus of variations, and explores applications to obstacle problems.
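The flavor of PDE acceleration can be sketched on a 1-D obstacle problem: descend the Dirichlet energy with damped wave (heavy-ball) dynamics $u_{tt} + a\,u_t = u_{xx}$, projecting onto the constraint $u \ge \psi$ after each step. Grid size, damping $a$, and step size are illustrative choices, not the tuned parameters from the paper:

```python
import numpy as np

# PDE acceleration sketch for a 1-D obstacle problem with zero boundary values.
n, dt, a = 50, 0.01, 1.0
dx = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
obstacle = 0.3 - 2.0 * (x - 0.5) ** 2        # the membrane must stay above this
u = np.zeros(n)                               # u(0) = u(1) = 0
v = np.zeros(n)
for _ in range(8000):
    lap = np.zeros(n)
    lap[1:-1] = (u[:-2] - 2 * u[1:-1] + u[2:]) / dx**2
    v = (1 - a * dt) * v + dt * lap           # damped wave velocity update
    u[1:-1] += dt * v[1:-1]
    u = np.maximum(u, obstacle)               # project onto the constraint
    v[u <= obstacle] = np.maximum(v[u <= obstacle], 0.0)  # drop inward velocity
print(round(float(u[n // 2]), 3))             # settles near the obstacle peak
```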
Numerical Analysis • Analysis of PDEs • Dynamical Systems • Optimization and Control • MSC: 65M06, 35Q93, 65K10, 49K20
no code implementations • 28 Aug 2018 • Chris Finlay, Jeff Calder, Bilal Abbasi, Adam Oberman
In this work we study input gradient regularization of deep neural networks, and demonstrate that such regularization leads to generalization proofs and improved adversarial robustness.
no code implementations • 28 Nov 2017 • Jeff Calder
We study the game-theoretic p-Laplacian for semi-supervised learning on graphs, and show that it is well-posed in the limit of finite labeled data and infinite unlabeled data.
no code implementations • 28 Oct 2017 • Jeff Calder
We study the consistency of Lipschitz learning on graphs in the limit of infinite unlabeled data and finite labeled data.
no code implementations • 15 Aug 2016 • Bilal Abbasi, Jeff Calder, Adam M. Oberman
We propose in this paper a fast real-time streaming version of the PDA algorithm for anomaly detection that exploits the computational advantages of PDE continuum limits.
no code implementations • 20 Aug 2015 • Ko-Jen Hsiao, Kevin S. Xu, Jeff Calder, Alfred O. Hero III
If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination.
no code implementations • 21 Feb 2014 • Ko-Jen Hsiao, Jeff Calder, Alfred O. Hero III
Most content-based image retrieval systems consider either one single query, or multiple queries that include the same object or represent the same semantic information.
no code implementations • NeurIPS 2012 • Ko-Jen Hsiao, Kevin Xu, Jeff Calder, Alfred O. Hero
In such a case, multiple criteria can be defined, and one can test for anomalies by scalarizing the multiple criteria, i.e., taking a linear combination of them.
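A minimal sketch of this scalarization step (the two criteria and their scores are invented for illustration; the point is that the top-ranked anomaly depends on the weights, which is why the computation must be repeated when they are unknown):

```python
# Scalarize several anomaly criteria into one score via a linear combination.

def scalarize(criteria, weights):
    """criteria: list of per-item score lists; weights: one per criterion."""
    n = len(criteria[0])
    return [sum(w * c[i] for w, c in zip(weights, criteria)) for i in range(n)]

dist_scores = [0.1, 0.9, 0.2]      # criterion 1 (one dissimilarity measure)
shape_scores = [0.8, 0.1, 0.3]     # criterion 2 (another dissimilarity measure)
for weights in [(1.0, 0.0), (0.0, 1.0)]:
    combined = scalarize([dist_scores, shape_scores], weights)
    print(weights, max(range(3), key=lambda i: combined[i]))
# the top-ranked anomaly flips from item 1 to item 0 as the weights change
```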