1 code implementation • 25 Dec 2024 • Yanna Ding, Zijie Huang, Malik Magdon-Ismail, Jianxi Gao
To address these gaps, we propose a novel framework for learning network dynamics directly from observed time-series data, when prior knowledge of graph topology or governing dynamical equations is absent.
no code implementations • 3 Sep 2023 • Yun Lu, Malik Magdon-Ismail, Yu Wei, Vassilis Zikas
Differential Privacy (DP) (and its variants) is the most common method for machine learning (ML) on privacy-sensitive data.
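The core DP primitive referenced here can be illustrated with the classic Laplace mechanism; the sketch below is a generic textbook example, not the protocol studied in this paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Release true_value plus Laplace noise with scale sensitivity/epsilon.

    This achieves epsilon-differential privacy for a query whose output
    changes by at most `sensitivity` when one record changes.
    """
    return true_value + rng.laplace(scale=sensitivity / epsilon)

# Counting query over a binary dataset: sensitivity 1, since adding or
# removing one record changes the count by at most 1.
data = rng.integers(0, 2, size=1000)
noisy_count = laplace_mechanism(int(data.sum()), sensitivity=1, epsilon=0.5, rng=rng)
print(f"true count = {data.sum()}, private release = {noisy_count:.1f}")
```

With epsilon = 0.5 the noise scale is 2, so the released count is typically within a few units of the truth while hiding any single individual's contribution.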
no code implementations • 12 May 2023 • Alex Gittens, Malik Magdon-Ismail
Open question: Can label complexity be reduced by $\Omega(n)$ with tight $(1+d/n)$-approximation?
no code implementations • 25 Aug 2021 • Georgios Mavroudeas, Guillaume Baudart, Alan Cha, Martin Hirzel, Jim A. Laredo, Malik Magdon-Ismail, Louis Mandel, Erik Wittern
GraphQL is a query language for APIs and a runtime for executing those queries, fetching the requested data from existing microservices, REST APIs, databases, or other sources.
1 code implementation • 16 Apr 2021 • Dong Hu, Alex Gittens, Malik Magdon-Ismail
Specifically, we consider a setting in which one can obtain either low-noise, high-cost observations of individual entries or high-noise, low-cost observations of entire columns.
no code implementations • 1 Sep 2020 • Prasanna Date, Christopher D. Carothers, John E. Mitchell, James A. Hendler, Malik Magdon-Ismail
We believe that deep neural networks (DNNs) whose learnable parameters are constrained to a finite set of discrete values, running on neuromorphic computing systems, would be instrumental for intelligent edge computing systems with these desirable characteristics.
no code implementations • 24 Aug 2020 • Liam Dowling Jones, Malik Magdon-Ismail, Laura Mersini-Houghton, Steven Meshnick
We present a new mathematical model that explicitly captures how three restriction measures (lockdown date and duration; social distancing and masks; and school and border closings) control the spread of COVID-19 infections $i(r, t)$.
1 code implementation • 17 Mar 2020 • Malik Magdon-Ismail
We present a robust data-driven machine learning analysis of the COVID-19 pandemic from its early infection dynamics, specifically infection counts over time.
no code implementations • 27 Sep 2019 • Malik Magdon-Ismail, Alex Gittens
We give a fast oblivious L2-embedding of $A\in \mathbb{R}^{n \times d}$ to $B\in \mathbb{R}^{r \times d}$ satisfying $(1-\varepsilon)\|Ax\|_2^2 \le \|Bx\|_2^2 \le (1+\varepsilon)\|Ax\|_2^2$. Our embedding dimension $r$ equals $d$, a constant independent of the distortion $\varepsilon$.
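The distortion guarantee can be checked numerically with a plain Gaussian sketch. This is an illustration only: a Gaussian sketch needs embedding dimension well above $d$, whereas the paper's construction is a fast oblivious embedding achieving $r = d$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, r = 2000, 10, 400  # hypothetical sizes; a generic sketch needs r >> d

A = rng.standard_normal((n, d))
S = rng.standard_normal((r, n)) / np.sqrt(r)  # Gaussian sketching matrix
B = S @ A                                     # embedded matrix, r x d

# Measure the worst observed distortion over random directions x:
# the ratio ||Bx||^2 / ||Ax||^2 should stay within 1 +/- epsilon.
worst = 0.0
for _ in range(100):
    x = rng.standard_normal(d)
    ratio = np.linalg.norm(B @ x) ** 2 / np.linalg.norm(A @ x) ** 2
    worst = max(worst, abs(ratio - 1.0))
print(f"max observed distortion ~= {worst:.3f}")
```

Since the distortion bound holds simultaneously for all $x$ in the column space, a modest sample of random directions already tracks the worst case closely.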
1 code implementation • 21 Feb 2019 • Aritra Chowdhury, Malik Magdon-Ismail, Bulent Yener
The agnostic and naive methodologies quantify, respectively, the error contribution and the error propagation from the computational steps, algorithms, and hyperparameters in the image classification pipeline.
no code implementations • 23 Jan 2019 • Maksim Tsikhanovich, Malik Magdon-Ismail, Muhammad Ishaq, Vassilis Zikas
We apply our methodology to two major ML algorithms, namely non-negative matrix factorization (NMF) and singular value decomposition (SVD).
no code implementations • 15 Jan 2019 • Malik Magdon-Ismail, Kshiteesh Hegde
We define the intrinsic scale at which a network begins to reveal its identity as the scale at which subgraphs in the network (created by a random walk) are distinguishable from similarly sized subgraphs in a perturbed copy of the network.
no code implementations • 15 Jan 2019 • Kshiteesh Hegde, Malik Magdon-Ismail
We study the problem of identifying different behaviors occurring in different parts of a large heterogeneous network.
no code implementations • 6 May 2018 • Stephen Notley, Malik Magdon-Ismail
In this study, we use neural networks to extract features from both images and numeric data, and feed these extracted features to other machine learning models, namely support vector machines (SVMs) and k-nearest-neighbor (KNN) classifiers, to test whether neural-network-extracted features enhance the capabilities of these models.
no code implementations • ICLR 2018 • Kshiteesh Hegde, Malik Magdon-Ismail, Ram Ramanathan, Bishal Thapa
We propose a novel subgraph image representation for classification of network fragments with the targets being their parent networks.
no code implementations • NeurIPS 2016 • Malik Magdon-Ismail, Christos Boutsidis
Principal components analysis~(PCA) is the optimal linear encoder of data.
no code implementations • 19 Feb 2016 • Ke Wu, Malik Magdon-Ismail
Multilayer networks have seen a resurgence under the umbrella of deep learning.
no code implementations • NeurIPS 2015 • Abhisek Kundu, Petros Drineas, Malik Magdon-Ismail
We show that for a wide class of optimization problems, if the sketch is close (in the spectral norm) to the original data matrix, then one can recover a near optimal solution to the optimization problem by using the sketch.
no code implementations • 2 Mar 2015 • Abhisek Kundu, Petros Drineas, Malik Magdon-Ismail
This paper addresses how well we can recover a data matrix when only given a few of its elements.
no code implementations • 23 Feb 2015 • Malik Magdon-Ismail, Christos Boutsidis
Principal components analysis (PCA) is the optimal linear auto-encoder of data, and it is often used to construct features.
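The optimality of PCA as a linear auto-encoder is the Eckart-Young theorem: projecting onto the top-$k$ right singular vectors minimizes reconstruction error among all rank-$k$ orthogonal projections. A minimal numerical check on generic data (not the paper's feature construction):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 20)) @ rng.standard_normal((20, 20))  # correlated toy data
Xc = X - X.mean(axis=0)   # PCA operates on centered data
k = 5

# PCA encoder/decoder: encode with the top-k right singular vectors Vk,
# decode by projecting back: Xc -> (Xc @ Vk) @ Vk.T
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Vk = Vt[:k].T
pca_err = np.linalg.norm(Xc - Xc @ Vk @ Vk.T, "fro")

# Any other rank-k orthogonal projection reconstructs no better (Eckart-Young).
Q, _ = np.linalg.qr(rng.standard_normal((20, k)))
rand_err = np.linalg.norm(Xc - Xc @ Q @ Q.T, "fro")
print(f"PCA error = {pca_err:.2f}, random projection error = {rand_err:.2f}")
```

The PCA reconstruction error is never larger than that of the random projection, which is exactly the "optimal linear auto-encoder" claim.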
no code implementations • 19 Feb 2015 • Malik Magdon-Ismail
We give a reduction from CLIQUE to establish that sparse PCA is NP-hard.
no code implementations • 1 Jun 2014 • Saurabh Paul, Malik Magdon-Ismail, Petros Drineas
In the unsupervised setting, we also provide worst-case guarantees on the radius of the minimum enclosing ball, thereby ensuring generalization comparable to that in the full feature space and resolving an open problem posed in Dasgupta et al. We present extensive experiments on real-world datasets to support our theory and to demonstrate that our method is competitive with, and often better than, prior state-of-the-art methods for which there are no known provable guarantees.
no code implementations • 26 Nov 2012 • Saurabh Paul, Christos Boutsidis, Malik Magdon-Ismail, Petros Drineas
Let $X$ be a data matrix of rank $\rho$, whose rows represent $n$ points in $d$-dimensional space.
no code implementations • 19 Jul 2012 • Kenneth L. Clarkson, Petros Drineas, Malik Magdon-Ismail, Michael W. Mahoney, Xiangrui Meng, David P. Woodruff
We provide fast algorithms for overconstrained $\ell_p$ regression and related problems: for an $n\times d$ input matrix $A$ and vector $b\in\mathbb{R}^n$, in $O(nd\log n)$ time we reduce the problem $\min_{x\in\mathbb{R}^d} \|Ax-b\|_p$ to the same problem with input matrix $\tilde A$ of dimension $s \times d$ and corresponding $\tilde b$ of dimension $s\times 1$.
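The sketch-and-solve pattern behind such reductions can be illustrated for $p = 2$ with a Gaussian sketch. This is a sketch under assumptions: the paper obtains its $O(nd\log n)$ running time from fast structured transforms, whereas a dense Gaussian sketch is slow but shows the same reduction.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, s = 5000, 8, 300  # overconstrained problem (n >> d), sketch size s

A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# Exact solution of min_x ||Ax - b||_2 on the full data.
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)

# Reduced problem: sketch both A and b down to s rows, then solve.
S = rng.standard_normal((s, n)) / np.sqrt(s)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# The sketched solution's residual on the ORIGINAL problem is near-optimal.
rel = np.linalg.norm(A @ x_sketch - b) / np.linalg.norm(A @ x_full - b)
print(f"residual ratio (sketched / exact) = {rel:.4f}")
```

The residual ratio is at least 1 by optimality of the exact solution, and close to 1 when $s$ is modestly larger than $d$.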
no code implementations • 16 Feb 2012 • Christos Boutsidis, Petros Drineas, Malik Magdon-Ismail
We study (constrained) least-squares regression as well as multiple-response least-squares regression, and ask whether a subset of the data, a coreset, suffices to compute a good approximate solution to the regression.
no code implementations • NeurIPS 2011 • Christos Boutsidis, Petros Drineas, Malik Magdon-Ismail
Principal Components Analysis~(PCA) is often used as a feature extraction procedure.
no code implementations • 26 Sep 2011 • Christos Boutsidis, Malik Magdon-Ismail
We study feature selection for $k$-means clustering.
no code implementations • NeurIPS 2010 • Malik Magdon-Ismail
We define a data-dependent permutation complexity for a hypothesis set $\mathcal{H}$, which is similar to a Rademacher complexity or maximum discrepancy.
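For reference, the empirical Rademacher complexity that the permutation complexity resembles can be estimated by Monte Carlo for a toy finite hypothesis set. This is a hedged illustration of the baseline quantity, not the paper's permutation complexity.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 50, 20  # sample size, number of hypotheses

# Toy finite hypothesis set: each hypothesis is a vector of +/-1
# predictions on the n fixed data points.
H = rng.choice([-1.0, 1.0], size=(m, n))

# Empirical Rademacher complexity:
#   E_sigma [ max_h (1/n) * sum_i sigma_i * h(x_i) ]
# estimated by averaging over random sign vectors sigma.
trials = 2000
total = 0.0
for _ in range(trials):
    sigma = rng.choice([-1.0, 1.0], size=n)
    total += np.max(H @ sigma) / n
rad = total / trials
print(f"estimated Rademacher complexity ~= {rad:.3f}")
```

For a finite class the estimate scales like $\sqrt{2\ln m / n}$, which is the standard Massart-style bound this quantity obeys.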
no code implementations • NeurIPS 2008 • Sanmay Das, Malik Magdon-Ismail
We study the profit-maximization problem of a monopolistic market-maker who sets two-sided prices in an asset market.