Search Results for author: Malik Magdon-Ismail

Found 27 papers, 4 papers with code

Learning GraphQL Query Costs (Extended Version)

no code implementations25 Aug 2021 Georgios Mavroudeas, Guillaume Baudart, Alan Cha, Martin Hirzel, Jim A. Laredo, Malik Magdon-Ismail, Louis Mandel, Erik Wittern

GraphQL is a query language for APIs and a runtime for executing those queries, fetching the requested data from existing microservices, REST APIs, databases, or other sources.

NoisyCUR: An algorithm for two-cost budgeted matrix completion

1 code implementation16 Apr 2021 Dong Hu, Alex Gittens, Malik Magdon-Ismail

Specifically, we consider that it is possible to obtain low noise, high cost observations of individual entries or high noise, low cost observations of entire columns.

Matrix Completion
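The two observation modes described above can be mimicked in a short numpy sketch. Everything here (matrix sizes, costs, noise levels, the half-and-half budget split) is an illustrative assumption, not the paper's algorithm; the sketch only shows the trade-off between precise, expensive entry samples and noisy, cheap column samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical low-rank ground truth (sizes and rank are illustrative only).
n, d, rank = 50, 30, 3
M = rng.standard_normal((n, rank)) @ rng.standard_normal((rank, d))

entry_cost, entry_noise = 10.0, 0.01   # low noise, high cost
col_cost, col_noise = 2.0, 1.0         # high noise, low cost
budget = 100.0

def observe_entry(i, j):
    """One precise but expensive observation of a single entry."""
    return M[i, j] + entry_noise * rng.standard_normal()

def observe_column(j):
    """One cheap but noisy observation of an entire column."""
    return M[:, j] + col_noise * rng.standard_normal(n)

# Spend the budget: half on columns, half on entries (a naive split,
# chosen only for illustration).
n_cols = int((budget / 2) / col_cost)
n_entries = int((budget / 2) / entry_cost)
cols = [observe_column(j) for j in rng.choice(d, n_cols, replace=False)]
entries = [observe_entry(i, j)
           for i, j in zip(rng.choice(n, n_entries), rng.choice(d, n_entries))]
```

A completion algorithm would then combine both sample types; how to split the budget between them is exactly the question the paper studies.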

Training Deep Neural Networks with Constrained Learning Parameters

no code implementations1 Sep 2020 Prasanna Date, Christopher D. Carothers, John E. Mitchell, James A. Hendler, Malik Magdon-Ismail

We believe that deep neural networks (DNNs), whose learning parameters are constrained to a finite set of discrete values and which run on neuromorphic computing systems, would be instrumental for intelligent edge computing systems with these desirable characteristics.

Edge-computing

A New Mathematical Model for Controlled Pandemics Like COVID-19 : AI Implemented Predictions

1 code implementation24 Aug 2020 Liam Dowling Jones, Malik Magdon-Ismail, Laura Mersini-Houghton, Steven Meshnick

We present a new mathematical model to explicitly capture the effects that three restriction measures (lockdown date and duration; social distancing and masks; and school and border closings) have in controlling the spread of COVID-19 infections $i(r, t)$.

Machine Learning the Phenomenology of COVID-19 From Early Infection Dynamics

1 code implementation17 Mar 2020 Malik Magdon-Ismail

We present a robust data-driven machine learning analysis of the COVID-19 pandemic from its early infection dynamics, specifically infection counts over time.

Fast Fixed Dimension L2-Subspace Embeddings of Arbitrary Accuracy, With Application to L1 and L2 Tasks

no code implementations27 Sep 2019 Malik Magdon-Ismail, Alex Gittens

We give a fast oblivious L2-embedding of $A\in \mathbb{R}^{n \times d}$ to $B\in \mathbb{R}^{r \times d}$ satisfying $(1-\varepsilon)\|Ax\|_2^2 \le \|Bx\|_2^2 \le (1+\varepsilon)\|Ax\|_2^2$. Our embedding dimension $r$ equals $d$, a constant independent of the distortion $\varepsilon$.
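The embedding guarantee can be checked numerically. The sketch below uses a plain Gaussian sketch with an oversized embedding dimension rather than the paper's fast fixed-dimension construction; the matrix sizes are illustrative assumptions, and the point is only to measure the distortion $\varepsilon$ empirically.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 2000, 5
A = rng.standard_normal((n, d))

# Illustrative Gaussian sketch (NOT the paper's fixed-dimension construction):
# B = S A with S an r x n Gaussian map, r chosen large enough that
# (1 - eps) ||Ax||^2 <= ||Bx||^2 <= (1 + eps) ||Ax||^2 holds w.h.p.
r = 500
S = rng.standard_normal((r, n)) / np.sqrt(r)
B = S @ A

# Estimate the distortion over random directions x.
ratios = []
for _ in range(100):
    x = rng.standard_normal(d)
    ratios.append(np.linalg.norm(B @ x) ** 2 / np.linalg.norm(A @ x) ** 2)
eps = max(abs(1 - min(ratios)), abs(max(ratios) - 1))
```

With a Gaussian sketch the distortion shrinks only as $r$ grows; the paper's contribution is achieving arbitrary accuracy at the fixed dimension $r = d$.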

Quantifying contribution and propagation of error from computational steps, algorithms and hyperparameter choices in image classification pipelines

1 code implementation21 Feb 2019 Aritra Chowdhury, Malik Magdon-Ismail, Bulent Yener

The agnostic and naive methodologies quantify the error contribution and propagation respectively from the computational steps, algorithms and hyperparameters in the image classification pipeline.

General Classification Hyperparameter Optimization +1

PD-ML-Lite: Private Distributed Machine Learning from Lightweight Cryptography

no code implementations23 Jan 2019 Maksim Tsikhanovich, Malik Magdon-Ismail, Muhammad Ishaq, Vassilis Zikas

We apply our methodology to two major ML algorithms, namely non-negative matrix factorization (NMF) and singular value decomposition (SVD).

Recommendation Systems

Network Lens: Node Classification in Topologically Heterogeneous Networks

no code implementations15 Jan 2019 Kshiteesh Hegde, Malik Magdon-Ismail

We study the problem of identifying different behaviors occurring in different parts of a large heterogeneous network.

Classification General Classification +1

The Intrinsic Scale of Networks is Small

no code implementations15 Jan 2019 Malik Magdon-Ismail, Kshiteesh Hegde

We define the intrinsic scale at which a network begins to reveal its identity as the scale at which subgraphs in the network (created by a random walk) are distinguishable from similar sized subgraphs in a perturbed copy of the network.
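A random-walk subgraph sample of the kind described above can be sketched in a few lines of plain Python. The toy graph, walk length, and seed are illustrative assumptions, not the paper's data or procedure.

```python
import random

# Toy undirected graph as an adjacency dict (illustrative only).
graph = {
    0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4],
    3: [1, 4, 5], 4: [2, 3, 5], 5: [3, 4],
}

def random_walk_subgraph(graph, start, steps, rng):
    """Collect the nodes visited by a random walk and return the
    subgraph they induce (edges with both endpoints visited)."""
    visited = {start}
    node = start
    for _ in range(steps):
        node = rng.choice(graph[node])
        visited.add(node)
    return {u: [v for v in graph[u] if v in visited] for u in visited}

rng = random.Random(42)
sub = random_walk_subgraph(graph, start=0, steps=10, rng=rng)
```

Comparing such samples against same-sized samples from a perturbed copy of the network is how the paper probes the scale at which a network becomes distinguishable.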

Examining the Use of Neural Networks for Feature Extraction: A Comparative Analysis using Deep Learning, Support Vector Machines, and K-Nearest Neighbor Classifiers

no code implementations6 May 2018 Stephen Notley, Malik Magdon-Ismail

In this study, we use neural networks to extract features from both images and numeric data and use these extracted features as inputs for other machine learning models, namely support vector machines (SVMs) and k-nearest neighbor classifiers (KNNs), in order to see if neural-network-extracted features enhance the capabilities of these models.

General Classification

Node-By-Node Greedy Deep Learning for Interpretable Features

no code implementations19 Feb 2016 Ke Wu, Malik Magdon-Ismail

Multilayer networks have seen a resurgence under the umbrella of deep learning.

Approximating Sparse PCA from Incomplete Data

no code implementations NeurIPS 2015 Abhisek Kundu, Petros Drineas, Malik Magdon-Ismail

We show that for a wide class of optimization problems, if the sketch is close (in the spectral norm) to the original data matrix, then one can recover a near optimal solution to the optimization problem by using the sketch.
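The flavor of this guarantee is easy to see numerically for the leading PCA direction: solving on a sketch that is spectrally close to $A$ yields a near-optimal objective value on $A$ itself. In the sketch below, the "sketch" matrix is simply $A$ plus small dense noise, an assumption made purely for illustration; the paper builds its sketches from incomplete data.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 200, 50
A = rng.standard_normal((n, d))

# A surrogate sketch that is close to A in spectral norm (illustrative;
# the paper constructs sketches from a few sampled elements of A).
A_sketch = A + 0.01 * rng.standard_normal((n, d))

# Solve the optimization problem on the sketch instead of on A: here the
# problem is the leading PCA direction (top right singular vector).
v_sketch = np.linalg.svd(A_sketch)[2][0]

opt = np.linalg.svd(A)[1][0]            # best achievable ||A v||, ||v|| = 1
achieved = np.linalg.norm(A @ v_sketch)  # what the sketch's solution attains
```

Because $\|A - \tilde{A}\|_2$ is small relative to $\|A\|_2$, `achieved` lands within a small additive error of `opt`.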

Recovering PCA from Hybrid-$(\ell_1,\ell_2)$ Sparse Sampling of Data Elements

no code implementations2 Mar 2015 Abhisek Kundu, Petros Drineas, Malik Magdon-Ismail

This paper addresses how well we can recover a data matrix when only given a few of its elements.

Optimal Sparse Linear Auto-Encoders and Sparse PCA

no code implementations23 Feb 2015 Malik Magdon-Ismail, Christos Boutsidis

Principal components analysis (PCA) is the optimal linear auto-encoder of data, and it is often used to construct features.
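The optimality claim can be illustrated numerically: encoding with the top-$k$ right singular vectors and decoding with their transpose reconstructs the data better than another rank-$k$ linear auto-encoder, here a random orthonormal basis. All sizes and the noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n, d, k = 300, 20, 5
# Approximately rank-k data plus a little noise (illustrative).
X = rng.standard_normal((n, k)) @ rng.standard_normal((k, d)) \
    + 0.05 * rng.standard_normal((n, d))
X = X - X.mean(axis=0)

# PCA as a linear auto-encoder: encode with the top-k right singular
# vectors V_k, decode with V_k^T.
V_k = np.linalg.svd(X, full_matrices=False)[2][:k]
pca_err = np.linalg.norm(X - X @ V_k.T @ V_k)

# Compare against a random rank-k linear auto-encoder.
Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
rand_err = np.linalg.norm(X - X @ Q @ Q.T)
```

The Eckart-Young theorem guarantees `pca_err` is the minimum over all rank-$k$ linear encoders; the paper's question is what happens when the encoder is additionally required to be sparse.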

NP-Hardness and Inapproximability of Sparse PCA

no code implementations19 Feb 2015 Malik Magdon-Ismail

We give a reduction from {\sc clique} to establish that sparse PCA is NP-hard.

Feature Selection for Linear SVM with Provable Guarantees

no code implementations1 Jun 2014 Saurabh Paul, Malik Magdon-Ismail, Petros Drineas

In the unsupervised setting, we also provide worst-case guarantees on the radius of the minimum enclosing ball, thereby ensuring generalization comparable to that in the full feature space and resolving an open problem posed in Dasgupta et al. We present extensive experiments on real-world datasets to support our theory and to demonstrate that our method is competitive with, and often better than, prior state-of-the-art methods, for which there are no known provable guarantees.

feature selection

The Fast Cauchy Transform and Faster Robust Linear Regression

no code implementations19 Jul 2012 Kenneth L. Clarkson, Petros Drineas, Malik Magdon-Ismail, Michael W. Mahoney, Xiangrui Meng, David P. Woodruff

We provide fast algorithms for overconstrained $\ell_p$ regression and related problems: for an $n\times d$ input matrix $A$ and vector $b\in\mathbb{R}^n$, in $O(nd\log n)$ time we reduce the problem $\min_{x\in\mathbb{R}^d} \|Ax-b\|_p$ to the same problem with input matrix $\tilde A$ of dimension $s \times d$ and corresponding $\tilde b$ of dimension $s\times 1$.

Near-optimal Coresets For Least-Squares Regression

no code implementations16 Feb 2012 Christos Boutsidis, Petros Drineas, Malik Magdon-Ismail

We study (constrained) least-squares regression as well as multiple response least-squares regression and ask the question of whether a subset of the data, a coreset, suffices to compute a good approximate solution to the regression.
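A minimal version of the coreset question can be posed in numpy: does a small subset of the rows yield nearly the same least-squares residual as the full data? The sketch below uses naive uniform row sampling (the paper's coresets are constructed far more carefully and come with provable guarantees); all sizes and sample counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n, d = 5000, 10
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# Full least-squares solution and its residual.
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
res_full = np.linalg.norm(A @ x_full - b)

# Naive coreset: solve on a uniform sample of the rows, then evaluate
# the resulting solution on the FULL data.
s = 500
idx = rng.choice(n, s, replace=False)
x_core, *_ = np.linalg.lstsq(A[idx], b[idx], rcond=None)
res_core = np.linalg.norm(A @ x_core - b)
```

For well-conditioned Gaussian data, uniform sampling already comes close to the full-data residual; the hard cases, where rows must be chosen by leverage or other scores, are what the near-optimal constructions in the paper address.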

Permutation Complexity Bound on Out-Sample Error

no code implementations NeurIPS 2010 Malik Magdon-Ismail

We define a data-dependent permutation complexity for a hypothesis set $\mathcal{H}$, which is similar to a Rademacher complexity or maximum discrepancy.

Adapting to a Market Shock: Optimal Sequential Market-Making

no code implementations NeurIPS 2008 Sanmay Das, Malik Magdon-Ismail

We study the profit-maximization problem of a monopolistic market-maker who sets two-sided prices in an asset market.
