Search Results for author: Ehsan Kazemi

Found 22 papers, 4 papers with code

Streaming Submodular Maximization under a k-Set System Constraint

no code implementations ICML 2020 Ran Haba, Ehsan Kazemi, Moran Feldman, Amin Karbasi

Moreover, we propose the first streaming algorithms for monotone submodular maximization subject to $k$-extendible and $k$-system constraints.

Data Summarization Movie Recommendation

Complementing Semi-Supervised Learning with Uncertainty Quantification

no code implementations22 Jul 2022 Ehsan Kazemi

To alleviate this problem, semi-supervised learning (SSL) leverages the knowledge of the classifier on the labeled domain and extrapolates it to the unlabeled domain, which is assumed to have a distribution similar to that of the annotated data.

CTIN: Robust Contextual Transformer Network for Inertial Navigation

1 code implementation3 Dec 2021 Bingbing Rao, Ehsan Kazemi, Yifan Ding, Devu M Shila, Frank M. Tucker, Liqiang Wang

Recently, data-driven inertial navigation approaches have demonstrated the capability of well-trained neural networks to obtain accurate position estimates from inertial measurement unit (IMU) measurements.

Multi-Task Learning

The Power of Subsampling in Submodular Maximization

no code implementations6 Apr 2021 Christopher Harshaw, Ehsan Kazemi, Moran Feldman, Amin Karbasi

We propose subsampling as a unified algorithmic technique for submodular maximization in centralized and online settings.

Movie Recommendation Video Summarization
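The subsampling idea above can be illustrated with a toy sketch (not the paper's actual algorithm): consider each element only with some probability p, so the number of value-oracle evaluations drops roughly by a factor of p. The coverage function and element names below are illustrative assumptions.

```python
import random

def sample_greedy(elements, f, k, p=0.5, seed=0):
    """Toy sketch of submodular maximization with subsampling: each
    element is considered only with probability p, cutting the number
    of oracle evaluations roughly by a factor of p."""
    rng = random.Random(seed)
    solution = []
    for e in elements:
        if len(solution) >= k:
            break
        if rng.random() > p:
            continue  # element skipped by the subsampling step
        # keep e only if its marginal gain is positive
        if f(solution + [e]) - f(solution) > 0:
            solution.append(e)
    return solution

# toy monotone submodular function: coverage of sets
universe_sets = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c"}, 4: {"d"}}
def coverage(S):
    covered = set()
    for e in S:
        covered |= universe_sets[e]
    return len(covered)

print(sample_greedy([1, 2, 3, 4], coverage, k=2, p=0.9))
```

The sketch keeps the one-pass flavor of the approach; the guarantees in the paper come from a more careful choice of the sampling rate and acceptance rule.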

Generating Structured Adversarial Attacks Using Frank-Wolfe Method

no code implementations15 Feb 2021 Ehsan Kazemi, Thomas Kerdreux, Liqiang Wang

White box adversarial perturbations are generated via iterative optimization algorithms most often by minimizing an adversarial loss on a $\ell_p$ neighborhood of the original image, the so-called distortion set.

Adversarial Robustness
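A minimal sketch of a Frank-Wolfe style loop on an $\ell_\infty$ distortion set (an assumed toy setting, not the paper's attack): the linear maximization oracle over the ball $\{x : \|x - x_0\|_\infty \le \epsilon\}$ is simply $x_0 + \epsilon\,\mathrm{sign}(\nabla)$, and the iterate stays feasible because each step is a convex combination.

```python
import numpy as np

def frank_wolfe_attack(x0, grad_fn, eps, steps=20):
    """Hedged sketch of a Frank-Wolfe style attack on an l_inf ball.
    grad_fn stands in for the gradient of an adversarial loss."""
    x = x0.copy()
    for t in range(steps):
        g = grad_fn(x)
        # vertex of the distortion set most aligned with the gradient
        v = x0 + eps * np.sign(g)
        gamma = 2.0 / (t + 2)  # standard Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * v
    return x

# toy "loss" to ascend: f(x) = sum(x), whose gradient is all ones
x0 = np.zeros(3)
x_adv = frank_wolfe_attack(x0, lambda x: np.ones(3), eps=0.1)
```

Unlike projected-gradient attacks, no projection step is needed: feasibility of the perturbation is maintained by construction.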

Trace-Norm Adversarial Examples

no code implementations2 Jul 2020 Ehsan Kazemi, Thomas Kerdreux, Liqiang Wang

White box adversarial perturbations are sought via iterative optimization algorithms most often minimizing an adversarial loss on a $l_p$ neighborhood of the original image, the so-called distortion set.

Adversarial Robustness

Submodular Maximization in Clean Linear Time

no code implementations16 Jun 2020 Wenxin Li, Moran Feldman, Ehsan Kazemi, Amin Karbasi

In this paper, we provide the first deterministic algorithm that achieves the tight $1-1/e$ approximation guarantee for submodular maximization under a cardinality (size) constraint while making a number of queries that scales only linearly with the size of the ground set $n$.

Movie Recommendation Text Summarization +1
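For context, the classical greedy algorithm (sketched below on an assumed toy coverage function) already attains the tight $1-1/e$ guarantee for monotone objectives, but it spends $O(nk)$ oracle queries; the paper's contribution is a deterministic algorithm with the same guarantee using only $O(n)$ queries.

```python
def greedy(elements, f, k):
    """Classical greedy for monotone submodular maximization under a
    cardinality constraint: k rounds, each scanning all elements for
    the largest marginal gain, hence O(nk) value-oracle queries."""
    S = []
    for _ in range(k):
        best, best_gain = None, 0.0
        for e in elements:
            if e in S:
                continue
            gain = f(S + [e]) - f(S)
            if gain > best_gain:
                best, best_gain = e, gain
        if best is None:
            break
        S.append(best)
    return S

# toy coverage function as the submodular objective
sets = {1: {"a", "b", "c"}, 2: {"c", "d"}, 3: {"a"}}
cov = lambda S: len(set().union(*(sets[e] for e in S))) if S else 0
print(greedy([1, 2, 3], cov, k=2))
```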

Submodular Maximization Through Barrier Functions

no code implementations NeurIPS 2020 Ashwinkumar Badanidiyuru, Amin Karbasi, Ehsan Kazemi, Jan Vondrak

In this paper, we introduce a novel technique for constrained submodular maximization, inspired by barrier functions in continuous optimization.

Movie Recommendation

Regularized Submodular Maximization at Scale

no code implementations10 Feb 2020 Ehsan Kazemi, Shervin Minaee, Moran Feldman, Amin Karbasi

In this paper, we propose scalable methods for maximizing a regularized submodular function $f = g - \ell$ expressed as the difference between a monotone submodular function $g$ and a modular function $\ell$.

Data Summarization Point Processes +1
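The regularized setting can be made concrete with a toy sketch: $g$ a monotone submodular coverage function and $\ell$ a modular per-element cost, both assumed for illustration. Note that plain greedy on $f = g - \ell$, as below, carries no worst-case guarantee; the paper's scalable algorithms handle this objective with provable bounds.

```python
def regularized_greedy(elements, g, costs, k):
    """Toy greedy on f = g - l, where g is monotone submodular
    (coverage below) and l is modular (a per-element cost)."""
    S = []
    for _ in range(k):
        best, best_gain = None, 0.0
        for e in elements:
            if e in S:
                continue
            # marginal gain of g minus the modular cost of e
            gain = (g(S + [e]) - g(S)) - costs[e]
            if gain > best_gain:
                best, best_gain = e, gain
        if best is None:
            break
        S.append(best)
    return S

sets = {1: {"a", "b"}, 2: {"b", "c", "d"}, 3: {"d"}}
g = lambda S: len(set().union(*(sets[e] for e in S))) if S else 0
costs = {1: 0.5, 2: 1.0, 3: 2.0}
print(regularized_greedy([1, 2, 3], g, costs, k=2))
```

Because $f$ can be non-monotone and even negative, the cost term changes which elements are worth adding at all (here, element 3's cost exceeds any coverage it could add).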

Streaming Submodular Maximization under a $k$-Set System Constraint

1 code implementation9 Feb 2020 Ran Haba, Ehsan Kazemi, Moran Feldman, Amin Karbasi

In this paper, we propose a novel framework that converts streaming algorithms for monotone submodular maximization into streaming algorithms for non-monotone submodular maximization.

Data Summarization Movie Recommendation

Nektar++: enhancing the capability and application of high-fidelity spectral/$hp$ element methods

no code implementations8 Jun 2019 David Moxey, Chris D. Cantwell, Yan Bao, Andrea Cassinelli, Giacomo Castiglioni, Sehun Chun, Emilia Juda, Ehsan Kazemi, Kilian Lackhove, Julian Marcon, Gianmarco Mengaldo, Douglas Serson, Michael Turner, Hui Xu, Joaquim Peiró, Robert M. Kirby, Spencer J. Sherwin

Nektar++ is an open-source framework that provides a flexible, high-performance and scalable platform for the development of solvers for partial differential equations using the high-order spectral/$hp$ element method.

Mathematical Software Numerical Analysis Fluid Dynamics

Submodular Streaming in All its Glory: Tight Approximation, Minimum Memory and Low Adaptive Complexity

no code implementations2 May 2019 Ehsan Kazemi, Marko Mitrovic, Morteza Zadimoghaddam, Silvio Lattanzi, Amin Karbasi

We show how one can achieve the tight $(1/2)$-approximation guarantee with $O(k)$ shared memory while minimizing not only the required rounds of computations but also the total number of communicated bits.

Data Summarization

Adaptive Sequence Submodularity

1 code implementation NeurIPS 2019 Marko Mitrovic, Ehsan Kazemi, Moran Feldman, Andreas Krause, Amin Karbasi

In many machine learning applications, one needs to interactively select a sequence of items (e.g., recommending movies based on a user's feedback) or make sequential decisions in a certain order (e.g., guiding an agent through a series of states).

Decision Making Link Prediction +1

Asynchronous Delay-Aware Accelerated Proximal Coordinate Descent for Nonconvex Nonsmooth Problems

no code implementations5 Feb 2019 Ehsan Kazemi, Liqiang Wang

To the best of our knowledge, we are the first to provide stochastic and deterministic accelerated extensions of APCD algorithms for general nonconvex and nonsmooth problems, ensuring that, for both bounded and unbounded delays, every limit point is a critical point.

Scalable Deletion-Robust Submodular Maximization: Data Summarization with Privacy and Fairness Constraints

no code implementations ICML 2018 Ehsan Kazemi, Morteza Zadimoghaddam, Amin Karbasi

Can we efficiently extract useful information from a large user-generated dataset while protecting the privacy of the users and/or ensuring fairness in representation?

Data Summarization Fairness

Data Summarization at Scale: A Two-Stage Submodular Approach

no code implementations ICML 2018 Marko Mitrovic, Ehsan Kazemi, Morteza Zadimoghaddam, Amin Karbasi

The sheer scale of modern datasets has resulted in a dire need for summarization techniques that identify representative elements in a dataset.

Data Summarization

Comparison Based Learning from Weak Oracles

no code implementations20 Feb 2018 Ehsan Kazemi, Lin Chen, Sanjoy Dasgupta, Amin Karbasi

More specifically, we aim at devising efficient algorithms to locate a target object in a database equipped with a dissimilarity metric via invocation of the weak comparison oracle.

Do Less, Get More: Streaming Submodular Maximization with Subsampling

no code implementations NeurIPS 2018 Moran Feldman, Amin Karbasi, Ehsan Kazemi

In this paper, we develop the first one-pass streaming algorithm for submodular maximization that does not evaluate the entire stream even once.

Video Summarization

Deletion-Robust Submodular Maximization at Scale

no code implementations20 Nov 2017 Ehsan Kazemi, Morteza Zadimoghaddam, Amin Karbasi

Can we efficiently extract useful information from a large user-generated dataset while protecting the privacy of the users and/or ensuring fairness in representation?

Fairness
