no code implementations • ICML 2020 • Ran Haba, Ehsan Kazemi, Moran Feldman, Amin Karbasi
Moreover, we propose the first streaming algorithms for monotone submodular maximization subject to $k$-extendible and $k$-system constraints.
no code implementations • 17 Oct 2024 • Ehsan Kazemi, Iman Soltani
Autonomous navigation in marine environments with dynamic and static obstacles and strong flow disturbances, such as high-flow rivers, poses significant challenges for unmanned surface vehicles (USVs).
no code implementations • 22 Jul 2022 • Ehsan Kazemi
To alleviate this problem, semi-supervised learning (SSL) leverages the knowledge of the classifier on the labeled domain and extrapolates it to the unlabeled domain, which is assumed to have a distribution similar to that of the annotated data.
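A minimal sketch of the generic SSL idea the abstract alludes to (confidence-thresholded pseudo-labeling), not the paper's specific method; `model`, `optimizer`, and the batch variables are hypothetical placeholders.

```python
# Generic pseudo-labeling SSL step (illustrative only, not the paper's method).
import torch
import torch.nn.functional as F

def ssl_step(model, optimizer, labeled_batch, unlabeled_batch, threshold=0.95):
    x_l, y_l = labeled_batch            # annotated domain
    x_u = unlabeled_batch               # unlabeled domain, assumed similar distribution

    # Supervised loss on the labeled data.
    loss = F.cross_entropy(model(x_l), y_l)

    # Extrapolate the classifier to the unlabeled domain: keep confident pseudo-labels.
    with torch.no_grad():
        probs = F.softmax(model(x_u), dim=1)
        conf, pseudo = probs.max(dim=1)
        mask = conf >= threshold
    if mask.any():
        loss = loss + F.cross_entropy(model(x_u[mask]), pseudo[mask])

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```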
1 code implementation • 3 Dec 2021 • Bingbing Rao, Ehsan Kazemi, Yifan Ding, Devu M Shila, Frank M. Tucker, Liqiang Wang
Recently, data-driven inertial navigation approaches have demonstrated the capability of well-trained neural networks to obtain accurate position estimates from inertial measurement unit (IMU) measurements.
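A small sketch of the general data-driven inertial navigation setup the abstract describes: a network regresses a displacement from a window of IMU samples, and positions are accumulated by integration. The architecture and names are assumptions for illustration, not the paper's model.

```python
# Illustrative learned inertial odometry: map a window of 6-channel IMU samples
# (accelerometer + gyroscope) to a 2-D displacement, then accumulate positions.
import torch
import torch.nn as nn

class IMUDisplacementNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(6, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, 2),            # predicted (dx, dy) over the window
        )

    def forward(self, imu_window):       # imu_window: (batch, 6, window_length)
        return self.net(imu_window)

def integrate(model, windows, start=(0.0, 0.0)):
    # Accumulate predicted displacements into a position track.
    pos = torch.tensor(start)
    track = [pos.clone()]
    with torch.no_grad():
        for w in windows:                # each w: (1, 6, window_length)
            pos = pos + model(w).squeeze(0)
            track.append(pos.clone())
    return torch.stack(track)
```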
no code implementations • 6 Apr 2021 • Christopher Harshaw, Ehsan Kazemi, Moran Feldman, Amin Karbasi
We propose subsampling as a unified algorithmic technique for submodular maximization in centralized and online settings.
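A minimal sketch of the subsampling idea combined with greedy submodular maximization under a cardinality constraint (a "sample greedy" variant). The sampling probability `p` and helper names are illustrative assumptions; this is not the paper's exact algorithm or its guarantees.

```python
# Sketch: greedy selection under a cardinality constraint k, where each greedily
# chosen element is kept only with probability p (the subsampling step).
import random

def sample_greedy(f, ground_set, k, p=0.5, seed=0):
    rng = random.Random(seed)
    S, remaining = set(), set(ground_set)
    while len(S) < k and remaining:
        base = f(S)
        # Element with the largest marginal gain f(S + e) - f(S).
        best = max(remaining, key=lambda e: f(S | {e}) - base)
        remaining.discard(best)
        if rng.random() < p:             # keep the element only with probability p
            S.add(best)
    return S

# Toy example: f is the coverage of sets over a small universe.
sets = {0: {1, 2}, 1: {2, 3}, 2: {4}, 3: {1, 4, 5}}
def cover(S):
    return len(set().union(*(sets[i] for i in S))) if S else 0

print(sample_greedy(cover, sets.keys(), k=2))
```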
no code implementations • 15 Feb 2021 • Ehsan Kazemi, Thomas Kerdreux, Liqiang Wang
White-box adversarial perturbations are generated via iterative optimization algorithms, most often by minimizing an adversarial loss on an $\ell_p$ neighborhood of the original image, the so-called distortion set.
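A PGD-style sketch of the iterative procedure the abstract describes: gradient steps on an adversarial loss, followed by projection onto an $\ell_\infty$ ball around the original image. This is the standard baseline attack, not the method proposed in the paper; the hyperparameters are illustrative.

```python
# Standard PGD-style white-box perturbation: ascend the adversarial loss, then
# project back onto the l_inf distortion set around the original image.
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=10):
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)      # adversarial loss
        grad = torch.autograd.grad(loss, x_adv)[0]
        with torch.no_grad():
            x_adv = x_adv + alpha * grad.sign()
            # Projection onto the l_inf ball of radius eps, then the valid pixel range.
            x_adv = torch.clamp(x_adv, x - eps, x + eps).clamp(0, 1)
    return x_adv.detach()
```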
no code implementations • 1 Jan 2021 • Ehsan Kazemi, Mohamed E. Hussein, Wael AbdAlmageed
We propose an ensemble-based defense against adversarial examples using distance map layers (DMLs).
no code implementations • 2 Jul 2020 • Ehsan Kazemi, Thomas Kerdreux, Liqiang Wang
White-box adversarial perturbations are sought via iterative optimization algorithms, most often by minimizing an adversarial loss on an $\ell_p$ neighborhood of the original image, the so-called distortion set.
no code implementations • 16 Jun 2020 • Wenxin Li, Moran Feldman, Ehsan Kazemi, Amin Karbasi
In this paper, we provide the first deterministic algorithm that achieves the tight $1-1/e$ approximation guarantee for submodular maximization under a cardinality (size) constraint while making a number of queries that scales only linearly with the size of the ground set $n$.
1 code implementation • 15 Jun 2020 • Hongyan Chang, Ta Duy Nguyen, Sasi Kumar Murakonda, Ehsan Kazemi, Reza Shokri
Optimizing prediction accuracy can come at the expense of fairness.
no code implementations • 10 Feb 2020 • Ehsan Kazemi, Shervin Minaee, Moran Feldman, Amin Karbasi
In this paper, we propose scalable methods for maximizing a regularized submodular function $f = g - \ell$ expressed as the difference between a monotone submodular function $g$ and a modular function $\ell$.
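For context, a distorted-greedy-style sketch for maximizing $f = g - \ell$ under a cardinality constraint, where $g$ is monotone submodular and $\ell$ is a non-negative modular cost. The acceptance rule follows the known distorted-greedy idea and is illustrative only; the scalable methods proposed in this paper are different.

```python
# Distorted-greedy-style sketch for f = g - l under a cardinality constraint k.
def distorted_greedy(g, l, ground_set, k):
    S = set()
    for i in range(k):
        scale = (1 - 1.0 / k) ** (k - i - 1)
        base = g(S)

        # Distorted marginal gain: down-weight g's gain, pay l's cost in full.
        def gain(e):
            return scale * (g(S | {e}) - base) - l(e)

        best = max((e for e in ground_set if e not in S), key=gain, default=None)
        if best is not None and gain(best) > 0:
            S.add(best)
    return S
```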
no code implementations • NeurIPS 2020 • Ashwinkumar Badanidiyuru, Amin Karbasi, Ehsan Kazemi, Jan Vondrak
In this paper, we introduce a novel technique for constrained submodular maximization, inspired by barrier functions in continuous optimization.
1 code implementation • 9 Feb 2020 • Ran Haba, Ehsan Kazemi, Moran Feldman, Amin Karbasi
In this paper, we propose a novel framework that converts streaming algorithms for monotone submodular maximization into streaming algorithms for non-monotone submodular maximization.
no code implementations • 8 Jun 2019 • David Moxey, Chris D. Cantwell, Yan Bao, Andrea Cassinelli, Giacomo Castiglioni, Sehun Chun, Emilia Juda, Ehsan Kazemi, Kilian Lackhove, Julian Marcon, Gianmarco Mengaldo, Douglas Serson, Michael Turner, Hui Xu, Joaquim Peiró, Robert M. Kirby, Spencer J. Sherwin
Nektar++ is an open-source framework that provides a flexible, high-performance and scalable platform for the development of solvers for partial differential equations using the high-order spectral/$hp$ element method.
Mathematical Software • Numerical Analysis • Fluid Dynamics
no code implementations • 2 May 2019 • Ehsan Kazemi, Marko Mitrovic, Morteza Zadimoghaddam, Silvio Lattanzi, Amin Karbasi
We show how one can achieve the tight $(1/2)$-approximation guarantee with $O(k)$ shared memory while minimizing not only the required rounds of computation but also the total number of communicated bits.
1 code implementation • NeurIPS 2019 • Marko Mitrovic, Ehsan Kazemi, Moran Feldman, Andreas Krause, Amin Karbasi
In many machine learning applications, one needs to interactively select a sequence of items (e.g., recommending movies based on a user's feedback) or make sequential decisions in a certain order (e.g., guiding an agent through a series of states).
no code implementations • 5 Feb 2019 • Ehsan Kazemi, Liqiang Wang
To the best of our knowledge, we are the first to provide stochastic and deterministic accelerated extensions of APCD algorithms for general nonconvex and nonsmooth problems, ensuring that, for both bounded and unbounded delays, every limit point is a critical point.
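A serial sketch of the basic proximal coordinate-descent update underlying this line of work (assuming APCD refers to asynchronous proximal coordinate descent): one coordinate is updated per iteration by a gradient step followed by the proximal operator of a nonsmooth term, here taken to be an $\ell_1$ penalty. The asynchronous and accelerated machinery of the paper is omitted; names and step sizes are illustrative.

```python
# Serial proximal coordinate descent for min_x f(x) + lam * ||x||_1,
# with f smooth (possibly nonconvex). Illustrative only.
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1 penalty with weight t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_coordinate_descent(grad_f, x0, lam=0.1, step=0.1, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(iters):
        i = rng.integers(x.size)                 # pick one coordinate at random
        g_i = grad_f(x)[i]                       # partial derivative at x
        # Gradient step on coordinate i, then prox of the l1 term.
        x[i] = soft_threshold(x[i] - step * g_i, step * lam)
    return x
```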
no code implementations • 12 Nov 2018 • Soheil Ghili, Ehsan Kazemi, Amin Karbasi
How can we control for latent discrimination in predictive models?
no code implementations • ICML 2018 • Ehsan Kazemi, Morteza Zadimoghaddam, Amin Karbasi
Can we efficiently extract useful information from a large user-generated dataset while protecting the privacy of the users and/or ensuring fairness in representation?
no code implementations • ICML 2018 • Marko Mitrovic, Ehsan Kazemi, Morteza Zadimoghaddam, Amin Karbasi
The sheer scale of modern datasets has resulted in a dire need for summarization techniques that identify representative elements in a dataset.
no code implementations • NeurIPS 2018 • Moran Feldman, Amin Karbasi, Ehsan Kazemi
In this paper, we develop the first one-pass streaming algorithm for submodular maximization that does not evaluate the entire stream even once.
no code implementations • 20 Feb 2018 • Ehsan Kazemi, Lin Chen, Sanjoy Dasgupta, Amin Karbasi
More specifically, we aim to devise efficient algorithms that locate a target object in a database equipped with a dissimilarity metric by invoking a weak comparison oracle.
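An illustrative sketch of search with a comparison oracle that answers "is the target closer to u or to v?". The elimination scheme below is a plain tournament assuming a noiseless oracle, not the paper's algorithm for weak (noisy) oracles; `closer_to_target` is a hypothetical callback.

```python
# Tournament-style search using a (noiseless) comparison oracle.
def locate(objects, closer_to_target):
    """closer_to_target(u, v) -> whichever of u, v is closer to the hidden target."""
    candidates = list(objects)
    while len(candidates) > 1:
        survivors = []
        # Pair up candidates and keep the winner of each comparison.
        for u, v in zip(candidates[::2], candidates[1::2]):
            survivors.append(closer_to_target(u, v))
        if len(candidates) % 2 == 1:
            survivors.append(candidates[-1])
        candidates = survivors
    return candidates[0]

# Example with points on a line and a hidden target at 7.
target = 7
oracle = lambda u, v: u if abs(u - target) <= abs(v - target) else v
print(locate([1, 4, 6, 9, 12], oracle))   # -> 6
```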
no code implementations • 20 Nov 2017 • Ehsan Kazemi, Morteza Zadimoghaddam, Amin Karbasi
Can we efficiently extract useful information from a large user-generated dataset while protecting the privacy of the users and/or ensuring fairness in representation?