no code implementations • 25 May 2023 • Michael Kounavis, Ousmane Dia, Ilqar Ramazanli
We conclude that influence functions can be made practical, even for large-scale machine learning systems, and that influence values can be taken into account by algorithms that selectively remove training points as part of the learning process.
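A minimal, self-contained sketch of how influence values could be used to filter training points, assuming a ridge-regression model where the Hessian is available in closed form; the helper names and toy data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def fit_ridge(X, y, lam=1e-2):
    # Closed-form ridge regression fit.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def influence_scores(X, y, X_val, y_val, lam=1e-2):
    # Classic influence-function approximation for squared loss:
    # score_i = -grad_val^T H^{-1} grad_i, evaluated at the fitted parameters.
    # Large positive scores mark points whose removal is predicted to lower
    # the validation loss.
    theta = fit_ridge(X, y, lam)
    H = X.T @ X / len(X) + lam * np.eye(X.shape[1])            # empirical Hessian
    grad_val = X_val.T @ (X_val @ theta - y_val) / len(X_val)  # validation-loss gradient
    grads = (X @ theta - y)[:, None] * X                       # per-point training gradients
    return -grads @ np.linalg.solve(H, grad_val)

rng = np.random.default_rng(0)
X, w = rng.normal(size=(200, 5)), rng.normal(size=5)
y = X @ w + 0.1 * rng.normal(size=200)
y[:5] += 5.0                                  # a few corrupted labels
X_val = rng.normal(size=(50, 5))
y_val = X_val @ w

scores = influence_scores(X, y, X_val, y_val)
keep = scores <= np.quantile(scores, 0.95)    # drop the 5% most harmful points
theta_filtered = fit_ridge(X[keep], y[keep])
```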
no code implementations • 23 Mar 2022 • Ilqar Ramazanli
In the second, any two entries have different observation costs, regardless of whether they lie in the same column or in different columns.
no code implementations • 16 Mar 2022 • Ilqar Ramazanli
In this paper, we focus on adaptive matrix completion under a bounded type of noise.
no code implementations • 15 Mar 2022 • Ilqar Ramazanli
In this paper, we use the idea of the sparsity-number and propose a single-phase column space recovery algorithm, which can be extended to a two-phase exact matrix completion algorithm.
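The sparsity-number machinery itself is not reproduced here, but the two-phase structure described in the abstract (first recover the column space adaptively, then complete the remaining columns from a few observed entries) can be illustrated with a generic sketch; the least-squares residual test below is a simple stand-in for the paper's independence criterion, and all names are assumptions.

```python
import numpy as np

def adaptive_complete(M, r, oversample=2, seed=0):
    # Generic two-phase sketch: sample a few entries per column, fully observe
    # columns that look independent of the current basis (column space recovery),
    # and reconstruct all other columns by least squares on the sampled rows.
    rng = np.random.default_rng(seed)
    m, n = M.shape
    k = min(m, oversample * r)               # entries observed per column
    basis = np.zeros((m, 0))                 # fully observed basis columns
    M_hat = np.zeros((m, n))
    for j in range(n):
        rows = rng.choice(m, size=k, replace=False)
        obs = M[rows, j]                     # partial observation of column j
        if basis.shape[1] > 0:
            coef, *_ = np.linalg.lstsq(basis[rows], obs, rcond=None)
            if np.linalg.norm(basis[rows] @ coef - obs) < 1e-8:
                M_hat[:, j] = basis @ coef   # column explained by current basis
                continue
        M_hat[:, j] = M[:, j]                # novel direction: observe the whole column
        basis = np.column_stack([basis, M[:, j]])
    return M_hat

# Toy rank-3 example: the sketch recovers the matrix from partial observations.
rng = np.random.default_rng(1)
M = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 40))
print(np.allclose(adaptive_complete(M, r=3), M))
```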
no code implementations • 1 Mar 2022 • Ilqar Ramazanli
We study the distribution regression problem assuming the distribution of distributions has a doubling measure larger than one.
no code implementations • 20 Feb 2020 • Ilqar Ramazanli, Han Nguyen, Hai Pham, Sashank J. Reddi, Barnabas Poczos
This often makes the convergence rate depend on the maximum Lipschitz constant of the gradients across the devices.
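A toy illustration of that dependence, assuming a local-update scheme on per-device quadratics f_i(x) = 0.5 * L_i * x^2; the function name, constants, and step-size choices are hypothetical and this is not the paper's proposed method.

```python
import numpy as np

def local_gd(lipschitz, step, rounds=50, local_steps=10, x0=1.0):
    # Each device runs local gradient steps on its own f_i(x) = 0.5 * L_i * x^2,
    # then the iterates are averaged (periodic averaging across devices).
    x = x0
    for _ in range(rounds):
        local_iterates = []
        for L in lipschitz:
            xi = x
            for _ in range(local_steps):
                xi -= step * L * xi          # local gradient step on a device with constant L
            local_iterates.append(xi)
        x = float(np.mean(local_iterates))
    return abs(x)                            # distance to the shared optimum x* = 0

L = [1.0, 100.0]                             # heterogeneous smoothness across devices
print(local_gd(L, step=1.9 / max(L)))        # step set by the max constant: converges
print(local_gd(L, step=1.9 / np.mean(L)))    # step set by the average constant: blows up
```

Tuning the step size to the average constant lets the smoothest device stay stable while the device with the largest constant diverges during its local updates, which is one way the maximum Lipschitz constant can enter the analysis.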
no code implementations • 6 Feb 2020 • Ilqar Ramazanli, Barnabas Poczos
We study the problem of exact completion of an $m \times n$ matrix of rank $r$ using an adaptive sampling method.