Search Results for author: Jiajin Li

Found 22 papers, 10 papers with code

Spurious Stationarity and Hardness Results for Mirror Descent

no code implementations • 11 Apr 2024 • He Chen, Jiajin Li, Anthony Man-Cho So

Despite the considerable success of Bregman proximal-type algorithms, such as mirror descent, in machine learning, a critical question remains: Can existing stationarity measures, often based on Bregman divergence, reliably distinguish between stationary and non-stationary points?
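As background for this stationarity question, the update rule of entropic mirror descent (mirror descent with the KL divergence as the Bregman distance, over the probability simplex) can be sketched as follows. This is a generic textbook scheme, not the paper's construction; all names and parameters are illustrative:

```python
import numpy as np

def mirror_descent_simplex(grad, x0, steps=500, lr=0.5):
    """Entropic mirror descent on the probability simplex.

    With the KL divergence as the Bregman distance, each update is a
    multiplicative (exponentiated-gradient) step followed by
    renormalization back onto the simplex.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(-lr * grad(x))
        x /= x.sum()
    return x

# Minimize <c, x> over the simplex; the optimum puts all mass on argmin(c).
c = np.array([3.0, 1.0, 2.0])
x_star = mirror_descent_simplex(lambda x: c, np.ones(3) / 3)
```

For a linear objective the iterates concentrate exponentially fast on the best coordinate, which makes this a convenient sanity check.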

Automatic Outlier Rectification via Optimal Transport

no code implementations • 21 Mar 2024 • Jose Blanchet, Jiajin Li, Markus Pelger, Greg Zanotti

In this paper, we propose a novel conceptual framework to detect outliers using optimal transport with a concave cost function.
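For intuition on why a concave transport cost matters, the toy computation below solves a tiny exact OT problem by enumerating permutation couplings (exact for uniform discrete marginals of equal size, by Birkhoff's theorem). The function names are illustrative and this is not the paper's estimator:

```python
import itertools

def ot_cost(x, y, cost):
    """Exact OT cost between two uniform discrete distributions of equal
    size n: an optimal coupling is attained at a permutation, so a
    brute-force search over permutations is exact (for small n)."""
    n = len(x)
    return min(
        sum(cost(abs(x[i] - y[p[i]])) for i in range(n)) / n
        for p in itertools.permutations(range(n))
    )

# A concave cost makes one long move plus one free move cheaper than two
# medium moves, so the transport concentrates on a few (outlier-like) points.
linear = ot_cost([0.0, 1.0], [1.0, 2.0], lambda d: d)          # prefers 0->1, 1->2
concave = ot_cost([0.0, 1.0], [1.0, 2.0], lambda d: d ** 0.5)  # prefers 1->1, 0->2
```

Under the linear cost both matchings tie at 1.0; under the square-root cost the crossing matching (one zero move, one long move) is strictly cheaper.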

Outlier Detection

Unifying Distributionally Robust Optimization via Optimal Transport Theory

no code implementations • 10 Aug 2023 • Jose Blanchet, Daniel Kuhn, Jiajin Li, Bahar Taskesen

In the past few years, there has been considerable interest in two prominent approaches for Distributionally Robust Optimization (DRO): Divergence-based and Wasserstein-based methods.

A Convergent Single-Loop Algorithm for Relaxation of Gromov-Wasserstein in Graph Data

2 code implementations • 12 Mar 2023 • Jiajin Li, Jianheng Tang, Lemin Kong, Huikang Liu, Jia Li, Anthony Man-Cho So, Jose Blanchet

This observation allows us to provide an approximation bound for the distance between the fixed-point set of BAPG and the critical point set of GW.

Computational Efficiency

Outlier-Robust Gromov-Wasserstein for Graph Data

1 code implementation • NeurIPS 2023 • Lemin Kong, Jiajin Li, Jianheng Tang, Anthony Man-Cho So

Gromov-Wasserstein (GW) distance is a powerful tool for comparing and aligning probability distributions supported on different metric spaces.
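The GW objective behind this line of work can be evaluated for a given coupling with a few matrix products. The sketch below uses the standard squared-difference cost and illustrative names; it computes the discrepancy of a coupling T between pairwise-distance matrices C1 and C2, not the paper's outlier-robust variant:

```python
import numpy as np

def gw_discrepancy(C1, C2, T):
    """Gromov-Wasserstein discrepancy of a coupling T between two
    metric-measure spaces with pairwise-distance matrices C1 and C2:
    sum_{i,j,k,l} (C1[i,k] - C2[j,l])**2 * T[i,j] * T[k,l],
    expanded into matrix products to avoid the explicit 4-way loop."""
    p, q = T.sum(axis=1), T.sum(axis=0)   # marginals of the coupling
    term1 = (C1 ** 2 @ p) @ p
    term2 = (C2 ** 2 @ q) @ q
    term3 = 2.0 * np.sum((C1 @ T @ C2.T) * T)
    return term1 + term2 - term3

# Identical spaces matched by the identity coupling have zero discrepancy.
C = np.abs(np.subtract.outer([0.0, 1.0, 3.0], [0.0, 1.0, 3.0]))
T_id = np.eye(3) / 3.0
```

The expansion follows from (a - b)^2 = a^2 + b^2 - 2ab applied term-wise, which is also why GW solvers only ever need matrix-product evaluations of the objective.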

Graph Learning

Robust Attributed Graph Alignment via Joint Structure Learning and Optimal Transport

1 code implementation • 30 Jan 2023 • Jianheng Tang, Weiqi Zhang, Jiajin Li, Kangfei Zhao, Fugee Tsung, Jia Li

As the graphs to be aligned are usually constructed from different sources, the inconsistency issues of structures and features between two graphs are ubiquitous in real-world applications.

Graph Embedding

Tikhonov Regularization is Optimal Transport Robust under Martingale Constraints

no code implementations • 4 Oct 2022 • Jiajin Li, Sirui Lin, Jose Blanchet, Viet Anh Nguyen

Distributionally robust optimization has been shown to offer a principled way to regularize learning models.
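As a reminder of the regularization side of this correspondence, ridge regression is the textbook instance of Tikhonov regularization and admits a closed form. This is a generic sketch with illustrative names, not the paper's DRO reformulation:

```python
import numpy as np

def ridge(X, y, lam):
    """Tikhonov-regularized least squares (ridge regression):
    argmin_w ||X w - y||^2 + lam * ||w||^2, via the normal equations."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Noiseless data: lam = 0 recovers the true weights exactly; a larger
# lam shrinks the solution toward zero.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
w_true = np.array([2.0, -1.0])
y = X @ w_true
```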

Nonsmooth Nonconvex-Nonconcave Minimax Optimization: Primal-Dual Balancing and Iteration Complexity Analysis

no code implementations • 22 Sep 2022 • Jiajin Li, Linglingzhi Zhu, Anthony Man-Cho So

Specifically, we consider the setting where the primal function has a nonsmooth composite structure and the dual function possesses the Kurdyka-Lojasiewicz (KL) property with exponent $\theta \in [0, 1)$.

Rethinking Graph Neural Networks for Anomaly Detection

1 code implementation • 31 May 2022 • Jianheng Tang, Jiajin Li, Ziqi Gao, Jia Li

Graph Neural Networks (GNNs) are widely applied for graph anomaly detection.

Graph Anomaly Detection

Fast and Provably Convergent Algorithms for Gromov-Wasserstein in Graph Data

no code implementations • 17 May 2022 • Jiajin Li, Jianheng Tang, Lemin Kong, Huikang Liu, Jia Li, Anthony Man-Cho So, Jose Blanchet

In this paper, we study the design and analysis of a class of efficient algorithms for computing the Gromov-Wasserstein (GW) distance tailored to large-scale graph learning tasks.

Graph Learning

Learning Proximal Operators to Discover Multiple Optima

1 code implementation • 28 Jan 2022 • Lingxiao Li, Noam Aigerman, Vladimir G. Kim, Jiajin Li, Kristjan Greenewald, Mikhail Yurochkin, Justin Solomon

We present an end-to-end method to learn the proximal operator of a family of training problems so that multiple local minima can be quickly obtained from initial guesses by iterating the learned operator, emulating the proximal-point algorithm that has fast convergence.
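The proximal-point iteration being emulated can be sketched on a one-dimensional quadratic, where the proximal operator has a closed form. This is illustrative only, not the learned operator from the paper:

```python
def prox_quadratic(a, x, step):
    """Proximal operator of f(w) = 0.5 * (w - a)**2:
    argmin_w f(w) + (1 / (2 * step)) * (w - x)**2, in closed form."""
    return (x + step * a) / (1.0 + step)

def proximal_point(a, x0, step=1.0, iters=50):
    """Classical proximal-point iteration x_{k+1} = prox_{step * f}(x_k);
    for this quadratic, each step contracts the distance to the minimizer."""
    x = x0
    for _ in range(iters):
        x = prox_quadratic(a, x, step)
    return x
```

With step = 1 the distance to the minimizer a halves per iteration, which is the fast (linear) convergence the proximal-point method is known for on well-conditioned problems.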

Object Detection

Modified Frank Wolfe in Probability Space

no code implementations • NeurIPS 2021 • Carson Kent, Jiajin Li, Jose Blanchet, Peter W. Glynn

We propose a novel Frank-Wolfe (FW) procedure for the optimization of infinite-dimensional functionals of probability measures, a task which arises naturally in a wide range of areas including statistical learning (e.g., variational inference) and artificial intelligence (e.g., generative adversarial networks).
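In the finite-dimensional prototype that this generalizes, each FW step calls a linear minimization oracle and mixes its output into the iterate with an open-loop step size. A minimal simplex sketch with illustrative names:

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, iters=2000):
    """Frank-Wolfe over the probability simplex: the linear minimization
    oracle returns the vertex (coordinate) with the smallest gradient
    entry, so the iterate stays a convex combination of vertices."""
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0           # LMO solution: best vertex
        gamma = 2.0 / (k + 2.0)         # classical open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x

# Minimize ||x - t||^2 over the simplex, where t is already feasible.
t = np.array([0.2, 0.5, 0.3])
x_hat = frank_wolfe_simplex(lambda x: 2.0 * (x - t), np.array([1.0, 0.0, 0.0]))
```

Because updates are convex combinations, feasibility is maintained for free; the O(1/k) rate of the 2/(k+2) schedule is visible in how slowly the last digits settle.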

Variational Inference

Deconvolutional Networks on Graph Data

no code implementations • NeurIPS 2021 • Jia Li, Jiajin Li, Yang Liu, Jianwei Yu, Yueting Li, Hong Cheng

In this paper, we consider an inverse problem in the graph learning domain: "given the graph representations smoothed by Graph Convolutional Network (GCN), how can we reconstruct the input graph signal?"

Graph Learning, Imputation

Fast Epigraphical Projection-based Incremental Algorithms for Wasserstein Distributionally Robust Support Vector Machine

1 code implementation • NeurIPS 2020 • Jiajin Li, Caihua Chen, Anthony Man-Cho So

In this paper, we focus on a family of Wasserstein distributionally robust support vector machine (DRSVM) problems and propose two novel epigraphical projection-based incremental algorithms to solve them.

Dirichlet Graph Variational Autoencoder

1 code implementation • NeurIPS 2020 • Jia Li, Tomasyu Yu, Jiajin Li, Honglei Zhang, Kangfei Zhao, Yu Rong, Hong Cheng, Junzhou Huang

In this work, we present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.

Clustering, Graph Clustering +1

Understanding Notions of Stationarity in Non-Smooth Optimization

no code implementations • 26 Jun 2020 • Jiajin Li, Anthony Man-Cho So, Wing-Kin Ma

Many contemporary applications in signal processing and machine learning give rise to structured non-convex non-smooth optimization problems that can often be tackled by simple iterative methods quite effectively.
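A canonical "simple iterative method" in this setting is the subgradient method with diminishing step sizes. The sketch below is a convex one-dimensional illustration with made-up names, not an example from the paper; it converges to the non-smooth minimizer even though the objective is not differentiable there:

```python
def subgradient_method(subgrad, x0, iters=2000):
    """Subgradient method with diminishing step sizes 1/k, the basic
    iterative scheme for non-smooth objectives where the gradient may
    not exist at the points of interest."""
    x = x0
    for k in range(1, iters + 1):
        x -= (1.0 / k) * subgrad(x)
    return x

# Minimize f(x) = |x - 2|; any sign of (x - 2) is a valid subgradient.
x_hat = subgradient_method(
    lambda x: 1.0 if x > 2 else (-1.0 if x < 2 else 0.0), 4.0
)
```

The iterates oscillate around the kink at x = 2 with amplitude bounded by the step size, so the diminishing schedule (summable squares, non-summable sum) is what drives convergence.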

The Gambler's Problem and Beyond

no code implementations • ICLR 2020 • Baoxiang Wang, Shuai Li, Jiajin Li, Siu On Chan

We analyze the Gambler's problem, a simple reinforcement learning problem where the gambler has the chance to double or lose their bets until the target is reached.
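The Gambler's problem has a standard value-iteration treatment (as in Sutton and Barto's textbook example). The sketch below uses a subfair coin with illustrative parameters; for a heads probability below 1/2, bold play is optimal and the value at half the target equals the heads probability:

```python
import numpy as np

def gambler_value_iteration(p_heads=0.4, target=100, tol=1e-6):
    """Value iteration for the Gambler's problem: from capital s the
    gambler stakes a in {1, ..., min(s, target - s)}; heads wins the
    stake, tails loses it, and reaching the target yields reward 1."""
    V = np.zeros(target + 1)
    V[target] = 1.0
    while True:
        delta = 0.0
        for s in range(1, target):
            best = max(
                p_heads * V[s + a] + (1.0 - p_heads) * V[s - a]
                for a in range(1, min(s, target - s) + 1)
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best  # in-place (Gauss-Seidel) update
        if delta < tol:
            return V

V = gambler_value_iteration()
```

With p_heads = 0.4 the value at capital 50 is 0.4: staking everything reaches the target in one flip, and no strategy with a subfair coin does better.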

Q-Learning, Reinforcement Learning +1

A First-Order Algorithmic Framework for Distributionally Robust Logistic Regression

1 code implementation • NeurIPS 2019 • Jiajin Li, Sen Huang, Anthony Man-Cho So

In this paper, we take a first step towards resolving the above difficulty by developing a first-order algorithmic framework for tackling a class of Wasserstein distance-based distributionally robust logistic regression (DRLR) problems.

Regression

A First-Order Algorithmic Framework for Wasserstein Distributionally Robust Logistic Regression

1 code implementation • 28 Oct 2019 • Jiajin Li, Sen Huang, Anthony Man-Cho So

In this paper, we take a first step towards resolving the above difficulty by developing a first-order algorithmic framework for tackling a class of Wasserstein distance-based distributionally robust logistic regression (DRLR) problems.

Regression

Policy Optimization with Second-Order Advantage Information

1 code implementation • 9 May 2018 • Jiajin Li, Baoxiang Wang

Policy optimization on high-dimensional continuous control tasks is difficult because of the large variance of policy gradient estimators.
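One standard way to reduce that variance is a baseline in the score-function (REINFORCE) estimator. The sketch below compares per-sample gradient estimates with and without a mean-reward baseline on a toy Gaussian policy; the policy, reward, and names are all illustrative, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)

def reinforce_grad_samples(theta, n=20000, baseline=False):
    """Per-sample score-function (REINFORCE) gradient estimates for a
    1-D Gaussian policy a ~ N(theta, 1) with reward r(a) = -(a - 3)**2.
    The score is grad_theta log pi(a) = (a - theta); subtracting a
    baseline from the reward leaves the estimator's mean (essentially)
    unchanged but can shrink its variance substantially."""
    a = rng.normal(theta, 1.0, size=n)
    r = -(a - 3.0) ** 2
    b = r.mean() if baseline else 0.0
    return (r - b) * (a - theta)

g_plain = reinforce_grad_samples(0.0, baseline=False)
g_base = reinforce_grad_samples(0.0, baseline=True)
```

At theta = 0 the true gradient of the expected reward is 6; both estimators center on it, but the baselined one does so with markedly lower variance.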

Continuous Control
