Search Results for author: Lingxiao Li

Found 14 papers, 9 papers with code

Debiased Distribution Compression

no code implementations 18 Apr 2024 Lingxiao Li, Raaz Dwivedi, Lester Mackey

Modern compression methods can summarize a target distribution $\mathbb{P}$ more succinctly than i.i.d. sampling.

LLsM: Generative Linguistic Steganography with Large Language Model

no code implementations 28 Jan 2024 Yihao Wang, Ruiqi Song, Ru Zhang, Jianyi Liu, Lingxiao Li

Regarding open-source LLMs, we reconstruct the LLM's token generator into a "stego generator" so that it can control the generation of the stego text based on the secret.

Tasks: Language Modelling, Large Language Model, +2
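The general idea of secret-controlled token selection can be sketched with a classic bins-based scheme: at each generation step, the candidate next tokens are partitioned into bins and the secret bits select which bin to sample from. The sketch below is an illustration of that generic idea, not LLsM's actual construction; the `candidates` lists are hypothetical stand-ins for an LLM's top-k outputs.

```python
def embed_bits(candidates, secret_bits, bits_per_step=1):
    """Pick one token per step; each choice encodes `bits_per_step` secret bits.

    candidates: list of per-step token lists (hypothetical model top-k
                outputs, each sorted by probability).
    secret_bits: string of '0'/'1' characters to hide.
    """
    stego, i = [], 0
    for step_tokens in candidates:
        if i >= len(secret_bits):
            stego.append(step_tokens[0])  # payload exhausted: greedy pick
            continue
        chunk = secret_bits[i:i + bits_per_step].ljust(bits_per_step, "0")
        index = int(chunk, 2)             # bin index chosen by the secret
        stego.append(step_tokens[index])
        i += bits_per_step
    return stego


def extract_bits(candidates, stego, bits_per_step=1):
    """Recover the bits from the chosen token's index at each step."""
    bits = []
    for step_tokens, tok in zip(candidates, stego):
        bits.append(format(step_tokens.index(tok), f"0{bits_per_step}b"))
    return "".join(bits)
```

The receiver needs the same candidate lists (i.e. the same model and context) to invert the mapping, which is why such schemes assume a shared generator on both ends.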

Monitoring and Adapting ML Models on Mobile Devices

no code implementations 12 May 2023 Wei Hao, Zixi Wang, Lauren Hong, Lingxiao Li, Nader Karayanni, Chengzhi Mao, Junfeng Yang, Asaf Cidon

ML models are increasingly being pushed to mobile devices, for low-latency inference and offline operation.

Sampling with Mollified Interaction Energy Descent

2 code implementations 24 Oct 2022 Lingxiao Li, Qiang Liu, Anna Korba, Mikhail Yurochkin, Justin Solomon

These energies rely on mollifier functions -- smooth approximations of the Dirac delta originating from PDE theory.
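The mollifier concept can be checked numerically. The Gaussian mollifier below is a standard textbook example of the general idea (not necessarily the specific mollifiers used in the paper): it integrates to one for every bandwidth eps > 0 and concentrates at the origin as eps shrinks, approximating the Dirac delta.

```python
import math

def gaussian_mollifier(x, eps):
    """Gaussian approximation of the Dirac delta: unit mass for any
    eps > 0, with a peak that sharpens like 1/eps as eps -> 0."""
    return math.exp(-x * x / (2 * eps * eps)) / (eps * math.sqrt(2 * math.pi))

def trapezoid_integral(f, a, b, n=20000):
    """Simple trapezoid-rule quadrature on [a, b]."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Total mass stays ~1 while the peak value grows as eps shrinks.
for eps in (1.0, 0.1, 0.01):
    mass = trapezoid_integral(lambda x: gaussian_mollifier(x, eps), -5, 5)
    peak = gaussian_mollifier(0.0, eps)
```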

Learning Proximal Operators to Discover Multiple Optima

1 code implementation 28 Jan 2022 Lingxiao Li, Noam Aigerman, Vladimir G. Kim, Jiajin Li, Kristjan Greenewald, Mikhail Yurochkin, Justin Solomon

We present an end-to-end method to learn the proximal operator of a family of training problems, so that multiple local minima can be quickly obtained from initial guesses by iterating the learned operator, emulating the proximal-point algorithm and its fast convergence.

Tasks: Object Detection
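The proximal-point iteration that the learned operator emulates can be illustrated on a toy problem where the proximal operator has a closed form. For f(x) = x²/2 one can verify that prox_{λf}(v) = v / (1 + λ), and iterating it converges to the minimizer. This is a minimal sketch of the classical algorithm, not the learned operator from the paper.

```python
def prox_quadratic(v, lam):
    """Proximal operator of f(x) = 0.5 * x**2, in closed form:
    argmin_x [ 0.5*x**2 + (1/(2*lam)) * (x - v)**2 ] = v / (1 + lam)."""
    return v / (1.0 + lam)

def proximal_point(x0, lam=1.0, steps=50):
    """Proximal-point algorithm: repeatedly apply the prox operator.
    Each step contracts toward the minimizer x* = 0 by a factor 1/(1+lam)."""
    x = x0
    for _ in range(steps):
        x = prox_quadratic(x, lam)
    return x
```

The paper's method amortizes this iteration: a network trained to approximate the prox map can be iterated from many initial guesses to reach distinct local minima of harder, non-convex problems.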

Wasserstein Iterative Networks for Barycenter Estimation

1 code implementation 28 Jan 2022 Alexander Korotin, Vage Egiazarian, Lingxiao Li, Evgeny Burnaev

Wasserstein barycenters have become popular due to their ability to represent the average of probability measures in a geometrically meaningful way.
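As a concrete special case of this geometric average, the Wasserstein-2 barycenter of one-dimensional Gaussians has a well-known closed form: for N(m_i, s_i²) with weights w_i it is N(Σ w_i m_i, (Σ w_i s_i)²). The snippet below illustrates that fact; the paper's iterative networks target the much harder general case where no closed form exists.

```python
def gaussian_w2_barycenter_1d(means, stds, weights):
    """Closed-form Wasserstein-2 barycenter of 1-D Gaussians N(m_i, s_i^2):
    the barycenter is Gaussian with mean sum(w_i * m_i) and standard
    deviation sum(w_i * s_i)."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    mean = sum(w * m for w, m in zip(weights, means))
    std = sum(w * s for w, s in zip(weights, stds))
    return mean, std

# Equal-weight barycenter of N(0, 1) and N(4, 9): N(2, 4), i.e. std 2.
bary_mean, bary_std = gaussian_w2_barycenter_1d([0.0, 4.0], [1.0, 3.0], [0.5, 0.5])
```

Note how the standard deviations average linearly (an interpolation of shapes), whereas a Euclidean mixture of the two densities would be bimodal: this is what "geometrically meaningful" buys.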

Do Neural Optimal Transport Solvers Work? A Continuous Wasserstein-2 Benchmark

6 code implementations NeurIPS 2021 Alexander Korotin, Lingxiao Li, Aude Genevay, Justin Solomon, Alexander Filippov, Evgeny Burnaev

Despite the recent popularity of neural network-based solvers for optimal transport (OT), there is no standard quantitative way to evaluate their performance.

Tasks: Image Generation

Large-Scale Wasserstein Gradient Flows

3 code implementations NeurIPS 2021 Petr Mokrov, Alexander Korotin, Lingxiao Li, Aude Genevay, Justin Solomon, Evgeny Burnaev

Specifically, Fokker-Planck equations, which model the diffusion of probability measures, can be understood as gradient descent over entropy functionals in Wasserstein space.
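One classical way to see this connection numerically is Langevin dynamics: the particle distribution of dX = -∇V(X) dt + √2 dW evolves by the Fokker-Planck equation, i.e. the Wasserstein gradient flow of the free energy ∫V dρ + ∫ρ log ρ. The Euler-Maruyama sketch below is that textbook scheme, not the paper's ICNN-based method; with V(x) = x²/2 the stationary law is N(0, 1).

```python
import math
import random

def langevin_samples(grad_v, n=2000, steps=300, dt=0.01, seed=0):
    """Euler-Maruyama discretization of dX = -grad V(X) dt + sqrt(2) dW.
    Returns n particle positions after `steps` steps; their empirical
    distribution approximates the Gibbs measure proportional to exp(-V)."""
    rng = random.Random(seed)
    xs = [rng.gauss(0, 3) for _ in range(n)]  # spread-out initialization
    for _ in range(steps):
        xs = [x - grad_v(x) * dt + math.sqrt(2 * dt) * rng.gauss(0, 1)
              for x in xs]
    return xs

# V(x) = x^2 / 2, so grad V(x) = x; stationary distribution is N(0, 1).
samples = langevin_samples(lambda x: x)
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
```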

Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization

2 code implementations ICLR 2021 Alexander Korotin, Lingxiao Li, Justin Solomon, Evgeny Burnaev

Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport.

Continuous Regularized Wasserstein Barycenters

1 code implementation NeurIPS 2020 Lingxiao Li, Aude Genevay, Mikhail Yurochkin, Justin Solomon

Leveraging a new dual formulation for the regularized Wasserstein barycenter problem, we introduce a stochastic algorithm that constructs a continuous approximation of the barycenter.

Comparing Alternative Route Planning Techniques: A Comparative User Study on Melbourne, Dhaka and Copenhagen Road Networks

no code implementations 15 Jun 2020 Lingxiao Li, Muhammad Aamir Cheema, Hua Lu, Mohammed Eunus Ali, Adel N. Toosi

Motivated by this, we present a user study conducted on the road networks of Melbourne, Dhaka and Copenhagen that compares the quality (as perceived by users) of the alternative routes generated by four of the most popular existing approaches, including the routes provided by Google Maps.

SentPWNet: A Unified Sentence Pair Weighting Network for Task-specific Sentence Embedding

no code implementations 22 May 2020 Li Zhang, Han Wang, Lingxiao Li

Our model, SentPWNet, exploits the spatial distribution of each sentence's neighbors as a locality weight that indicates how informative each sentence pair is.

Tasks: Metric Learning, Sentence, +3

Supervised Fitting of Geometric Primitives to 3D Point Clouds

2 code implementations CVPR 2019 Lingxiao Li, Minhyuk Sung, Anastasia Dubrovina, Li Yi, Leonidas Guibas

Fitting geometric primitives to 3D point cloud data bridges a gap between low-level digitized 3D data and high-level structural information on the underlying 3D shapes.

Tasks: Shape Representation Of 3D Point Clouds
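A toy instance of primitive fitting is the least-squares fit of a single plane z = ax + by + c to a point cloud; the snippet below solves the 3x3 normal equations by Gaussian elimination. It is a classical baseline for one primitive type, not the supervised network from the paper, which predicts multiple primitive types and their parameters jointly.

```python
def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to 3-D points.
    Builds the normal equations (A^T A) s = A^T z for rows [x, y, 1]
    and solves the 3x3 system; returns (a, b, c)."""
    M = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0, 0.0, 0.0]
    for x, y, z in points:
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                M[i][j] += row[i] * row[j]
            rhs[i] += row[i] * z
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for j in range(col, 3):
                M[r][j] -= f * M[col][j]
            rhs[r] -= f * rhs[col]
    # Back substitution.
    sol = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        sol[i] = (rhs[i] - sum(M[i][j] * sol[j] for j in range(i + 1, 3))) / M[i][i]
    return tuple(sol)

# Points sampled exactly from z = 2x - y + 3 are recovered exactly.
pts = [(x, y, 2 * x - y + 3) for x in range(3) for y in range(3)]
```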
