Search Results for author: Li-Ping Liu

Found 28 papers, 12 papers with code

Reason out Your Layout: Evoking the Layout Master from Large Language Models for Text-to-Image Synthesis

no code implementations • 28 Nov 2023 • Xiaohui Chen, Yongfei Liu, Yingxiang Yang, Jianbo Yuan, Quanzeng You, Li-Ping Liu, Hongxia Yang

Recent advancements in text-to-image (T2I) generative models have shown remarkable capabilities in producing diverse and imaginative visuals based on text prompts.

Image Generation

Bayesian Conditional Diffusion Models for Versatile Spatiotemporal Turbulence Generation

no code implementations • 14 Nov 2023 • Han Gao, Xu Han, Xiantao Fan, Luning Sun, Li-Ping Liu, Lian Duan, Jian-Xun Wang

A notable feature of our approach is the method proposed for long-span flow sequence generation, which is based on autoregressive gradient-based conditional sampling, eliminating the need for cumbersome retraining processes.

EDGE++: Improved Training and Sampling of EDGE

no code implementations • 22 Oct 2023 • Mingyang Wu, Xiaohui Chen, Li-Ping Liu

Recently developed deep neural models such as NetGAN, CELL, and Variational Graph Autoencoders have made progress but face limitations in replicating key graph statistics when generating large graphs.

Computational Efficiency • Denoising • +1

On Separate Normalization in Self-supervised Transformers

1 code implementation • NeurIPS 2023 • Xiaohui Chen, Yinkai Wang, Yuanqi Du, Soha Hassoun, Li-Ping Liu

Self-supervised training methods for transformers have demonstrated remarkable performance across various domains.

Kriging Convolutional Networks

1 code implementation • 15 Jun 2023 • Gabriel Appleby, Linfeng Liu, Li-Ping Liu

Spatial interpolation is a class of estimation problems where locations with known values are used to estimate values at other locations, with an emphasis on harnessing spatial locality and trends.

Spatial Interpolation
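
Purely as background for the snippet above, the following is a minimal sketch of one classical spatial-interpolation baseline, inverse distance weighting, which estimates each query location as a distance-weighted average of observed values. This is not the Kriging Convolutional Network from the paper, and all names are illustrative.

```python
import numpy as np

def idw_interpolate(known_coords, known_values, query_coords, power=2.0, eps=1e-12):
    """Inverse distance weighting: a classic spatial-interpolation baseline.

    known_coords: (n, d) locations with observed values
    known_values: (n,) observed values
    query_coords: (m, d) locations to estimate
    """
    # Pairwise distances between query points and known points: shape (m, n)
    dists = np.linalg.norm(query_coords[:, None, :] - known_coords[None, :, :], axis=-1)
    # Closer observations receive larger weights; eps avoids division by zero
    weights = 1.0 / (dists ** power + eps)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ known_values

# Toy usage: interpolate a value at (0.5, 0.5) from three observed locations
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
values = np.array([1.0, 2.0, 3.0])
print(idw_interpolate(coords, values, np.array([[0.5, 0.5]])))
```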

Graph-Based Model-Agnostic Data Subsampling for Recommendation Systems

no code implementations • 25 May 2023 • Xiaohui Chen, Jiankai Sun, Taiqing Wang, Ruocheng Guo, Li-Ping Liu, Aonan Zhang

Most subsampling methods are model-based and often require a pre-trained pilot model to measure data importance via, e.g., sample hardness.

Recommendation Systems
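
For contrast with the model-agnostic approach the snippet describes, here is a hedged sketch of the model-based strategy it alludes to: a pre-trained pilot model scores each example's hardness (here, its log loss), and the hardest examples are retained. The function and variable names are hypothetical.

```python
import numpy as np

def hardness_subsample(pilot_probs, labels, keep_ratio=0.5):
    """Keep the hardest examples according to a pilot model's predictions.

    pilot_probs: (n,) pilot-model probabilities of the positive class
    labels: (n,) binary labels in {0, 1}
    keep_ratio: fraction of the data to retain
    """
    eps = 1e-12
    # Sample hardness measured as the pilot model's log loss on each example
    hardness = -(labels * np.log(pilot_probs + eps)
                 + (1 - labels) * np.log(1 - pilot_probs + eps))
    n_keep = max(1, int(keep_ratio * len(labels)))
    # Indices of the n_keep hardest examples
    return np.argsort(-hardness)[:n_keep]

# Toy usage
probs = np.array([0.9, 0.2, 0.6, 0.55])
y = np.array([1, 1, 0, 1])
print(hardness_subsample(probs, y, keep_ratio=0.5))
```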

Efficient and Degree-Guided Graph Generation via Discrete Diffusion Modeling

1 code implementation • 6 May 2023 • Xiaohui Chen, Jiaxing He, Xu Han, Li-Ping Liu

The empirical study shows that EDGE is much more efficient than competing methods and can generate large graphs with thousands of nodes.

Denoising • Graph Generation

NVDiff: Graph Generation through the Diffusion of Node Vectors

no code implementations • 19 Nov 2022 • Xiaohui Chen, Yukun Li, Aonan Zhang, Li-Ping Liu

Learning to generate graphs is challenging as a graph is a set of pairwise connected, unordered nodes encoding complex combinatorial structures.

Graph Generation

Towards Accurate Subgraph Similarity Computation via Neural Graph Pruning

1 code implementation • 19 Oct 2022 • Linfeng Liu, Xu Han, Dawei Zhou, Li-Ping Liu

In this work, we convert graph pruning to a problem of node relabeling and then relax it to a differentiable problem.

NovelCraft: A Dataset for Novelty Detection and Discovery in Open Worlds

2 code implementations • 23 Jun 2022 • Patrick Feeney, Sarah Schneider, Panagiotis Lymperopoulos, Li-Ping Liu, Matthias Scheutz, Michael C. Hughes

In order for artificial agents to successfully perform tasks in changing environments, they must be able to both detect and adapt to novelty.

Novelty Detection

Ensemble Spectral Prediction (ESP) Model for Metabolite Annotation

no code implementations • 25 Mar 2022 • Xinmeng Li, Hao Zhu, Li-Ping Liu, Soha Hassoun

We show that annotation performance, for ESP and other models, is a strong function of the number of molecules in the candidate set and their similarity to the target molecule.

Predicting Physics in Mesh-reduced Space with Temporal Attention

no code implementations • ICLR 2022 • Xu Han, Han Gao, Tobias Pfaff, Jian-Xun Wang, Li-Ping Liu

Graph-based next-step prediction models have recently been very successful in modeling complex high-dimensional physical systems on irregular meshes.

Boost-RS: Boosted Embeddings for Recommender Systems and its Application to Enzyme-Substrate Interaction Prediction

1 code implementation • 28 Sep 2021 • Xinmeng Li, Li-Ping Liu, Soha Hassoun

We show that each of our auxiliary tasks boosts learning of the embedding vectors, and that contrastive learning using Boost-RS outperforms attribute concatenation and multi-label learning.

Attribute • Auxiliary Learning • +4

Ladder Polynomial Neural Networks

no code implementations • 25 Jun 2021 • Li-Ping Liu, Ruiyuan Gu, Xiaozhe Hu

In particular, this work constructs polynomial feedforward neural networks using the product activation, a new activation function built from multiplications.
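
As a rough illustration of the idea in the snippet (not the paper's exact construction), the sketch below forms a product activation by multiplying two affine maps elementwise, so stacking layers doubles the polynomial degree of the representation. All names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def product_activation_layer(x, W1, b1, W2, b2):
    """Elementwise product of two affine maps: each output unit is a degree-2
    polynomial of the layer input, so stacking layers raises the degree."""
    return (x @ W1 + b1) * (x @ W2 + b2)

# Toy two-layer polynomial network on a batch of 4 inputs with 3 features
x = rng.normal(size=(4, 3))
W1a, W1b = rng.normal(size=(3, 5)), rng.normal(size=(3, 5))
W2a, W2b = rng.normal(size=(5, 2)), rng.normal(size=(5, 2))
h = product_activation_layer(x, W1a, np.zeros(5), W1b, np.zeros(5))  # degree 2 in x
y = product_activation_layer(h, W2a, np.zeros(2), W2b, np.zeros(2))  # degree 4 in x
print(y.shape)  # (4, 2)
```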

Stochastic Iterative Graph Matching

1 code implementation • 4 Jun 2021 • Linfeng Liu, Michael C. Hughes, Soha Hassoun, Li-Ping Liu

In this work, we propose a new model, Stochastic Iterative Graph MAtching (SIGMA), to address the graph matching problem.

Graph Matching • Stochastic Optimization

Modeling Graph Node Correlations with Neighbor Mixture Models

no code implementations • 29 Mar 2021 • Linfeng Liu, Michael C. Hughes, Li-Ping Liu

We propose a new model, the Neighbor Mixture Model (NMM), for modeling node labels in a graph.

Image Denoising • Link Prediction • +2

GAN Ensemble for Anomaly Detection

1 code implementation • 14 Dec 2020 • Xu Han, Xiaohui Chen, Li-Ping Liu

Motivated by the observation that GAN ensembles often outperform single GANs in generation tasks, we propose to construct GAN ensembles for anomaly detection.

Anomaly Detection

Learning graph representations of biochemical networks and its application to enzymatic link prediction

1 code implementation • 9 Feb 2020 • Julie Jiang, Li-Ping Liu, Soha Hassoun

In this work, we develop a technique, Enzymatic Link Prediction (ELP), for predicting the likelihood of an enzymatic transformation between two molecules.

Graph Embedding • Graph Learning • +1

Pathway-Activity Likelihood Analysis and Metabolite Annotation for Untargeted Metabolomics using Probabilistic Modeling

1 code implementation • 12 Dec 2019 • Ramtin Hosseini, Neda Hassanpour, Li-Ping Liu, Soha Hassoun

Annotation results are in agreement with those obtained using other tools that utilize additional information in the form of spectral signatures.

SecureGBM: Secure Multi-Party Gradient Boosting

no code implementations • 27 Nov 2019 • Zhi Feng, Haoyi Xiong, Chuanyuan Song, Sijia Yang, Baoxin Zhao, Licheng Wang, Zeyu Chen, Shengwen Yang, Li-Ping Liu, Jun Huan

Our experiments on real-world data showed that SecureGBM secures the communication and computation of LightGBM training and inference for both parties while losing less than 3% AUC, using the same number of gradient boosting iterations, across a wide range of benchmark datasets.

Parallel Distributed Logistic Regression for Vertical Federated Learning without Third-Party Coordinator

no code implementations • 22 Nov 2019 • Shengwen Yang, Bing Ren, Xuhui Zhou, Li-Ping Liu

The system is built on the parameter server architecture and aims to speed up model training by utilizing a cluster of servers when the volume of training data is large.

regression • Transfer Learning • +1

Non-Parametric Variational Inference with Graph Convolutional Networks for Gaussian Processes

no code implementations • 8 Sep 2018 • Linfeng Liu, Li-Ping Liu

Many recent inference methods approximate the posterior distribution with a simpler distribution defined on a small number of inducing points.

Gaussian Processes • Stochastic Optimization • +1
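
For background on the snippet above, a standard form of the inducing-point construction (in generic notation, not necessarily the parameterization used in this paper) augments the GP with $m \ll n$ inducing inputs $Z$ and values $u = f(Z)$, and posits a variational posterior of the form

```latex
q(f, u) = p(f \mid u)\, q(u), \qquad q(u) = \mathcal{N}(u \mid \mu, S),
```

so that inference scales with the number of inducing points rather than with the full dataset.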

Zero-Inflated Exponential Family Embeddings

no code implementations • ICML 2017 • Li-Ping Liu, David M. Blei

In this paper, we develop zero-inflated embeddings, a new embedding method that is designed to learn from sparse observations.

Word Embeddings

Transductive Optimization of Top k Precision

no code implementations • 20 Oct 2015 • Li-Ping Liu, Thomas G. Dietterich, Nan Li, Zhi-Hua Zhou

This paper introduces a new approach, Transductive Top K (TTK), that seeks to minimize the hinge loss over all training instances under the constraint that exactly $k$ test instances are predicted as positive.

Binary Classification • Information Retrieval • +1
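
Restating the snippet's optimization in generic notation (a paraphrase only, not a transcription of the paper's exact formulation), with labeled training pairs $(x_i, y_i)$, unlabeled test points $\tilde{x}_j$, and a scoring function $f$:

```latex
\min_{f} \; \sum_{i=1}^{n} \max\bigl(0,\, 1 - y_i f(x_i)\bigr)
\quad \text{subject to} \quad
\bigl|\{\, j : f(\tilde{x}_j) > 0 \,\}\bigr| = k .
```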

Gaussian Approximation of Collective Graphical Models

no code implementations • 20 May 2014 • Li-Ping Liu, Daniel Sheldon, Thomas G. Dietterich

The Collective Graphical Model (CGM) models a population of independent and identically distributed individuals when only collective statistics (i.e., counts of individuals) are observed.
