Search Results for author: Xiaopeng Li

Found 12 papers, 2 papers with code

Hard instance learning for quantum adiabatic prime factorization

no code implementations 10 Oct 2021 Jian Lin, Zhengfeng Zhang, Junping Zhang, Xiaopeng Li

Prime factorization is a hard problem for classical computing; its presumed classical hardness is the foundation of Rivest-Shamir-Adleman (RSA) cryptography.

Transfer Learning

Uncertainty Set Prediction of Aggregated Wind Power Generation based on Bayesian LSTM and Spatio-Temporal Analysis

no code implementations 7 Oct 2021 Xiaopeng Li, Jiang Wu, Zhanbo Xu, Kun Liu, Jun Yu, Xiaohong Guan

This paper focuses on the uncertainty set prediction of the aggregated generation of geographically distributed wind farms.

Quantum Adiabatic Doping for Atomic Fermi-Hubbard Quantum Simulations

no code implementations 5 Jan 2021 Jue Nan, Jian Lin, Yuchen Luo, Bo Zhao, Xiaopeng Li

Its feasibility has been demonstrated with numerical simulations of the adiabatic preparation for certain incommensurate particle-doping fractions, where the major problem to circumvent is the atomic localization in the incommensurate lattice.

Quantum Gases Strongly Correlated Electrons Quantum Physics

Not All Attention Is Needed: Gated Attention Network for Sequence Data

1 code implementation 1 Dec 2019 Lanqing Xue, Xiaopeng Li, Nevin L. Zhang

Attention mechanisms compute input-dependent dynamic attention weights for aggregating a sequence of hidden states.
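The aggregation step described above can be sketched in a few lines of NumPy. This is a minimal illustration of standard dot-product attention over a sequence of hidden states, not the gated variant the paper proposes; the array shapes and the query vector are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(hidden_states, query):
    """Aggregate hidden states with input-dependent attention weights."""
    scores = hidden_states @ query   # one scalar score per time step
    weights = softmax(scores)        # normalized attention weights
    return weights @ hidden_states   # weighted sum -> context vector

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))  # 5 time steps, hidden size 4 (assumed)
q = rng.standard_normal(4)       # illustrative query vector
context = attend(H, q)
print(context.shape)  # (4,)
```

Because the weights depend on the hidden states themselves, different inputs yield different aggregations; the paper's gated variant additionally learns to skip attention where it is not needed.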

Text Classification

Learning to Abstract for Memory-augmented Conversational Response Generation

1 code implementation ACL 2019 Zhiliang Tian, Wei Bi, Xiaopeng Li, Nevin L. Zhang

In this work, we propose a memory-augmented generative model that learns to abstract from the training corpus and saves useful information to the memory to assist response generation.

Conversational Response Generation Informativeness

Quantum Adiabatic Algorithm Design using Reinforcement Learning

no code implementations 27 Dec 2018 Jian Lin, Zhong Yuan Lai, Xiaopeng Li

We benchmark this approach on Grover-search and 3-SAT problems, and find that the adiabatic algorithm obtained by our RL approach leads to a significant improvement in the resultant success probability.

Reinforcement Learning

Review Helpfulness Assessment based on Convolutional Neural Network

no code implementations 27 Aug 2018 Xianshan Qu, Xiaopeng Li, John R. Rose

In this paper we describe the implementation of a convolutional neural network (CNN) used to assess online review helpfulness.

Neural Machine Translation Inspired Binary Code Similarity Comparison beyond Function Pairs

no code implementations 8 Aug 2018 Fei Zuo, Xiaopeng Li, Patrick Young, Lannan Luo, Qiang Zeng, Zhexin Zhang

The solutions to these two problems have many applications, such as cross-architecture vulnerability discovery and code plagiarism detection.

Machine Translation Translation

Learning Sparse Deep Feedforward Networks via Tree Skeleton Expansion

no code implementations 16 Mar 2018 Zhourong Chen, Xiaopeng Li, Nevin L. Zhang

An important characteristic of FNN structures learned this way is that they are sparse.

Learning Latent Superstructures in Variational Autoencoders for Deep Multidimensional Clustering

no code implementations ICLR 2019 Xiaopeng Li, Zhourong Chen, Leonard K. M. Poon, Nevin L. Zhang

We investigate a variant of variational autoencoders where there is a superstructure of discrete latent variables on top of the latent features.

Building Sparse Deep Feedforward Networks using Tree Receptive Fields

no code implementations 14 Mar 2018 Xiaopeng Li, Zhourong Chen, Nevin L. Zhang

We use the Chow-Liu algorithm to learn a tree-structured probabilistic model over the units at the current level, use the tree to identify subsets of units that are strongly correlated, and introduce a new unit with a receptive field over each subset.
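The tree-construction step above can be sketched as a maximum-spanning-tree problem over the units. This sketch uses absolute correlation as the edge weight, a simplified stand-in for the pairwise mutual information that the Chow-Liu algorithm actually uses; the data and helper names are illustrative assumptions, not the paper's code.

```python
import numpy as np

def correlation_tree(X):
    """Prim's algorithm for a maximum-spanning tree over the columns of X,
    with |correlation| as a simplified proxy for mutual information."""
    corr = np.abs(np.corrcoef(X, rowvar=False))  # pairwise |correlation|
    n = corr.shape[0]
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        best = None  # (from, to, weight) of the strongest crossing edge
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (best is None or corr[i, j] > best[2]):
                    best = (i, j, corr[i, j])
        edges.append((best[0], best[1]))
        in_tree.add(best[1])
    return edges

# three units: 0 and 1 nearly identical, 2 independent (synthetic data)
rng = np.random.default_rng(1)
a = rng.standard_normal(200)
X = np.column_stack([a, a + 0.1 * rng.standard_normal(200),
                     rng.standard_normal(200)])
tree = correlation_tree(X)
print(tree)  # the strongly correlated pair (0, 1) is joined first
```

Subtrees of the resulting tree then give the strongly correlated subsets over which new higher-level units receive their receptive fields.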

Learning Parsimonious Deep Feed-forward Networks

no code implementations ICLR 2018 Zhourong Chen, Xiaopeng Li, Nevin L. Zhang

Convolutional neural networks and recurrent neural networks are designed with network structures well suited to the nature of spatial and sequential data respectively.
