Search Results for author: Xinyan Dai

Found 8 papers, 6 papers with code

Norm-Ranging LSH for Maximum Inner Product Search

1 code implementation • NeurIPS 2018 • Xiao Yan, Jinfeng Li, Xinyan Dai, Hongzhi Chen, James Cheng

Neyshabur and Srebro proposed Simple-LSH, the state-of-the-art hashing method for maximum inner product search (MIPS) with a performance guarantee.
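
For context, Simple-LSH reduces MIPS to an angular similarity search via an asymmetric transformation of the data and queries. Below is a minimal sketch of that transformation under its standard description; the function names and the use of sign random projections as the downstream angular LSH are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def simple_lsh_transform(data, queries):
    """Reduce MIPS to angular search: scale data into the unit ball and
    append one coordinate so every transformed item has unit norm."""
    scale = np.linalg.norm(data, axis=1).max()         # largest data norm
    x = data / scale                                   # now ||x|| <= 1
    extra = np.sqrt(np.maximum(0.0, 1.0 - (x * x).sum(axis=1)))
    x_aug = np.hstack([x, extra[:, None]])             # P(x) = [x, sqrt(1 - ||x||^2)]
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    q_aug = np.hstack([q, np.zeros((len(q), 1))])      # Q(q) = [q, 0]
    return x_aug, q_aug                                # q_aug . x_aug is proportional to q . x

def sign_random_projection(vectors, n_bits, seed=0):
    """Standard angular LSH: one bit per random hyperplane."""
    rng = np.random.default_rng(seed)
    planes = rng.normal(size=(vectors.shape[1], n_bits))
    return (vectors @ planes > 0).astype(np.uint8)
```

Norm-Ranging LSH builds on this by splitting the dataset into norm ranges and applying the transformation with each range's local maximum norm instead of a single global scale, which reduces the distortion introduced by the scaling step.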

Norm-Range Partition: A Universal Catalyst for LSH based Maximum Inner Product Search (MIPS)

1 code implementation • 22 Oct 2018 • Xiao Yan, Xinyan Dai, Jie Liu, Kaiwen Zhou, James Cheng

Recently, locality-sensitive hashing (LSH) was shown to be effective for MIPS, and several algorithms, including $L_2$-ALSH, Sign-ALSH and Simple-LSH, have been proposed.
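
The "catalyst" idea can be sketched independently of the underlying LSH scheme: partition the dataset by norm, index each sub-dataset with an existing MIPS-LSH index, and merge the per-partition candidates at query time. The sketch below is a schematic outline of that reading; `build_mips_index` and `query_index` are hypothetical stand-ins for any of the listed algorithms.

```python
import numpy as np

def norm_range_partition(data, num_ranges):
    """Split item ids into sub-datasets whose members have similar norms."""
    order = np.argsort(np.linalg.norm(data, axis=1))   # sort items by norm
    return np.array_split(order, num_ranges)           # contiguous norm ranges

def build_partitioned_indexes(data, partitions, build_mips_index):
    """Build one MIPS-LSH index per norm range (done once, offline)."""
    return [build_mips_index(data[ids]) for ids in partitions]

def query_partitioned(query, data, partitions, indexes, query_index, k):
    """Probe every per-range index, then merge and re-rank candidates exactly."""
    candidates = []
    for ids, index in zip(partitions, indexes):
        local = np.asarray(query_index(index, query, k))  # positions within this range
        candidates.extend(ids[local])
    candidates = np.array(candidates)
    scores = data[candidates] @ query                     # exact inner products
    return candidates[np.argsort(-scores)[:k]]
```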

PMD: An Optimal Transportation-based User Distance for Recommender Systems

no code implementations • 10 Sep 2019 • Yitong Meng, Xinyan Dai, Xiao Yan, James Cheng, Weiwen Liu, Benben Liao, Jun Guo, Guangyong Chen

Collaborative filtering, a widely used recommendation technique, predicts a user's preference by aggregating the ratings from similar users.

Collaborative Filtering • Recommendation Systems
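
As a reference point for the aggregation step described above, here is a minimal user-based collaborative-filtering sketch with a pluggable user-distance function. A plain cosine distance is used as a placeholder; it is not the optimal-transport-based PMD distance proposed in the paper.

```python
import numpy as np

def cosine_distance(u, v):
    """Placeholder user distance; a learned or OT-based distance would replace this."""
    return 1.0 - (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)

def predict_rating(ratings, user, item, distance=cosine_distance, k=20):
    """Predict ratings[user, item] from the k nearest users who rated the item.

    `ratings` is a dense (num_users, num_items) array with 0 meaning 'unrated'.
    """
    rated = np.where(ratings[:, item] > 0)[0]          # users who rated the item
    rated = rated[rated != user]
    if len(rated) == 0:
        return ratings[ratings > 0].mean()             # global-mean fallback
    dists = np.array([distance(ratings[user], ratings[v]) for v in rated])
    nearest = rated[np.argsort(dists)[:k]]
    weights = 1.0 / (1.0 + np.sort(dists)[:k])         # closer users weigh more
    return float(np.average(ratings[nearest, item], weights=weights))
```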

Understanding and Improving Proximity Graph based Maximum Inner Product Search

no code implementations • 30 Sep 2019 • Jie Liu, Xiao Yan, Xinyan Dai, Zhirong Li, James Cheng, Ming-Chang Yang

Then we explain the good performance of ip-NSW as matching the norm bias of the MIPS problem: large-norm items have large in-degrees in the ip-NSW proximity graph, and a walk on the graph spends the majority of its computation on these items, which effectively avoids unnecessary computation on small-norm items.
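
To make the "walk on the graph" concrete, below is a generic greedy graph-walk sketch that scores neighbors by inner product, the basic search routine behind ip-NSW-style proximity graphs. The single entry point and purely greedy stopping rule are simplifications of the beam search used in real NSW/HNSW implementations.

```python
import numpy as np

def greedy_ip_walk(query, vectors, neighbors, entry_point):
    """Greedy walk: repeatedly move to the neighbor with the largest inner product.

    `neighbors[i]` is the adjacency list of node i in the proximity graph.
    Large-norm items tend to have large in-degrees, so the walk spends most of
    its inner-product evaluations on them.
    """
    current = entry_point
    best = float(vectors[current] @ query)
    while True:
        improved = False
        for nbr in neighbors[current]:
            score = float(vectors[nbr] @ query)
            if score > best:
                best, current, improved = score, nbr, True
        if not improved:                 # local optimum of the walk
            return current, best
```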

Norm-Explicit Quantization: Improving Vector Quantization for Maximum Inner Product Search

2 code implementations • 12 Nov 2019 • Xinyan Dai, Xiao Yan, Kelvin K. W. Ng, Jie Liu, James Cheng

In this paper, we present a new angle for analyzing the quantization error, decomposing it into a norm error and a direction error.

Data Compression • Quantization
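
The decomposition itself is easy to state: write x = ||x|| · (x / ||x||) and quantize the scalar norm and the unit-norm direction separately, so the two error sources can be controlled independently. The sketch below illustrates this with a simple scalar codebook for norms and a random unit codebook for directions; the actual norm-explicit quantization method pairs the explicit norms with existing vector-quantization techniques for the directions, so this is only a schematic.

```python
import numpy as np

def quantize_norm_and_direction(x, norm_codebook, direction_codebook):
    """Quantize the norm and the direction separately and return both indices."""
    norm = np.linalg.norm(x)
    direction = x / (norm + 1e-12)
    n_idx = int(np.argmin(np.abs(norm_codebook - norm)))      # scalar norm codeword
    d_idx = int(np.argmax(direction_codebook @ direction))    # closest unit codeword
    return n_idx, d_idx

def reconstruct(n_idx, d_idx, norm_codebook, direction_codebook):
    return norm_codebook[n_idx] * direction_codebook[d_idx]

# Toy usage: norm error and direction error can now be measured separately.
rng = np.random.default_rng(0)
x = rng.normal(size=16)
norm_cb = np.linspace(0.0, 8.0, 256)                          # 8-bit norm codebook
dir_cb = rng.normal(size=(256, 16))
dir_cb /= np.linalg.norm(dir_cb, axis=1, keepdims=True)       # unit-norm codewords
x_hat = reconstruct(*quantize_norm_and_direction(x, norm_cb, dir_cb), norm_cb, dir_cb)
```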

Hyper-Sphere Quantization: Communication-Efficient SGD for Federated Learning

1 code implementation • 12 Nov 2019 • Xinyan Dai, Xiao Yan, Kaiwen Zhou, Han Yang, Kelvin K. W. Ng, James Cheng, Yu Fan

In particular, at the high end of the compression-ratio range, HSQ provides a low per-iteration communication cost of $O(\log d)$, which is favorable for federated learning.

Federated Learning • Quantization
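
To see where a logarithmic per-iteration cost can come from, here is a generic codebook-based gradient-quantization sketch: each worker sends a scalar scale plus the index of the nearest codeword on the unit sphere, so the payload per vector is one float plus roughly log2(K) bits. This is a schematic of vector-quantized SGD in general, not a faithful implementation of HSQ.

```python
import numpy as np

def make_sphere_codebook(dim, size, seed=0):
    """Random unit-norm codebook shared by all workers via a fixed seed."""
    rng = np.random.default_rng(seed)
    cb = rng.normal(size=(size, dim))
    return cb / np.linalg.norm(cb, axis=1, keepdims=True)

def encode_gradient(grad, codebook):
    """Transmit only (norm, codeword index): one float plus log2(len(codebook)) bits."""
    norm = np.linalg.norm(grad)
    idx = int(np.argmax(codebook @ (grad / (norm + 1e-12))))
    return norm, idx

def decode_gradient(norm, idx, codebook):
    return norm * codebook[idx]

# Usage sketch (very lossy on a whole gradient; real schemes quantize blocks):
codebook = make_sphere_codebook(dim=1024, size=1024)   # index costs log2(1024) = 10 bits
g = np.random.default_rng(1).normal(size=1024)
g_hat = decode_gradient(*encode_gradient(g, codebook), codebook)
```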

Convolutional Embedding for Edit Distance

2 code implementations • 31 Jan 2020 • Xinyan Dai, Xiao Yan, Kaiwen Zhou, Yuxuan Wang, Han Yang, James Cheng

Edit-distance-based string similarity search has many applications, such as spell correction, data de-duplication, and sequence alignment.
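
As a rough illustration of the embedding idea, the sketch below maps one-hot encoded strings through a small 1-D convolutional network to fixed-size vectors whose Euclidean distances are intended to approximate edit distances after training, e.g. with a triplet loss. The architecture and hyperparameters here are placeholders, not the configuration from the paper.

```python
import torch
import torch.nn as nn

class StringCNNEmbedder(nn.Module):
    """Toy 1-D CNN that embeds a one-hot encoded string into a fixed-size vector."""

    def __init__(self, alphabet_size=4, embed_dim=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(alphabet_size, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # (batch, 32, 1), length-agnostic
        )
        self.fc = nn.Linear(32, embed_dim)

    def forward(self, one_hot):               # one_hot: (batch, alphabet_size, length)
        features = self.conv(one_hot).squeeze(-1)
        return self.fc(features)

# Training would pull embeddings of low-edit-distance strings together, e.g. with
# nn.TripletMarginLoss, so that Euclidean search over embeddings becomes a fast
# filter before exact edit-distance verification.
```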

Self-Enhanced GNN: Improving Graph Neural Networks Using Model Outputs

1 code implementation • 18 Feb 2020 • Han Yang, Xiao Yan, Xinyan Dai, Yongqiang Chen, James Cheng

In this paper, we propose self-enhanced GNN (SEG), which improves the quality of the input data using the outputs of existing GNN models, yielding better performance on semi-supervised node classification.

General Classification • Node Classification
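
The "improve the input data using model outputs" step can be illustrated with a simple variant: use a trained model's predicted labels to drop edges whose endpoints are predicted to belong to different classes, then retrain on the modified graph. The sketch below shows only that edge-filtering idea for an edge-list graph; it is a simplified reading, not the full SEG procedure.

```python
import numpy as np

def filter_edges_by_predictions(edge_index, predictions):
    """Keep only edges whose two endpoints receive the same predicted class.

    edge_index:  (2, num_edges) array of source/target node ids
    predictions: (num_nodes,) array of class labels predicted by a trained GNN
    """
    src, dst = edge_index
    same_class = predictions[src] == predictions[dst]
    return edge_index[:, same_class]

# Usage sketch: train a base GNN, predict labels for all nodes, filter the edge
# list, then train (or fine-tune) the GNN again on the cleaner graph.
```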
