Search Results for author: Yingxia Shao

Found 23 papers, 9 papers with code

Don't Waste Your Bits! Squeeze Activations and Gradients for Deep Neural Networks via TinyScript

no code implementations ICML 2020 Fangcheng Fu, Yuzheng Hu, Yihan He, Jiawei Jiang, Yingxia Shao, Ce Zhang, Bin Cui

Recent years have witnessed intensive research interest in training deep neural networks (DNNs) more efficiently via quantization-based compression methods, which facilitate DNN training in two ways: (1) activations are quantized to shrink memory consumption, and (2) gradients are quantized to decrease communication cost.

Quantization
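Both compression paths in the abstract, quantizing activations to save memory and gradients to save communication, can be illustrated with plain uniform quantization. This is a minimal generic sketch, not TinyScript's actual scheme; `quantize` and `dequantize` are illustrative helpers:

```python
import numpy as np

def quantize(x, bits=8):
    """Uniformly map float values onto `bits`-bit integer levels."""
    lo, hi = x.min(), x.max()
    levels = 2 ** bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    q = np.round((x - lo) / scale).astype(np.uint8)
    return q, lo, scale

def dequantize(q, lo, scale):
    """Recover approximate float values from the integer codes."""
    return q.astype(np.float32) * scale + lo

acts = np.random.randn(4, 16).astype(np.float32)
q, lo, scale = quantize(acts)          # 8-bit codes: 4x smaller than float32
recovered = dequantize(q, lo, scale)   # rounding error is at most scale / 2
```

Storing or transmitting the `uint8` codes instead of `float32` values cuts memory and traffic by 4x, at the price of a bounded rounding error of at most half a quantization step per value.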

Matching-oriented Embedding Quantization For Ad-hoc Retrieval

1 code implementation EMNLP 2021 Shitao Xiao, Zheng Liu, Yingxia Shao, Defu Lian, Xing Xie

In this work, we propose the Matching-oriented Product Quantization (MoPQ), where a novel objective Multinoulli Contrastive Loss (MCL) is formulated.

Quantization
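Product quantization, which MoPQ builds on, splits an embedding into sub-vectors and encodes each one as the index of its nearest codeword in a per-subspace codebook. The following is a generic PQ sketch only; the paper's contribution, training the codebooks with the Multinoulli Contrastive Loss, is not shown:

```python
import numpy as np

def pq_encode(x, codebooks):
    """codebooks: (n_sub, n_codes, sub_dim). Encode x subspace by subspace."""
    n_sub, n_codes, sub_dim = codebooks.shape
    subs = x.reshape(n_sub, sub_dim)
    # index of the nearest codeword in each subspace
    return np.array([np.argmin(((cb - s) ** 2).sum(axis=1))
                     for cb, s in zip(codebooks, subs)])

def pq_decode(codes, codebooks):
    """Reconstruct a vector by concatenating the selected codewords."""
    return np.concatenate([codebooks[m, c] for m, c in enumerate(codes)])

rng = np.random.default_rng(0)
codebooks = rng.standard_normal((4, 256, 8))          # 4 subspaces, 256 codewords each
x = pq_decode(np.array([3, 7, 200, 45]), codebooks)   # a vector on the codebook grid
codes = pq_encode(x, codebooks)                       # recovers the 4 codeword indices
```

Each 32-dimensional vector is thus stored as just four 1-byte codes, which is what makes PQ attractive for large-scale ad-hoc retrieval.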

Profiling and Evolution of Intellectual Property

no code implementations20 Apr 2022 Bowen Yu, Yingxia Shao, Ang Li

In recent years, with the rapid growth of Internet data, the number and types of scientific and technological resources are also rapidly expanding.

Research on Intellectual Property Resource Profile and Evolution Law

no code implementations13 Apr 2022 Yuhui Wang, Yingxia Shao, Ang Li

In the era of big data, intellectual-property-oriented scientific and technological resources exhibit large data scale, high information density, and low value density, which poses severe challenges to the effective use of intellectual property resources; meanwhile, the demand for mining the hidden information in intellectual property is increasing.

Retrieval of Scientific and Technological Resources for Experts and Scholars

no code implementations13 Apr 2022 Suyu Ouyang, Yingxia Shao, Ang Li

The scientific and technological resources of experts and scholars are mainly composed of basic attributes and scientific research achievements.

Relation Extraction Representation Learning

Scientific and Technological Text Knowledge Extraction Method Based on Word Mixing and GRU

no code implementations31 Mar 2022 Suyu Ouyang, Yingxia Shao, Junping Du, Ang Li

The knowledge extraction task is to extract triple relations (head entity-relation-tail entity) from unstructured text data.

Named Entity Recognition

An Intellectual Property Entity Recognition Method Based on Transformer and Technological Word Information

no code implementations21 Mar 2022 Yuhui Wang, Junping Du, Yingxia Shao

This paper proposes a method for extracting intellectual property entities based on Transformer and technological word information, and provides accurate word vector representations in combination with the BERT language model.

Named Entity Recognition

Web Page Content Extraction Based on Multi-feature Fusion

no code implementations21 Mar 2022 Bowen Yu, Junping Du, Yingxia Shao

With the rapid growth in the number and types of web resources, a single extraction strategy still cannot reliably extract the text content of different kinds of pages.

Space4HGNN: A Novel, Modularized and Reproducible Platform to Evaluate Heterogeneous Graph Neural Network

1 code implementation18 Feb 2022 Tianyu Zhao, Cheng Yang, Yibo Li, Quan Gan, Zhenyi Wang, Fengqi Liang, Huan Zhao, Yingxia Shao, Xiao Wang, Chuan Shi

Heterogeneous Graph Neural Networks (HGNNs) have been successfully employed in various tasks, but due to their diverse architectures and application scenarios, it is difficult to accurately assess the importance of different HGNN design dimensions.

Progressively Optimized Bi-Granular Document Representation for Scalable Embedding Based Retrieval

2 code implementations14 Jan 2022 Shitao Xiao, Zheng Liu, Weihao Han, Jianjin Zhang, Yingxia Shao, Defu Lian, Chaozhuo Li, Hao Sun, Denvy Deng, Liangjie Zhang, Qi Zhang, Xing Xie

In this work, we tackle this problem with Bi-Granular Document Representation, where the lightweight sparse embeddings are indexed and standby in memory for coarse-grained candidate search, and the heavyweight dense embeddings are hosted in disk for fine-grained post verification.

Quantization
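The coarse-to-fine pipeline described above can be sketched as a two-stage search: cheap lightweight codes screen candidates, and heavyweight dense vectors verify them. Binary codes and Hamming distance below are illustrative stand-ins for the paper's sparse/dense bi-granular representations:

```python
import numpy as np

def coarse_search(query_code, doc_codes, k):
    """Cheap candidate search over lightweight binary codes (Hamming distance)."""
    dists = (doc_codes != query_code).sum(axis=1)
    return np.argsort(dists)[:k]

def fine_rerank(query_vec, doc_vecs, candidates):
    """Expensive verification over heavyweight dense vectors (inner product)."""
    scores = doc_vecs[candidates] @ query_vec
    return candidates[np.argsort(-scores)]

rng = np.random.default_rng(0)
doc_vecs = rng.standard_normal((1000, 64))        # dense embeddings ("hosted on disk")
doc_codes = (doc_vecs > 0).astype(np.uint8)       # lightweight codes ("standby in memory")
query_vec = doc_vecs[42] + 0.01 * rng.standard_normal(64)
query_code = (query_vec > 0).astype(np.uint8)

cands = coarse_search(query_code, doc_codes, k=50)   # coarse-grained candidate search
ranked = fine_rerank(query_vec, doc_vecs, cands)     # fine-grained post verification
```

Only the 50 candidates ever touch the expensive dense representation, which is the point of keeping the two granularities in different storage tiers.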

LECF: Recommendation via Learnable Edge Collaborative Filtering

1 code implementation Science China Information Sciences 2021 Shitao Xiao, Yingxia Shao, Yawen Li, Hongzhi Yin, Yanyan Shen, Bin Cui

In this paper, we model an interaction between user and item as an edge and propose a novel CF framework, called learnable edge collaborative filtering (LECF).

Collaborative Filtering

Self-Supervised Graph Co-Training for Session-based Recommendation

1 code implementation24 Aug 2021 Xin Xia, Hongzhi Yin, Junliang Yu, Yingxia Shao, Lizhen Cui

In this paper, for informative session-based data augmentation, we combine self-supervised learning with co-training, and then develop a framework to enhance session-based recommendation.

Contrastive Learning Data Augmentation +2

Memory-aware framework for fast and scalable second-order random walk over billion-edge natural graphs

no code implementations The VLDB Journal 2021 Yingxia Shao, Shiyue Huang, Yawen Li, Xupeng Miao, Bin Cui, Lei Chen

In this paper, to clearly compare the efficiency of various node sampling methods, we first design a cost model and propose two new node sampling methods: one follows the acceptance-rejection paradigm to strike a better balance between memory and time cost, and the other is optimized for fast sampling of the skewed probability distributions that exist in natural graphs.

Community Detection Graph Embedding
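Acceptance-rejection sampling, the paradigm one of the paper's node samplers follows, needs no precomputed alias table: it proposes uniformly and accepts with probability proportional to the target weight. A generic sketch, not the paper's optimized implementation:

```python
import random

def ar_sample(weights, rng=random):
    """Draw index i with probability weights[i] / sum(weights):
    propose uniformly, accept with probability weights[i] / max(weights)."""
    w_max = max(weights)
    while True:
        i = rng.randrange(len(weights))          # O(1) uniform proposal, O(1) memory
        if rng.random() < weights[i] / w_max:    # accept step
            return i

weights = [100, 1, 1, 1]   # skewed, like degree-biased weights in a natural graph
rng = random.Random(0)
counts = [0] * 4
for _ in range(10000):
    counts[ar_sample(weights, rng)] += 1       # counts[0] ends up near 100/103 of draws
```

The memory/time trade-off is visible here: no auxiliary table is stored, but heavily skewed weights raise the rejection rate, which is exactly the balance the cost model in the paper is meant to capture.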

Matching-oriented Product Quantization For Ad-hoc Retrieval

2 code implementations16 Apr 2021 Shitao Xiao, Zheng Liu, Yingxia Shao, Defu Lian, Xing Xie

In this work, we propose the Matching-oriented Product Quantization (MoPQ), where a novel objective Multinoulli Contrastive Loss (MCL) is formulated.

Quantization

Training Large-Scale News Recommenders with Pretrained Language Models in the Loop

no code implementations18 Feb 2021 Shitao Xiao, Zheng Liu, Yingxia Shao, Tao Di, Xing Xie

Secondly, it improves the data efficiency of the training workflow, where non-informative data can be eliminated from encoding.

News Recommendation Pretrained Language Models +1

Efficient Automatic CASH via Rising Bandits

no code implementations8 Dec 2020 Yang Li, Jiawei Jiang, Jinyang Gao, Yingxia Shao, Ce Zhang, Bin Cui

In this framework, BO methods are used to solve the HPO problem for each ML algorithm separately, so each BO run operates over a much smaller hyperparameter space.

Hyperparameter Optimization Multi-Armed Bandits

UniNet: Scalable Network Representation Learning with Metropolis-Hastings Sampling

1 code implementation10 Oct 2020 Xingyu Yao, Yingxia Shao, Bin Cui, Lei Chen

Finally, with the new edge sampler and random walk model abstraction, we carefully implement a scalable NRL framework called UniNet.

Representation Learning
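Metropolis-Hastings sampling, the technique behind UniNet's edge sampler, chooses the next step of a walk by proposing a candidate and accepting it with a weight ratio, so no per-node alias structure is needed. A generic MH step over a weighted neighbor list (an illustrative sketch, not UniNet's implementation):

```python
import random

def mh_next(current_idx, weights, rng=random):
    """One Metropolis-Hastings step over the categorical target
    weights[i] / sum(weights): propose uniformly, accept with
    probability min(1, w_candidate / w_current)."""
    cand = rng.randrange(len(weights))
    if rng.random() < min(1.0, weights[cand] / weights[current_idx]):
        return cand
    return current_idx

weights = [4.0, 2.0, 1.0, 1.0]   # e.g. edge weights toward a node's neighbors
rng = random.Random(1)
state, counts = 0, [0] * 4
for _ in range(20000):
    state = mh_next(state, weights, rng)
    counts[state] += 1           # long-run visit frequencies approach weights / sum
```

The chain's stationary distribution is proportional to the edge weights (detailed balance holds with the uniform proposal), so repeated MH steps sample weighted walks without materializing per-node distributions.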

DeGNN: Characterizing and Improving Graph Neural Networks with Graph Decomposition

no code implementations10 Oct 2019 Xupeng Miao, Nezihe Merve Gürel, Wentao Zhang, Zhichao Han, Bo Li, Wei Min, Xi Rao, Hansheng Ren, Yinan Shan, Yingxia Shao, Yujie Wang, Fan Wu, Hui Xue, Yaming Yang, Zitao Zhang, Yang Zhao, Shuai Zhang, Yujing Wang, Bin Cui, Ce Zhang

Despite the wide application of Graph Convolutional Networks (GCNs), one major limitation is that they do not benefit from increased depth and suffer from the oversmoothing problem.

An Experimental Evaluation of Large Scale GBDT Systems

no code implementations3 Jul 2019 Fangcheng Fu, Jiawei Jiang, Yingxia Shao, Bin Cui

Gradient boosting decision tree (GBDT) is a widely-used machine learning algorithm in both data analytic competitions and real-world industrial applications.

NSCaching: Simple and Efficient Negative Sampling for Knowledge Graph Embedding

5 code implementations16 Dec 2018 Yongqi Zhang, Quanming Yao, Yingxia Shao, Lei Chen

Negative sampling, which samples negative triplets from non-observed ones in the training data, is an important step in KG embedding.

Knowledge Graph Embedding Link Prediction
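Negative sampling as described, drawing corrupted triplets that are not observed in the training data, can be sketched as follows. This shows only the baseline step; NSCaching's contribution, maintaining a cache of high-quality negatives, is not modeled here, and the entity names are made up for illustration:

```python
import random

def sample_negative(triplet, entities, observed, rng=random):
    """Corrupt the head or tail of (h, r, t) until the result
    is a triplet not observed in the training data."""
    h, r, t = triplet
    while True:
        e = rng.choice(entities)
        neg = (e, r, t) if rng.random() < 0.5 else (h, r, e)
        if neg not in observed and neg != triplet:
            return neg

observed = {("alice", "knows", "bob"), ("bob", "knows", "carol")}
entities = ["alice", "bob", "carol", "dave"]
rng = random.Random(0)
neg = sample_negative(("alice", "knows", "bob"), entities, observed, rng)
```

Each positive triplet is paired with such corrupted negatives during training, so the embedding model learns to score observed triplets above non-observed ones.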
