Search Results for author: Jiahao Liu

Found 6 papers, 1 paper with code

AutoDisc: Automatic Distillation Schedule for Large Language Model Compression

No code implementations · 29 May 2022 · Chen Zhang, Yang Yang, Qifan Wang, Jiahao Liu, Jingang Wang, Wei Wu, Dawei Song

As the connection between the two, the scale and performance of the teacher assistant are crucial for transferring knowledge from the teacher to the student.
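The teacher-assistant idea above can be sketched as two chained distillation steps, where each smaller model matches the temperature-softened output distribution of the larger one. This is a minimal NumPy illustration, not the AutoDisc implementation; the logits, temperature, and function names are hypothetical.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-softened softmax; higher T flattens the distribution.
    z = np.exp((logits - logits.max(axis=-1, keepdims=True)) / T)
    return z / z.sum(axis=-1, keepdims=True)

def distill_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # as is conventional in knowledge distillation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)

# Two-stage transfer: teacher -> teacher assistant -> student.
teacher_logits = np.array([[4.0, 1.0, 0.0]])
assistant_logits = np.array([[3.0, 1.2, 0.2]])
student_logits = np.array([[2.0, 1.5, 0.5]])

loss_ta = distill_loss(assistant_logits, teacher_logits)  # assistant learns from teacher
loss_st = distill_loss(student_logits, assistant_logits)  # student learns from assistant
```

The intermediate model bridges the capacity gap: the student matches the assistant's softer, closer-scale distribution instead of the teacher's directly.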

Knowledge Distillation · Language Modelling · +1

GNN-encoder: Learning a Dual-encoder Architecture via Graph Neural Networks for Passage Retrieval

No code implementations · 18 Apr 2022 · Jiduan Liu, Jiahao Liu, Yang Yang, Jingang Wang, Wei Wu, Dongyan Zhao, Rui Yan

To enhance the performance of dense retrieval models without loss of efficiency, we propose a GNN-encoder model in which query (passage) information is fused into passage (query) representations via graph neural networks constructed from queries and their top retrieved passages.
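The fusion step described above can be sketched as one round of message passing in which a query node aggregates the embeddings of its top retrieved passages. This is a simplified NumPy sketch with mean aggregation and a hypothetical mixing weight `alpha`, not the paper's GNN-encoder architecture.

```python
import numpy as np

def fuse_query(query_vec, passage_vecs, alpha=0.5):
    # One round of message passing on the query-passage graph:
    # the query node averages its passage neighbours' embeddings
    # and mixes the result into its own representation.
    neighbour_msg = passage_vecs.mean(axis=0)
    return (1.0 - alpha) * query_vec + alpha * neighbour_msg

rng = np.random.default_rng(0)
q = rng.normal(size=8)          # query embedding
P = rng.normal(size=(3, 8))     # top-3 retrieved passage embeddings
q_fused = fuse_query(q, P)
```

The symmetric case (fusing query information into passage representations) follows the same pattern with the roles of the nodes swapped.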

Passage Retrieval

VECO: Variable and Flexible Cross-lingual Pre-training for Language Understanding and Generation

1 code implementation · ACL 2021 · Fuli Luo, Wei Wang, Jiahao Liu, Yijia Liu, Bin Bi, Songfang Huang, Fei Huang, Luo Si

Existing work in multilingual pretraining has demonstrated the potential of cross-lingual transferability by training a unified Transformer encoder for multiple languages.

Language Modelling · Question Answering · +1

VECO: Variable Encoder-decoder Pre-training for Cross-lingual Understanding and Generation

No code implementations · 28 Sep 2020 · Fuli Luo, Wei Wang, Jiahao Liu, Yijia Liu, Bin Bi, Songfang Huang, Fei Huang, Luo Si

Recent studies about learning multilingual representations have achieved significant performance gains across a wide range of downstream cross-lingual tasks.

Language Modelling · Masked Language Modeling · +2

Community-preserving Graph Convolutions for Structural and Functional Joint Embedding of Brain Networks

No code implementations · 8 Nov 2019 · Jiahao Liu, Guixiang Ma, Fei Jiang, Chun-Ta Lu, Philip S. Yu, Ann B. Ragin

Specifically, we use graph convolutions to learn the structural and functional joint embedding, where the graph structure is defined with structural connectivity and node features are from the functional connectivity.
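The setup described above — a graph whose edges come from structural connectivity and whose node features come from functional connectivity — can be illustrated with a single standard graph-convolution layer. This is a generic GCN sketch under that assumption, not the paper's community-preserving model; the matrices below are toy values.

```python
import numpy as np

def gcn_layer(A, X, W):
    # One graph-convolution layer: ReLU(D^{-1/2} (A + I) D^{-1/2} X W).
    # A: adjacency from structural connectivity; X: functional node features.
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))      # symmetric normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])                    # structural connectivity (toy)
X = np.array([[0.2, 0.8],
              [0.5, 0.5],
              [0.9, 0.1]])                      # functional features per node (toy)
W = np.eye(2)                                   # identity weights for illustration
H = gcn_layer(A, X, W)                          # joint embedding, one node per brain region
```

Stacking such layers lets each region's embedding mix structural neighbourhood information with functional signals.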

Multi-view Learning

A Planning based Framework for Essay Generation

No code implementations · 18 Dec 2015 · Bing Qin, Duyu Tang, Xinwei Geng, Dandan Ning, Jiahao Liu, Ting Liu

Generating an article automatically with a computer program is a challenging task in artificial intelligence and natural language processing.

Natural Language Processing
