Search Results for author: Peng Shi

Found 27 papers, 12 papers with code

Better Language Model with Hypernym Class Prediction

1 code implementation ACL 2022 He Bai, Tong Wang, Alessandro Sordoni, Peng Shi

Class-based language models (LMs) have long been devised to address context sparsity in $n$-gram LMs.
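The class-based factorization referenced here typically estimates $p(w) = p(c(w)) \cdot p(w \mid c(w))$, pooling counts at the class level. A minimal illustrative sketch with a toy corpus and a hypothetical word-to-class (e.g. hypernym) mapping (not the paper's actual model):

```python
from collections import Counter

# Toy corpus and a hypothetical word -> class (e.g. hypernym) mapping.
corpus = ["the", "dog", "ran", "the", "cat", "ran", "the", "dog", "slept"]
word_class = {"dog": "animal", "cat": "animal", "ran": "action",
              "slept": "action", "the": "det"}

# Class counts and word counts over the toy corpus.
class_counts = Counter(word_class[w] for w in corpus)
word_counts = Counter(corpus)

def p_word(word):
    """Class-based estimate: p(w) = p(class(w)) * p(w | class(w))."""
    c = word_class[word]
    p_class = class_counts[c] / len(corpus)
    p_word_given_class = word_counts[word] / class_counts[c]
    return p_class * p_word_given_class

# "dog" and "cat" share the "animal" class, so evidence is pooled at
# the class level, mitigating sparsity for rare class members.
print(round(p_word("cat"), 3))  # 1/9 ~ 0.111
```

The same idea extends to conditional $n$-gram contexts; this unigram version only shows the two-factor decomposition.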

Language Modelling

Semi-global Periodic Event-triggered Output Regulation for Nonlinear Multi-agent Systems

no code implementations 4 Jan 2022 Shiqi Zheng, Peng Shi, Huiyan Zhang

This study focuses on periodic event-triggered (PET) cooperative output regulation problem for a class of nonlinear multi-agent systems.

Hierarchical Character Tagger for Short Text Spelling Error Correction

no code implementations WNUT (ACL) 2021 Mengyi Gao, Canran Xu, Peng Shi

State-of-the-art approaches to the spelling error correction problem include Transformer-based Seq2Seq models, which require large training sets and suffer from slow inference, and sequence labeling models built on Transformer encoders such as BERT, which operate over a token-level label space and therefore require a large pre-defined vocabulary dictionary.

Language Modelling

Prefix-to-SQL: Text-to-SQL Generation from Incomplete User Questions

no code implementations 15 Sep 2021 Naihao Deng, Shuaichen Chang, Peng Shi, Tao Yu, Rui Zhang

Existing text-to-SQL research considers only complete questions as input, but lay users may struggle to formulate a complete question.


Mr. TyDi: A Multi-lingual Benchmark for Dense Retrieval

1 code implementation EMNLP (MRL) 2021 Xinyu Zhang, Xueguang Ma, Peng Shi, Jimmy Lin

We present Mr. TyDi, a multi-lingual benchmark dataset for mono-lingual retrieval in eleven typologically diverse languages, designed to evaluate ranking with learned dense representations.

Representation Learning

Logic-Consistency Text Generation from Semantic Parses

1 code implementation Findings (ACL) 2021 Chang Shu, Yusen Zhang, Xiangyu Dong, Peng Shi, Tao Yu, Rui Zhang

Text generation from semantic parses aims to produce textual descriptions of formal representation inputs such as logic forms and SQL queries.

Text Generation

End-to-End Cross-Domain Text-to-SQL Semantic Parsing with Auxiliary Task

no code implementations 17 Jun 2021 Peng Shi, Tao Yu, Patrick Ng, Zhiguo Wang

Furthermore, we propose two value filling methods to bridge existing zero-shot semantic parsers to real-world applications, since most existing parsers ignore value filling in the synthesized SQL.

Semantic Parsing Text-to-SQL

Learning Contextual Representations for Semantic Parsing with Generation-Augmented Pre-Training

3 code implementations 18 Dec 2020 Peng Shi, Patrick Ng, Zhiguo Wang, Henghui Zhu, Alexander Hanbo Li, Jun Wang, Cicero Nogueira dos Santos, Bing Xiang

Most recently, there has been significant interest in learning contextual representations for various NLP tasks, by leveraging large scale text corpora to train large neural language models with self-supervised learning objectives, such as Masked Language Model (MLM).
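The Masked Language Model (MLM) objective mentioned above masks a fraction of input tokens and trains the model to recover them. A simplified sketch of the masking step (standard 15% rate, omitting BERT's 80/10/10 mask/random/keep split; `MASK_ID` is a hypothetical vocabulary id):

```python
import random

MASK_ID = 103     # [MASK] token id (assumption; BERT-style vocabulary)
MASK_PROB = 0.15  # standard MLM masking rate

def mask_tokens(token_ids, rng):
    """Return (masked inputs, labels); label -100 marks positions
    that are not predicted and are ignored by the loss."""
    inputs, labels = [], []
    for tid in token_ids:
        if rng.random() < MASK_PROB:
            inputs.append(MASK_ID)  # replace with [MASK]
            labels.append(tid)      # model must recover the original id
        else:
            inputs.append(tid)
            labels.append(-100)     # not scored
    return inputs, labels

rng = random.Random(0)
masked, labels = mask_tokens(list(range(20)), rng)
```

A real pre-training pipeline would batch these pairs and feed them to a Transformer with a cross-entropy loss over the masked positions only.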

Language Modelling Self-Supervised Learning +2

Cross-Lingual Training of Neural Models for Document Ranking

no code implementations Findings of the Association for Computational Linguistics 2020 Peng Shi, He Bai, Jimmy Lin

We tackle the challenge of cross-lingual training of neural document ranking models for mono-lingual retrieval, specifically leveraging relevance judgments in English to improve search in non-English languages.

Document Ranking

Did You Ask a Good Question? A Cross-Domain Question Intention Classification Benchmark for Text-to-SQL

1 code implementation 23 Oct 2020 Yusen Zhang, Xiangyu Dong, Shuaichen Chang, Tao Yu, Peng Shi, Rui Zhang

Neural models have achieved significant results on the text-to-SQL task, where most current work assumes all input questions are legal and generates a SQL query for any input.


Derivation of Elastic Wave Equation from New Motion Description

no code implementations 27 May 2020 Peng Shi

In classical mechanics, the motion of an object is described with Newton's three laws of motion, which means that the motion of the material elements composing a continuum can be described with the particle model.

Classical Physics Materials Science Fluid Dynamics

Segatron: Segment-Aware Transformer for Language Modeling and Understanding

1 code implementation 30 Apr 2020 He Bai, Peng Shi, Jimmy Lin, Yuqing Xie, Luchen Tan, Kun Xiong, Wen Gao, Ming Li

To verify this, we propose a segment-aware Transformer (Segatron), replacing the original token position encoding with a combined position encoding of paragraph, sentence, and token.
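The combined encoding described here assigns each token three indices (its paragraph, its sentence within that paragraph, and its position within that sentence), each of which would look up a separate embedding to be summed. A minimal sketch of the index computation (a simplified reading; embedding details assumed):

```python
def segment_positions(paragraphs):
    """For each token, return (paragraph index, sentence index, token index).

    `paragraphs` is a list of paragraphs; each paragraph is a list of
    sentences; each sentence is a list of tokens. In a Segatron-style
    model, each of the three indices would index its own embedding
    table, and the three embeddings would be summed per token.
    """
    positions = []
    for p_idx, paragraph in enumerate(paragraphs):
        for s_idx, sentence in enumerate(paragraph):
            for t_idx, _token in enumerate(sentence):
                positions.append((p_idx, s_idx, t_idx))
    return positions

doc = [[["Hello", "world", "."], ["Bye", "."]],  # paragraph 0: two sentences
       [["New", "paragraph", "."]]]              # paragraph 1: one sentence
print(segment_positions(doc))
```

Note how the sentence and token indices reset at each paragraph and sentence boundary, which is what distinguishes this scheme from a single flat token position.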

Language Modelling Masked Language Modeling +1

Cross-Lingual Relevance Transfer for Document Retrieval

no code implementations 8 Nov 2019 Peng Shi, Jimmy Lin

Recent work has shown the surprising ability of multi-lingual BERT to serve as a zero-shot cross-lingual transfer model for a number of language processing tasks.

Zero-Shot Cross-Lingual Transfer

Simple Attention-Based Representation Learning for Ranking Short Social Media Posts

no code implementations NAACL 2019 Peng Shi, Jinfeng Rao, Jimmy Lin

This paper explores the problem of ranking short social media posts with respect to user queries using neural networks.

Representation Learning

Farewell Freebase: Migrating the SimpleQuestions Dataset to DBpedia

1 code implementation COLING 2018 Michael Azmy, Peng Shi, Jimmy Lin, Ihab Ilyas

To address this problem, we present SimpleDBpediaQA, a new benchmark dataset for simple question answering over knowledge graphs that was created by mapping SimpleQuestions entities and predicates from Freebase to DBpedia.

Knowledge Graphs Question Answering +1

Strong Baselines for Simple Question Answering over Knowledge Graphs with and without Neural Networks

no code implementations NAACL 2018 Salman Mohammed, Peng Shi, Jimmy Lin

We examine the problem of question answering over knowledge graphs, focusing on simple questions that can be answered by the lookup of a single fact.

Entity Linking Knowledge Graphs +1
