Search Results for author: Shuaichen Chang

Found 8 papers, 6 papers with code

Selective Demonstrations for Cross-domain Text-to-SQL

1 code implementation · 10 Oct 2023 · Shuaichen Chang, Eric Fosler-Lussier

Large language models (LLMs) with in-context learning have demonstrated impressive generalization capabilities in the cross-domain text-to-SQL task, without the use of in-domain annotations.

Tasks: In-Context Learning, Text-to-SQL
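The abstract above concerns choosing in-context demonstrations for cross-domain text-to-SQL. A toy sketch of the general idea follows; the Jaccard-overlap metric, the `select_demonstrations` helper, and the example pool are illustrative assumptions, not the paper's actual selection method.

```python
# Hypothetical sketch: pick the k annotated (question, SQL) pairs most
# similar to a new question to use as in-context demonstrations.
# Token-level Jaccard overlap is an illustrative stand-in similarity.

def jaccard(a: str, b: str) -> float:
    """Token-level Jaccard similarity between two questions."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def select_demonstrations(question, pool, k=2):
    """Return the k (question, SQL) pairs most similar to `question`."""
    ranked = sorted(pool, key=lambda ex: jaccard(question, ex[0]), reverse=True)
    return ranked[:k]

pool = [
    ("How many singers do we have?", "SELECT count(*) FROM singer"),
    ("List the names of all teachers.", "SELECT name FROM teacher"),
    ("How many concerts are there?", "SELECT count(*) FROM concert"),
]
demos = select_demonstrations("How many teachers do we have?", pool, k=2)
```

The selected pairs would then be prepended, in some prompt format, before the new question.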

How to Prompt LLMs for Text-to-SQL: A Study in Zero-shot, Single-domain, and Cross-domain Settings

1 code implementation · 19 May 2023 · Shuaichen Chang, Eric Fosler-Lussier

Large language models (LLMs) with in-context learning have demonstrated remarkable capability in the text-to-SQL task.

Tasks: In-Context Learning, Retrieval, +1
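The paper above studies how to represent a database in a zero-shot text-to-SQL prompt. A minimal sketch of one common construction is below, serializing the schema as CREATE TABLE statements before the question; the exact layout, the `build_prompt` helper, and the trailing `SELECT` cue are assumptions for illustration, not the format the paper recommends.

```python
# Hypothetical sketch of a zero-shot text-to-SQL prompt: serialize the
# database schema as CREATE TABLE statements, then append the question
# and a "SELECT" cue for the LLM to complete.

def build_prompt(tables: dict, question: str) -> str:
    """`tables` maps table name -> list of 'column TYPE' strings."""
    schema_lines = [
        f"CREATE TABLE {name} ({', '.join(cols)});"
        for name, cols in tables.items()
    ]
    schema = "\n".join(schema_lines)
    return f"{schema}\n\n-- Question: {question}\nSELECT"

prompt = build_prompt(
    {"singer": ["singer_id INT", "name TEXT", "age INT"]},
    "How many singers are older than 30?",
)
```

The model's completion would be concatenated after the final `SELECT` to form the predicted query.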

MapQA: A Dataset for Question Answering on Choropleth Maps

1 code implementation · 15 Nov 2022 · Shuaichen Chang, David Palzer, Jialin Li, Eric Fosler-Lussier, Ningchuan Xiao

Our experimental results show that V-MODEQA achieves better overall performance and robustness on MapQA than state-of-the-art ChartQA and VQA algorithms by capturing the unique properties of map question answering.

Tasks: Question Answering, Visual Question Answering

Prefix-to-SQL: Text-to-SQL Generation from Incomplete User Questions

no code implementations · 15 Sep 2021 · Naihao Deng, Shuaichen Chang, Peng Shi, Tao Yu, Rui Zhang

Existing text-to-SQL research only considers complete questions as input, but lay users might struggle to formulate a complete question.


Did You Ask a Good Question? A Cross-Domain Question Intention Classification Benchmark for Text-to-SQL

1 code implementation · 23 Oct 2020 · Yusen Zhang, Xiangyu Dong, Shuaichen Chang, Tao Yu, Peng Shi, Rui Zhang

Neural models have achieved significant results on the text-to-SQL task, where most current work assumes all input questions are legal and generates a SQL query for any input.


Zero-shot Text-to-SQL Learning with Auxiliary Task

1 code implementation · 29 Aug 2019 · Shuaichen Chang, Pengfei Liu, Yun Tang, Jing Huang, Xiaodong He, Bo-Wen Zhou

Recent years have seen great success in the use of neural seq2seq models on the text-to-SQL task.


Contextualized Non-local Neural Networks for Sequence Learning

no code implementations · 21 Nov 2018 · Pengfei Liu, Shuaichen Chang, Xuanjing Huang, Jian Tang, Jackie Chi Kit Cheung

Recently, a large number of neural mechanisms and models have been proposed for sequence learning, of which self-attention, as exemplified by the Transformer model, and graph neural networks (GNNs) have attracted much attention.

Tasks: General Classification, Sentence, +2
