no code implementations • 27 May 2023 • Sijia Wang, Alexander Hanbo Li, Henry Zhu, Sheng Zhang, Chung-Wei Hang, Pramuditha Perera, Jie Ma, William Wang, Zhiguo Wang, Vittorio Castelli, Bing Xiang, Patrick Ng
Entities can be expressed in diverse formats, such as texts, images, or column names and cell values in tables.
1 code implementation • 25 May 2023 • Wuwei Lan, Zhiguo Wang, Anuj Chauhan, Henghui Zhu, Alexander Li, Jiang Guo, Sheng Zhang, Chung-Wei Hang, Joseph Lilien, Yiqun Hu, Lin Pan, Mingwen Dong, Jun Wang, Jiarong Jiang, Stephen Ash, Vittorio Castelli, Patrick Ng, Bing Xiang
A practical text-to-SQL system should generalize well on a wide variety of natural language questions, unseen database schemas, and novel SQL query structures.
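To make the task concrete, here is a toy illustration of what a text-to-SQL system must produce: a natural-language question paired with an executable SQL query over a schema the model may never have seen. The schema, data, and question below are invented for illustration and are not drawn from any benchmark in the paper.

```python
import sqlite3

# Toy database schema and rows (hypothetical, for illustration only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE singer (name TEXT, age INTEGER)")
conn.executemany("INSERT INTO singer VALUES (?, ?)",
                 [("Ana", 34), ("Ben", 28), ("Cleo", 41)])

# The text-to-SQL task: map the question to the SQL query.
question = "How many singers are older than 30?"
sql = "SELECT COUNT(*) FROM singer WHERE age > 30"
print(conn.execute(sql).fetchone()[0])  # 2 (Ana and Cleo)
```

Generalization here means handling new question phrasings, unseen schemas (different table and column names), and novel query structures such as joins or nested queries.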
no code implementations • 17 Dec 2022 • Yiyun Zhao, Jiarong Jiang, Yiqun Hu, Wuwei Lan, Henry Zhu, Anuj Chauhan, Alexander Li, Lin Pan, Jun Wang, Chung-Wei Hang, Sheng Zhang, Marvin Dong, Joe Lilien, Patrick Ng, Zhiguo Wang, Vittorio Castelli, Bing Xiang
In this paper, we first examined the existing synthesized datasets and discovered that state-of-the-art text-to-SQL algorithms did not further improve on popular benchmarks when trained with augmented synthetic data.
no code implementations • 21 Jul 2021 • Lin Pan, Chung-Wei Hang, Avirup Sil, Saloni Potdar
We propose a simple and general method to regularize the fine-tuning of Transformer-based encoders for text classification tasks.
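The abstract does not specify the regularizer here, but as a generic illustration of regularized fine-tuning, one common approach penalizes how far the fine-tuned encoder weights drift from their pretrained values. The function name and `lam` coefficient below are placeholders, not the paper's method.

```python
import numpy as np

def reg_toward_pretrained(current, pretrained, lam=0.01):
    """Generic L2 penalty pulling fine-tuned weights back toward
    the pretrained values (illustrative only; not necessarily the
    regularizer proposed in the paper)."""
    return lam * sum(np.sum((c - p) ** 2)
                     for c, p in zip(current, pretrained))
```

The penalty is zero when the weights equal their pretrained values and grows quadratically with drift, which discourages catastrophic forgetting during fine-tuning.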
no code implementations • NAACL 2021 • Lin Pan, Chung-Wei Hang, Haode Qi, Abhishek Shah, Saloni Potdar, Mo Yu
We propose a simple method to align multilingual contextual embeddings as a post-pretraining step for improved zero-shot cross-lingual transferability of the pretrained models.
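One standard way to align embedding spaces post hoc, shown here purely as background and not as the paper's specific method, is orthogonal Procrustes: given parallel embedding matrices from two languages, solve for the rotation that maps one onto the other.

```python
import numpy as np

def procrustes_align(X, Y):
    """Orthogonal Procrustes: find W minimizing ||X W - Y||_F
    subject to W being orthogonal, given parallel embedding
    matrices X (source language) and Y (target language)."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt
```

After alignment, a classifier trained on source-language embeddings can be applied to mapped target-language embeddings, which is the usual motivation for this kind of zero-shot cross-lingual transfer.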
no code implementations • Findings of the Association for Computational Linguistics 2020 • Zhe Zhang, Chung-Wei Hang, Munindar P. Singh
Sentiments in opinionated text are often determined by both aspects and target words (or targets).
no code implementations • 26 Oct 2015 • Yang Yu, Wei Zhang, Chung-Wei Hang, Bing Xiang, Bo-Wen Zhou
In this paper, we explore deep learning models with a memory component or an attention mechanism for the question answering task.
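As a minimal sketch of the attention idea (generic dot-product attention over memory slots, not the specific architecture explored in the paper), a query vector is scored against key vectors, the scores are softmax-normalized, and the result is a weighted sum of value vectors.

```python
import numpy as np

def attention(query, keys, values):
    """Scaled dot-product attention over a set of memory slots
    (a generic sketch, not the paper's exact model)."""
    scores = keys @ query / np.sqrt(query.shape[0])
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ values                   # weighted sum of values
```

In a QA setting, the query would encode the question, and the keys/values would encode candidate answer evidence or memory entries.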