(Image credit: SyntaxSQLNet)
Existing state-of-the-art approaches rely on reinforcement learning to reward the decoder when it generates any of the equivalent serializations.
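A minimal sketch of that reward idea, under a toy assumption: two SQL strings count as equivalent serializations when they differ only in the order of their WHERE conditions. The `canonicalize` helper and its string handling are hypothetical, not from any of the papers above.

```python
def canonicalize(sql: str):
    # Toy canonical form (assumption): treat the WHERE clause as an
    # unordered set of conditions, so "a=1 AND b=2" == "b=2 AND a=1".
    head, _, where = sql.partition(" WHERE ")
    conds = frozenset(c.strip() for c in where.split(" AND ")) if where else frozenset()
    return (head.strip(), conds)

def reward(generated: str, gold: str) -> float:
    # The RL signal: 1.0 whenever the decoder emits ANY serialization
    # equivalent to the gold query, 0.0 otherwise.
    return 1.0 if canonicalize(generated) == canonicalize(gold) else 0.0
```

In practice the equivalence check is far richer (e.g. executing both queries against the database), but the reward structure is the same: no single serialization is privileged.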
A significant amount of the world's knowledge is stored in relational databases.
We define a new complex and cross-domain semantic parsing and text-to-SQL task where different complex SQL queries and databases appear in train and test sets.
We show that the current division of data into training and test sets measures robustness to variations in the way questions are asked, but only partially tests how well systems generalize to new queries; therefore, we propose a complementary dataset split for evaluation of future work.
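One way to read the proposed complementary split: group examples by their SQL query shape (values anonymized) and assign whole groups to either train or test, so no query template is shared across the two sets. The sketch below is a hypothetical illustration of that grouping, not the paper's actual procedure; `sql_template` and `query_split` are assumed names.

```python
import re
from collections import defaultdict

def sql_template(sql: str) -> str:
    # Assumption: anonymize literal values so paraphrased questions that
    # map to the same query shape share one template.
    sql = re.sub(r"'[^']*'", "'<VAL>'", sql)
    return re.sub(r"\b\d+\b", "<NUM>", sql)

def query_split(examples, test_fraction=0.2):
    # Group examples by template, then move whole groups into the test
    # set until it reaches the target size; remaining groups go to train.
    groups = defaultdict(list)
    for ex in examples:
        groups[sql_template(ex["sql"])].append(ex)
    train, test = [], []
    target = test_fraction * len(examples)
    for template in sorted(groups):  # deterministic order for the sketch
        bucket = test if len(test) < target else train
        bucket.extend(groups[template])
    return train, test
```

Under such a split, a system can no longer succeed by memorizing query templates seen in training; it must generalize to genuinely new queries.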
In this paper we propose SyntaxSQLNet, a syntax tree network to address the complex and cross-domain text-to-SQL generation task.
We present a neural approach called IRNet for complex and cross-domain Text-to-SQL.
Interacting with relational databases through natural language helps users of any background easily query and analyze a vast amount of data.
We present a simple method to leverage table content in a BERT-based model to solve the text-to-SQL problem.
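A common recipe for feeding table content to a BERT-style encoder is to serialize the question together with the column headers, appending any cell values that literally appear in the question. The function below is a hedged sketch of that input construction only (no model); the name `build_bert_input` and the matching heuristic are assumptions, not the paper's exact method.

```python
def build_bert_input(question: str, columns, rows) -> str:
    # Assumed recipe: one [SEP]-delimited segment per column, containing
    # the header plus any cell values from that column found (as a
    # case-insensitive substring) in the question.
    pieces = ["[CLS]", question]
    q = question.lower()
    for i, col in enumerate(columns):
        matched = sorted({str(r[i]) for r in rows if str(r[i]).lower() in q})
        pieces.append("[SEP]")
        pieces.append(" ".join([col] + matched))
    pieces.append("[SEP]")
    return " ".join(pieces)
```

The matched cell values give the model a direct signal linking question tokens to table content, which is exactly the information a header-only encoding discards.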
As a promising paradigm, interactive semantic parsing has been shown to improve both semantic parsing accuracy and user confidence in the results.