no code implementations • COLING 2022 • Octavian Popescu, Irene Manotas, Ngoc Phuoc An Vo, Hangu Yeo, Elahe Khorashani, Vadim Sheinin
Most attempts at the Text-to-SQL task using an encoder-decoder approach exhibit a dramatic decline in performance on new databases.
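As a rough illustration (not taken from the paper), a standard way an encoder-decoder Text-to-SQL model is made aware of a new database is to serialize the question together with the target schema, so the decoder can copy table and column names it never saw during training. The schema and function name below are hypothetical.

```python
# Minimal sketch: flatten the question and a (possibly unseen) database schema
# into a single encoder input string. The schema here is made up for illustration.
def serialize_input(question: str, schema: dict) -> str:
    """Concatenate the question with 'table : col1, col2' fragments."""
    tables = " | ".join(
        f"{table} : {', '.join(columns)}" for table, columns in schema.items()
    )
    return f"question: {question} schema: {tables}"

schema = {"singer": ["singer_id", "name", "age"], "concert": ["concert_id", "year"]}
print(serialize_input("How many singers are older than 30?", schema))
```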
no code implementations • 10 May 2024 • Ngoc Phuoc An Vo, Brent Paulovicks, Vadim Sheinin
We create a set of 50 prompts to evaluate some popular LLMs for NL2Bash.
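A minimal sketch of how such a prompt-based evaluation could be wired up; the prompt-file format, the `query_llm` placeholder, and exact-match scoring are assumptions for illustration, not the authors' actual harness.

```python
# Hypothetical NL2Bash evaluation loop: each prompt pairs a natural-language
# request with a reference Bash command; query_llm is a placeholder for
# whatever model API is under evaluation.
import json

def query_llm(prompt: str) -> str:
    raise NotImplementedError("plug in the LLM under evaluation here")

def evaluate(prompt_file: str) -> float:
    with open(prompt_file) as f:
        examples = json.load(f)  # e.g. [{"nl": "...", "bash": "..."}, ...]
    hits = 0
    for ex in examples:
        prediction = query_llm(f"Translate to a Bash command:\n{ex['nl']}")
        hits += prediction.strip() == ex["bash"].strip()  # exact-match scoring
    return hits / len(examples)
```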
no code implementations • 9 Dec 2023 • Irene Manotas, Octavian Popescu, Ngoc Phuoc An Vo, Vadim Sheinin
There are many recent advances on the Text-to-SQL task, where the Picard model is one of the top-performing models as measured by the Spider dataset competition.
no code implementations • RANLP 2021 • Ngoc Phuoc An Vo, Irene Manotas, Octavian Popescu, Algimantas Cerniauskas, Vadim Sheinin
Business Process Management (BPM) is the discipline responsible for discovering, analyzing, redesigning, monitoring, and controlling business processes.
no code implementations • COLING 2020 • Ngoc Phuoc An Vo, Irene Manotas, Vadim Sheinin, Octavian Popescu
Motion recognition is one of the basic cognitive capabilities of many life forms; however, detecting and understanding motion in text is not a trivial task.
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Irene Manotas, Ngoc Phuoc An Vo, Vadim Sheinin
Motion recognition is one of the basic cognitive capabilities of many life forms, yet identifying the motion of physical entities in natural language has not been explored extensively and empirically.
1 code implementation • EMNLP 2018 • Kun Xu, Lingfei Wu, Zhiguo Wang, Yansong Feng, Vadim Sheinin
Previous work approaches the SQL-to-text generation task using vanilla Seq2Seq models, which may not fully capture the inherent graph-structured information in SQL queries.
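As a rough sketch of the idea (under assumed conventions, not the paper's exact graph construction), an SQL query can be turned into a directed graph whose nodes are clauses, columns, and constants, and a graph encoder then reads this structure instead of the flat token sequence.

```python
# Hypothetical graph construction for a simple SELECT ... WHERE query:
# one node per clause, column, condition, and constant, with directed edges
# from each clause to its arguments.
def sql_to_graph(query):
    """Return a list of directed edges (head, tail)."""
    edges = [("SELECT", query["select"]), ("SELECT", "WHERE")]
    for column, op, value in query["where"]:
        condition = f"{column} {op} {value}"
        edges.append(("WHERE", condition))
        edges.append((condition, column))
        edges.append((condition, str(value)))
    return edges

query = {"select": "name", "where": [("age", ">", 30)]}
for head, tail in sql_to_graph(query):
    print(f"{head} -> {tail}")
```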
1 code implementation • EMNLP 2018 • Kun Xu, Lingfei Wu, Zhiguo Wang, Mo Yu, Li-Wei Chen, Vadim Sheinin
Existing neural semantic parsers mainly utilize a sequence encoder, i.e., a sequential LSTM, to extract word-order features while neglecting other valuable syntactic information such as dependency graphs or constituency trees.
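A minimal sketch, under assumed data structures, of the kind of input a graph encoder can consume but a sequential LSTM cannot: word-order edges and dependency edges merged into one graph. The hand-written parse below is purely illustrative; in practice it would come from a dependency parser.

```python
# Hypothetical syntactic graph for a sentence: sequential (word-order) edges
# plus labeled dependency edges between token indices.
def build_syntactic_graph(tokens, dependencies):
    """tokens: list of words; dependencies: list of (head_idx, dep_idx, label)."""
    edges = [(i, i + 1, "next") for i in range(len(tokens) - 1)]  # word order
    edges += [(h, d, label) for h, d, label in dependencies]      # syntax
    return edges

tokens = ["show", "flights", "from", "Boston"]
dependencies = [(0, 1, "obj"), (1, 3, "nmod"), (3, 2, "case")]
print(build_syntactic_graph(tokens, dependencies))
```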
4 code implementations • ICLR 2019 • Kun Xu, Lingfei Wu, Zhiguo Wang, Yansong Feng, Michael Witbrock, Vadim Sheinin
Our method first generates the node and graph embeddings using an improved graph-based neural network with a novel aggregation strategy to incorporate edge direction information in the node embeddings.
Ranked #1 on SQL-to-Text on WikiSQL
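A pure-Python sketch of direction-aware neighborhood aggregation in the spirit of the entry above (the mean pooling and concatenation choices here are assumptions, not the paper's exact operator): forward and backward neighbors are pooled separately so the resulting node embedding preserves edge direction.

```python
# Hypothetical direction-aware aggregation: pool neighbors reached by outgoing
# edges and by incoming edges separately, then concatenate with the node's own
# embedding. A learned transform would normally follow this step.
def aggregate_with_direction(node, edges, embeddings):
    """edges: list of (src, dst); embeddings: dict mapping node -> vector."""
    fwd = [embeddings[d] for s, d in edges if s == node]  # node -> neighbor
    bwd = [embeddings[s] for s, d in edges if d == node]  # neighbor -> node
    dim = len(embeddings[node])
    mean = lambda vecs: [sum(x) / len(vecs) for x in zip(*vecs)] if vecs else [0.0] * dim
    return embeddings[node] + mean(fwd) + mean(bwd)

embeddings = {"SELECT": [1.0, 0.0], "name": [0.0, 1.0], "WHERE": [0.5, 0.5]}
edges = [("SELECT", "name"), ("SELECT", "WHERE")]
print(aggregate_with_direction("SELECT", edges, embeddings))
```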