Search Results for author: Vadim Sheinin

Found 11 papers, 4 papers with code

Tackling Execution-Based Evaluation for NL2Bash

no code implementations • 10 May 2024 • Ngoc Phuoc An Vo, Brent Paulovicks, Vadim Sheinin

We create a set of 50 prompts to evaluate some popular LLMs for NL2Bash.

Code Generation
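Execution-based evaluation of the kind described above can be sketched simply: run the generated command and the reference command, and judge correctness by comparing their outputs rather than their surface strings. The names below (`run_bash`, `execution_match`) are illustrative assumptions, not the paper's actual harness, and a real harness would also sandbox execution and compare exit codes and filesystem effects.

```python
import subprocess

def run_bash(cmd, timeout=5.0):
    """Run a command under bash and capture its stdout ('' on timeout)."""
    try:
        proc = subprocess.run(["bash", "-c", cmd],
                              capture_output=True, text=True, timeout=timeout)
        return proc.stdout
    except subprocess.TimeoutExpired:
        return ""

def execution_match(generated, reference):
    """Score a generated command as correct iff its output matches the
    reference command's output -- the execution-based criterion, as opposed
    to token-by-token comparison of the command strings."""
    return run_bash(generated) == run_bash(reference)
```

Note that `echo hello` and `printf 'hello\n'` are different strings but produce identical output, which is exactly the case string-matching metrics get wrong and execution-based evaluation gets right.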

Domain Adaptation of a State of the Art Text-to-SQL Model: Lessons Learned and Challenges Found

no code implementations • 9 Dec 2023 • Irene Manotas, Octavian Popescu, Ngoc Phuoc An Vo, Vadim Sheinin

There have been many recent advances on the Text-to-SQL task, where the Picard model is one of the top-performing models as measured by the Spider dataset competition.

Domain Adaptation Language Modelling +1

Recognizing and Splitting Conditional Sentences for Automation of Business Processes Management

no code implementations • RANLP 2021 • Ngoc Phuoc An Vo, Irene Manotas, Octavian Popescu, Algimantas Cerniauskas, Vadim Sheinin

Business Process Management (BPM) is the discipline responsible for discovering, analyzing, redesigning, monitoring, and controlling business processes.

Management Sentence
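The splitting step this paper targets can be pictured with a naive rule-based baseline: break an "If <condition>, (then) <action>" sentence into its two clauses. The function name and regex below are illustrative assumptions only; the paper addresses this with learned models, not a single pattern.

```python
import re

def split_conditional(sentence):
    """Split an 'If <condition>, (then) <action>' sentence into parts.

    A naive regex sketch of the task: returns a dict with the condition
    and action clauses, or None if the sentence is not conditional.
    """
    m = re.match(r"(?i)^if\s+(.+?),\s*(?:then\s+)?(.+)$", sentence.strip())
    if not m:
        return None
    return {"condition": m.group(1), "action": m.group(2)}
```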

LiMiT: The Literal Motion in Text Dataset

1 code implementation • Findings of the Association for Computational Linguistics 2020 • Irene Manotas, Ngoc Phuoc An Vo, Vadim Sheinin

Motion recognition is one of the basic cognitive capabilities of many life forms, yet identifying the motion of physical entities in natural language has not been explored extensively or empirically.

Diversity

SQL-to-Text Generation with Graph-to-Sequence Model

1 code implementation • EMNLP 2018 • Kun Xu, Lingfei Wu, Zhiguo Wang, Yansong Feng, Vadim Sheinin

Previous work approaches the SQL-to-text generation task using vanilla Seq2Seq models, which may not fully capture the inherent graph-structured information in SQL queries.

Graph-to-Sequence SQL-to-Text +1
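The core idea, treating a SQL query as a graph rather than a flat token sequence, can be sketched as follows. The representation below is a simplified illustration under assumed names (`sql_query_graph`), not the paper's actual encoding, which builds much richer graphs over parsed SQL.

```python
def sql_query_graph(columns, table, conditions=()):
    """Encode a simple SELECT query as a directed graph (edge list).

    Edges run from clause keywords to the tokens they govern, preserving
    structure that a flat token sequence loses: which column belongs to
    SELECT, which predicate belongs to WHERE, and so on.
    """
    edges = [("SELECT", col) for col in columns]
    edges.append(("SELECT", "FROM"))
    edges.append(("FROM", table))
    if conditions:
        edges.append(("FROM", "WHERE"))
        for lhs, op, rhs in conditions:
            edges += [("WHERE", lhs), (lhs, op), (op, rhs)]
    return edges
```

A graph encoder consuming these edges can then attend over clause structure directly, which is the information a vanilla Seq2Seq encoder has to rediscover from token order alone.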

Exploiting Rich Syntactic Information for Semantic Parsing with Graph-to-Sequence Model

1 code implementation • EMNLP 2018 • Kun Xu, Lingfei Wu, Zhiguo Wang, Mo Yu, Li-Wei Chen, Vadim Sheinin

Existing neural semantic parsers mainly utilize a sequence encoder, i.e., a sequential LSTM, to extract word-order features while neglecting other valuable syntactic information such as dependency graphs or constituency trees.

Graph-to-Sequence Semantic Parsing
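One way to picture the input representation this line of work argues for is a single graph that merges sequential word-order edges with syntactic dependency edges, so the encoder sees both signals at once. The sketch below is a simplification under assumed names, not the authors' code.

```python
def syntactic_graph(tokens, dependencies):
    """Merge word-order edges with dependency edges into one edge list.

    Each edge is (source_index, target_index, label). "next" edges carry
    the sequential signal an LSTM would capture; the dependency edges add
    the syntactic structure a pure sequence encoder neglects.
    """
    edges = [(i, i + 1, "next") for i in range(len(tokens) - 1)]
    edges += [(head, dep, label) for head, dep, label in dependencies]
    return edges
```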

Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks

4 code implementations • ICLR 2019 • Kun Xu, Lingfei Wu, Zhiguo Wang, Yansong Feng, Michael Witbrock, Vadim Sheinin

Our method first generates the node and graph embeddings using an improved graph-based neural network with a novel aggregation strategy to incorporate edge direction information in the node embeddings.

Decoder Graph-to-Sequence +2
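The direction-aware aggregation described above can be illustrated roughly: pool a node's forward neighbors and backward neighbors separately, then concatenate both pools with the node's own embedding, so edge direction is not averaged away. This is a toy single step under an assumed name (`aggregate_node`); the actual Graph2Seq model iterates learned aggregators rather than a fixed mean.

```python
import numpy as np

def aggregate_node(h, node, fwd_neighbors, bwd_neighbors):
    """One direction-aware aggregation step (toy sketch).

    h maps node ids to embedding vectors. Forward and backward
    neighborhoods are mean-pooled separately and concatenated with the
    node's own embedding, preserving edge-direction information.
    """
    def pool(neighbors):
        if not neighbors:
            return np.zeros_like(h[node])
        return np.mean([h[n] for n in neighbors], axis=0)

    return np.concatenate([h[node], pool(fwd_neighbors), pool(bwd_neighbors)])
```

Pooling the two directions into separate slots is the design point: a single undirected pool would give the same embedding whether an edge points into or out of the node.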
