Search Results for author: Pengcheng Yin

Found 25 papers, 15 papers with code

Compositional Generalization and Decomposition in Neural Program Synthesis

no code implementations 7 Apr 2022 Kensen Shi, Joey Hong, Manzil Zaheer, Pengcheng Yin, Charles Sutton

We first characterize several axes along which program synthesis methods should generalize, e.g., length generalization, or the ability to combine known subroutines in new ways that do not occur in the training data.

Program Synthesis

Show Me More Details: Discovering Hierarchies of Procedures from Semi-structured Web Data

1 code implementation ACL 2022 Shuyan Zhou, Li Zhang, Yue Yang, Qing Lyu, Pengcheng Yin, Chris Callison-Burch, Graham Neubig

To this end, we develop a simple and efficient method that links steps (e.g., "purchase a camera") in an article to other articles with similar goals (e.g., "how to choose a camera"), recursively constructing the KB.

Video Retrieval
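The recursive step-to-article linking described above can be sketched in a few lines. Everything below (the toy corpus, the lexical `similarity` function, the 0.45 threshold) is an invented stand-in for the paper's learned linker, purely for illustration:

```python
from difflib import SequenceMatcher

# Toy corpus of how-to articles: title -> list of step descriptions.
# Titles and steps are invented for illustration.
ARTICLES = {
    "how to take photos": ["purchase a camera", "frame the shot"],
    "how to choose a camera": ["compare sensor sizes", "read reviews"],
    "how to read reviews": ["find a trusted site"],
}

def similarity(a: str, b: str) -> float:
    """Crude lexical similarity standing in for a learned step-to-article linker."""
    return SequenceMatcher(None, a, b).ratio()

def link_step(step: str, threshold: float = 0.45):
    """Link a step to the most similar article title, if any clears the threshold."""
    best = max(ARTICLES, key=lambda title: similarity(step, title))
    return best if similarity(step, best) >= threshold else None

def build_hierarchy(title: str, seen=None):
    """Recursively expand an article's steps into a hierarchy of procedures."""
    seen = seen or set()
    if title in seen:  # guard against cycles between mutually linked articles
        return {}
    seen = seen | {title}
    tree = {}
    for step in ARTICLES.get(title, []):
        target = link_step(step)
        tree[step] = build_hierarchy(target, seen) if target and target != title else {}
    return tree

hierarchy = build_hierarchy("how to take photos")
```

Here "purchase a camera" links to "how to choose a camera", whose own steps are expanded in turn, yielding a small hierarchy of procedures from flat articles.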

Learning to Superoptimize Real-world Programs

no code implementations 28 Sep 2021 Alex Shypula, Pengcheng Yin, Jeremy Lacomis, Claire Le Goues, Edward Schwartz, Graham Neubig

We also report that SILO's rate of superoptimization on our test set is over five times that of both a standard policy gradient approach and a model pre-trained on compiler optimization demonstrations.

Imitation Learning

Hierarchical Control of Situated Agents through Natural Language

no code implementations 16 Sep 2021 Shuyan Zhou, Pengcheng Yin, Graham Neubig

When humans conceive of how to perform a particular task, they do so hierarchically, splitting higher-level tasks into smaller sub-tasks.

Learning Structural Edits via Incremental Tree Transformations

1 code implementation ICLR 2021 Ziyu Yao, Frank F. Xu, Pengcheng Yin, Huan Sun, Graham Neubig

To show the unique benefits of modeling tree edits directly, we further propose a novel edit encoder for learning to represent edits, as well as an imitation learning method that allows the editor to be more robust.

Imitation Learning

Incorporating External Knowledge through Pre-training for Natural Language to Code Generation

2 code implementations ACL 2020 Frank F. Xu, Zhengbao Jiang, Pengcheng Yin, Bogdan Vasilescu, Graham Neubig

Open-domain code generation aims to generate code in a general-purpose programming language (such as Python) from natural language (NL) intents.

Ranked #3 on Code Generation on CoNaLa (using extra training data)

Code Generation · Data Augmentation

Merging Weak and Active Supervision for Semantic Parsing

1 code implementation 29 Nov 2019 Ansong Ni, Pengcheng Yin, Graham Neubig

Experiments on WikiTableQuestions with human annotators show that our method can improve the performance with only 100 active queries, especially for weakly-supervised parsers learned from a cold start.

Active Learning · Semantic Parsing

Reranking for Neural Semantic Parsing

no code implementations ACL 2019 Pengcheng Yin, Graham Neubig

Semantic parsing considers the task of transducing natural language (NL) utterances into machine executable meaning representations (MRs).

Code Generation · Semantic Parsing
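The NL-to-MR transduction described above can be made concrete with a deliberately tiny, rule-based toy. The lexicon and `parse` function below are invented for illustration only; the parsers in this line of work are neural models learned from data:

```python
# Toy semantic parsing: transduce a natural language (NL) utterance into a
# machine-executable meaning representation (MR) -- here, a Python expression.

LEXICON = {"plus": "+", "minus": "-", "times": "*",
           "one": "1", "two": "2", "three": "3", "four": "4"}

def parse(utterance: str) -> str:
    """Map an NL utterance to an executable arithmetic expression (the MR)."""
    tokens = [LEXICON[w] for w in utterance.lower().split() if w in LEXICON]
    return " ".join(tokens)

mr = parse("what is two plus three")   # -> "2 + 3"
result = eval(mr)                      # executing the MR yields the answer, 5
```

The point of the toy is the pipeline shape, NL in, executable MR out, rather than the (trivial) lexicon lookup doing the transduction here.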

Improving Open Information Extraction via Iterative Rank-Aware Learning

1 code implementation ACL 2019 Zhengbao Jiang, Pengcheng Yin, Graham Neubig

We found that the extraction likelihood, a confidence measure used by current supervised open IE systems, is not well calibrated when comparing the quality of assertions extracted from different sentences.

General Classification · Open Information Extraction

TRANX: A Transition-based Neural Abstract Syntax Parser for Semantic Parsing and Code Generation

4 code implementations EMNLP 2018 Pengcheng Yin, Graham Neubig

We present TRANX, a transition-based neural semantic parser that maps natural language (NL) utterances into formal meaning representations (MRs).

Code Generation · Semantic Parsing
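A transition-based parser of the kind described above emits a derivation: a sequence of actions that grow a tree step by step. The sketch below replays such a derivation; the action names, grammar labels, and action sequence are invented for illustration (TRANX's actions are derived from the target language's ASDL grammar):

```python
# Toy transition system in the spirit of a transition-based tree builder:
# APPLY_RULE opens a new subtree with a grammar production,
# GEN_TOKEN fills a leaf, and REDUCE closes the current subtree.

def build(actions):
    """Replay a derivation, always expanding the current open frontier node."""
    root = {"type": "<root>", "children": []}
    stack = [root]  # open nodes awaiting expansion
    for act, arg in actions:
        node = stack[-1]
        if act == "APPLY_RULE":
            child = {"type": arg, "children": []}
            node["children"].append(child)
            stack.append(child)
        elif act == "GEN_TOKEN":
            node["children"].append(arg)
        elif act == "REDUCE":
            stack.pop()
    return root["children"][0]

# Invented derivation for the expression `sorted(lst)`:
tree = build([
    ("APPLY_RULE", "Call"),
    ("APPLY_RULE", "func"),
    ("GEN_TOKEN", "sorted"),
    ("REDUCE", None),
    ("APPLY_RULE", "args"),
    ("GEN_TOKEN", "lst"),
    ("REDUCE", None),
    ("REDUCE", None),
])
```

Because every action is constrained by which node is open, a decoder over such actions can only produce well-formed trees, which is the structural advantage over generating code as a flat token string.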

Retrieval-Based Neural Code Generation

1 code implementation EMNLP 2018 Shirley Anugrah Hayati, Raphael Olivier, Pravalika Avvaru, Pengcheng Yin, Anthony Tomasic, Graham Neubig

In models that generate program source code from natural language, representing the code as a tree structure has been a common approach.

Code Generation · Sentence Similarity
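The tree view of source code mentioned above can be inspected directly with Python's standard-library `ast` module; this is a generic illustration of code-as-trees, not the paper's model:

```python
import ast

# Source code is equivalent to an abstract syntax tree (AST); tree-based
# code generation models produce this structure node by node instead of
# emitting a flat token sequence.
source = "x = max(1, 2)"
tree = ast.parse(source)

# The tree makes the program's structure explicit: an assignment whose
# value is a call to `max` with two constant arguments.
print(ast.dump(tree.body[0], indent=2))

# The tree round-trips back to source code (Python 3.9+).
regenerated = ast.unparse(tree)
```

Generating the tree rather than the raw string guarantees the output is syntactically well-formed by construction.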

A Tree-based Decoder for Neural Machine Translation

1 code implementation EMNLP 2018 Xinyi Wang, Hieu Pham, Pengcheng Yin, Graham Neubig

Recent advances in Neural Machine Translation (NMT) show that adding syntactic information to NMT systems can improve the quality of their translations.

Machine Translation · Translation

StructVAE: Tree-structured Latent Variable Models for Semi-supervised Semantic Parsing

6 code implementations ACL 2018 Pengcheng Yin, Chunting Zhou, Junxian He, Graham Neubig

Semantic parsing is the task of transducing natural language (NL) utterances into formal meaning representations (MRs), commonly represented as tree structures.

Code Generation · Semantic Parsing

Learning to Mine Aligned Code and Natural Language Pairs from Stack Overflow

no code implementations 23 May 2018 Pengcheng Yin, Bowen Deng, Edgar Chen, Bogdan Vasilescu, Graham Neubig

For tasks like code synthesis from natural language, code retrieval, and code summarization, data-driven models have shown great promise.

Code Summarization · Source Code Summarization

Softmax Q-Distribution Estimation for Structured Prediction: A Theoretical Interpretation for RAML

no code implementations ICLR 2018 Xuezhe Ma, Pengcheng Yin, Jingzhou Liu, Graham Neubig, Eduard Hovy

Reward augmented maximum likelihood (RAML), a simple and effective learning framework to directly optimize towards the reward function in structured prediction tasks, has led to a number of impressive empirical successes.

Dependency Parsing · Image Captioning +4
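As background for the abstract above, the RAML objective (a sketch of the standard formulation introduced by Norouzi et al.; the notation here is assumed, not taken from this listing) trains toward an exponentiated-reward distribution over outputs rather than the single ground truth:

```latex
% RAML: rather than maximizing the log-likelihood of the single ground
% truth y*, sample targets y from a reward-aware distribution q and
% maximize their likelihood under the model p_theta.
\mathcal{L}_{\mathrm{RAML}}(\theta)
  = - \sum_{(x,\, y^{*})} \sum_{y \in \mathcal{Y}}
      q(y \mid y^{*}; \tau) \, \log p_{\theta}(y \mid x),
\qquad
q(y \mid y^{*}; \tau)
  = \frac{\exp\{ r(y, y^{*}) / \tau \}}
         {\sum_{y'} \exp\{ r(y', y^{*}) / \tau \}}
```

Here $r(y, y^{*})$ is the task reward and $\tau$ a temperature; as $\tau \to 0$, $q$ collapses onto the ground truth and RAML reduces to ordinary maximum likelihood, which is the regime the paper's softmax Q-distribution interpretation analyzes.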

A Syntactic Neural Model for General-Purpose Code Generation

6 code implementations ACL 2017 Pengcheng Yin, Graham Neubig

We consider the problem of parsing natural language descriptions into source code written in a general-purpose programming language like Python.

Code Generation · Semantic Parsing +1

DyNet: The Dynamic Neural Network Toolkit

4 code implementations 15 Jan 2017 Graham Neubig, Chris Dyer, Yoav Goldberg, Austin Matthews, Waleed Ammar, Antonios Anastasopoulos, Miguel Ballesteros, David Chiang, Daniel Clothiaux, Trevor Cohn, Kevin Duh, Manaal Faruqui, Cynthia Gan, Dan Garrette, Yangfeng Ji, Lingpeng Kong, Adhiguna Kuncoro, Gaurav Kumar, Chaitanya Malaviya, Paul Michel, Yusuke Oda, Matthew Richardson, Naomi Saphra, Swabha Swayamdipta, Pengcheng Yin

In the static declaration strategy that is used in toolkits like Theano, CNTK, and TensorFlow, the user first defines a computation graph (a symbolic representation of the computation), and then examples are fed into an engine that executes this computation and computes its derivatives.

graph construction
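The dynamic alternative DyNet advocates can be contrasted with the static strategy above using a plain-Python toy: under dynamic declaration, the computation graph is ordinary host-language code rebuilt per example, so its shape can follow the input. Nothing below is DyNet's actual API; the operations are invented stand-ins:

```python
# Miniature dynamic-declaration sketch: the computation "graph" is built
# while executing ordinary code for ONE example, so its structure can
# depend on the input (e.g., a variable-length sentence or a parse tree).

def encode(sentence, embed, combine):
    """Build and run the computation for a single example."""
    state = embed(sentence[0])
    for word in sentence[1:]:      # graph depth tracks sentence length
        state = combine(state, embed(word))
    return state

# Toy "operations": embeddings are word lengths, combination is addition.
embed = len
combine = lambda a, b: a + b

# Each example induces a differently shaped computation.
v1 = encode(["neural", "networks"], embed, combine)                # 2-word graph
v2 = encode(["dynamic", "graphs", "are", "flexible"], embed, combine)  # 4-word graph
```

Under static declaration the graph's shape would have to be fixed (or padded) before any example is seen; here it simply follows the control flow of the host language, which is the trade-off the abstract describes.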

Neural Enquirer: Learning to Query Tables with Natural Language

no code implementations 3 Dec 2015 Pengcheng Yin, Zhengdong Lu, Hang Li, Ben Kao

Neural Enquirer can be trained with gradient descent, whereby not only the parameters of the controlling and semantic parsing components, but also the embeddings of the tables and query words, can be learned from scratch.

Semantic Parsing
