Search Results for author: Kensen Shi

Found 9 papers, 6 papers with code

Compositional Generalization and Decomposition in Neural Program Synthesis

no code implementations 7 Apr 2022 Kensen Shi, Joey Hong, Manzil Zaheer, Pengcheng Yin, Charles Sutton

We first characterize several axes along which program synthesis methods should generalize, e.g., length generalization, or the ability to combine known subroutines in new ways that do not occur in the training data.
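To make the length-generalization axis concrete, here is a minimal sketch; the three-operation string DSL and the split construction are invented for this illustration and do not come from the paper.

```python
# Hypothetical three-operation string DSL illustrating a
# length-generalization train/test split (not code from the paper).
import itertools

OPS = {
    "reverse": lambda s: s[::-1],
    "upper":   lambda s: s.upper(),
    "drop1":   lambda s: s[1:],
}

def run(program, x):
    """Apply a sequence of operation names to the input string."""
    for op in program:
        x = OPS[op](x)
    return x

# Train on all compositions of length 1-2; test on unseen length-3 programs.
train = [p for n in (1, 2) for p in itertools.product(OPS, repeat=n)]
test = list(itertools.product(OPS, repeat=3))

print(len(train), "train programs,", len(test), "test programs")  # 12, 27
print(run(("drop1", "upper", "reverse"), "hello"))  # -> OLLE
```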

Program Synthesis

CrossBeam: Learning to Search in Bottom-Up Program Synthesis

1 code implementation ICLR 2022 Kensen Shi, Hanjun Dai, Kevin Ellis, Charles Sutton

Many approaches to program synthesis perform a search within an enormous space of programs to find one that satisfies a given specification.
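For context, here is a minimal sketch of the plain bottom-up enumerative search this line of work builds on; CrossBeam's actual contribution, a learned model that proposes which intermediate values to combine, is deliberately omitted.

```python
# Plain bottom-up enumerative synthesis over arithmetic expressions.
# CrossBeam replaces the exhaustive combination step below with a learned
# policy; this sketch shows only the baseline search.

def bottom_up_search(inputs, target, max_size=4):
    """Find an expression over +, * whose per-example values equal target.

    inputs: dict of variable name -> tuple of values, one per I/O example.
    target: tuple of desired outputs, one per I/O example.
    "size" counts variable occurrences (leaves) in the expression.
    """
    by_size = {1: {vals: name for name, vals in inputs.items()}}
    seen = set(by_size[1])
    for size in range(2, max_size + 1):
        by_size[size] = {}
        for left in range(1, size):  # leaf counts of the two subexpressions
            for lv, le in by_size.get(left, {}).items():
                for rv, re in by_size.get(size - left, {}).items():
                    for op, fn in (("+", lambda a, b: a + b),
                                   ("*", lambda a, b: a * b)):
                        out = tuple(fn(a, b) for a, b in zip(lv, rv))
                        if out == target:
                            return f"({le} {op} {re})"
                        if out not in seen:  # dedupe by observed values
                            seen.add(out)
                            by_size[size][out] = f"({le} {op} {re})"
    return None

# Two I/O examples: f(2) = 6 and f(3) = 12; finds (x + (x * x)).
print(bottom_up_search({"x": (2, 3)}, (6, 12)))
```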

Program Synthesis · Structured Prediction

TF-Coder: Program Synthesis for Tensor Manipulations

2 code implementations NeurIPS Workshop CAP 2020 Kensen Shi, David Bieber, Rishabh Singh

The success and popularity of deep learning are on the rise, partially due to powerful deep learning frameworks such as TensorFlow and PyTorch that make it easier to develop deep learning models.
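As a hedged illustration of the task format TF-Coder targets: given input and output tensors, the goal is a short TensorFlow expression transforming one into the other. The spec layout below is invented for this sketch and is not TF-Coder's actual API.

```python
# Illustrative tensor-manipulation task of the kind TF-Coder targets.
# The spec layout here is invented for this sketch, not TF-Coder's API.
import tensorflow as tf

inputs = {"rows": tf.constant([10, 20, 30])}
output = tf.constant([[10, 0, 0],
                      [0, 20, 0],
                      [0, 0, 30]])

# One short expression a synthesizer could find for this example:
candidate = tf.linalg.diag(inputs["rows"])
assert bool(tf.reduce_all(tf.equal(candidate, output)))
print(candidate.numpy())
```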

Enumerative Search

Incremental Sampling Without Replacement for Sequence Models

1 code implementation ICML 2020 Kensen Shi, David Bieber, Charles Sutton

Sampling is a fundamental technique, and sampling without replacement is often desirable when duplicate samples are not beneficial.
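Below is a sketch of the naive rejection-based baseline that incremental approaches improve on; the paper's algorithm avoids re-drawing duplicates entirely. The per-step distributions are a toy stand-in, since a real sequence model would condition each step on the previously sampled prefix.

```python
# Rejection-based sampling without replacement: the naive baseline the
# paper's incremental algorithm improves on. The per-step distributions
# below are a toy stand-in; a real sequence model would condition each
# step on the previously sampled prefix.
import random

def sample_sequence(step_probs):
    """Sample one token per step from the given distributions."""
    return tuple(random.choices(range(len(p)), weights=p)[0]
                 for p in step_probs)

def sample_without_replacement(step_probs, k, max_tries=10_000):
    """Collect k distinct sequences by re-drawing on duplicates."""
    seen = set()
    for _ in range(max_tries):
        if len(seen) == k:
            break
        seen.add(sample_sequence(step_probs))
    return seen

probs = [[0.9, 0.1], [0.5, 0.5], [0.7, 0.3]]  # 3 steps, 2 tokens each
print(sample_without_replacement(probs, k=4))
```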

Combinatorial Optimization · Program Synthesis

Learning and Evaluating Contextual Embedding of Source Code

2 code implementations ICML 2020 Aditya Kanade, Petros Maniatis, Gogul Balakrishnan, Kensen Shi

We fine-tune CuBERT on our benchmark tasks and compare the resulting models to different variants of Word2Vec token embeddings, to BiLSTM and Transformer models, and to published state-of-the-art models, showing that CuBERT outperforms them all, even with shorter training and fewer labeled examples.
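A hedged sketch of the general fine-tuning recipe described here: a linear classification head over a pre-trained encoder's first-position output. The encoder below is a small stand-in with invented dimensions, not CuBERT's released weights or configuration.

```python
# Generic fine-tuning setup: a linear classifier over a pre-trained
# encoder's first-position output. The encoder below is a small stand-in
# with invented dimensions, not CuBERT's released weights or sizes.
import torch
import torch.nn as nn

class Classifier(nn.Module):
    def __init__(self, encoder, hidden_size, num_labels):
        super().__init__()
        self.encoder = encoder  # pre-trained; its weights are updated too
        self.head = nn.Linear(hidden_size, num_labels)

    def forward(self, token_ids):
        hidden = self.encoder(token_ids)  # (batch, seq_len, hidden_size)
        return self.head(hidden[:, 0])    # classify from the first token

encoder = nn.Sequential(
    nn.Embedding(num_embeddings=50_000, embedding_dim=128),
    nn.TransformerEncoder(
        nn.TransformerEncoderLayer(d_model=128, nhead=4, batch_first=True),
        num_layers=2),
)
model = Classifier(encoder, hidden_size=128, num_labels=2)
logits = model(torch.randint(0, 50_000, (8, 64)))  # 8 sequences of 64 tokens
print(logits.shape)  # torch.Size([8, 2])
```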

Contextual Embedding for Source Code · Exception type · +5

Pre-trained Contextual Embedding of Source Code

no code implementations 25 Sep 2019 Aditya Kanade, Petros Maniatis, Gogul Balakrishnan, Kensen Shi

A major advancement in natural-language understanding has been the use of pre-trained token embeddings; BERT and other works have further shown that pre-trained contextual embeddings can be extremely powerful and can be fine-tuned effectively for a variety of downstream supervised tasks.

Natural Language Processing · Natural Language Understanding

FrAngel: Component-Based Synthesis with Control Structures

2 code implementations 13 Nov 2018 Kensen Shi, Jacob Steinhardt, Percy Liang

We present FrAngel, a new approach to component-based synthesis that can synthesize short Java functions with control structures when given a desired signature, a set of input-output examples, and a collection of libraries (without formal specifications).
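As a hedged illustration of FrAngel's specification format, here is the signature-plus-examples idea transcribed into Python; FrAngel itself synthesizes Java functions, and the checker below is invented for this sketch.

```python
# FrAngel-style specification transcribed into Python: a desired signature
# plus input-output examples. FrAngel itself synthesizes Java; the checker
# here is invented for this sketch.
examples = [
    (([3, 1, 2],), [1, 2, 3]),  # (args, expected output)
    (([5, 4],), [4, 5]),
]

def satisfies(candidate, examples):
    """Check a candidate function against every input-output example."""
    return all(candidate(*args) == expected for args, expected in examples)

# A candidate assembled from library components (here, the built-in sorted).
assert satisfies(lambda xs: sorted(xs), examples)
print("candidate satisfies all examples")
```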

Programming Languages
