Program Synthesis

136 papers with code • 3 benchmarks • 5 datasets

Program synthesis is the process of automatically generating a program or code snippet that satisfies a given specification or set of requirements. This can include generating code from a formal specification, a natural language description, or example inputs and outputs. The primary goal of program synthesis is to minimize human intervention in the coding process, reduce errors, and improve productivity.

Program synthesis often involves the use of advanced algorithms, artificial intelligence, and machine learning techniques to search the space of possible programs that meet the given constraints. This process can be guided by a variety of techniques, such as constraint solving, symbolic execution, and genetic algorithms.
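
As a concrete illustration of the example-driven setting, the sketch below enumerates compositions of a tiny hand-rolled DSL until one matches a set of input/output pairs. The DSL and helper names are invented for this example and do not correspond to any particular system.

```python
from itertools import product

# A toy DSL of unary integer functions (names are illustrative only).
PRIMITIVES = {
    "inc": lambda x: x + 1,
    "double": lambda x: x * 2,
    "square": lambda x: x * x,
}

def synthesize(examples, max_depth=3):
    """Enumerate compositions of DSL primitives (shortest first) and
    return the first one consistent with all input/output examples."""
    for depth in range(1, max_depth + 1):
        for names in product(PRIMITIVES, repeat=depth):
            def program(x, names=names):
                for name in names:
                    x = PRIMITIVES[name](x)
                return x
            if all(program(i) == o for i, o in examples):
                return " . ".join(reversed(names))  # composition, e.g. "double . inc"
    return None

# Find a program mapping 2 -> 6 and 5 -> 12 (i.e. double(inc(x))).
print(synthesize([(2, 6), (5, 12)]))
```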

Most implemented papers

CodeGen: An Open Large Language Model for Code with Multi-Turn Program Synthesis

salesforce/CodeGen 25 Mar 2022

To democratize this, we train and release a family of large language models up to 16.1B parameters, called CODEGEN, on natural language and programming language data, and open source the training library JAXFORMER.
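
A minimal sketch of sampling code from one of the released checkpoints with Hugging Face Transformers; the checkpoint name and generation settings below are assumptions for illustration, not the paper's evaluation setup.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint id; the larger CodeGen variants follow the same naming pattern.
ckpt = "Salesforce/codegen-350M-mono"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForCausalLM.from_pretrained(ckpt)

# Natural-language prompt describing the desired function.
prompt = "# Python function that returns the n-th Fibonacci number\ndef fib(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```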

Neural Program Synthesis with Priority Queue Training

tensorflow/models 10 Jan 2018

Models and examples built with TensorFlow
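
The core idea, keeping a priority queue of the highest-reward programs found so far and repeatedly training the policy on them, can be sketched in a few lines. The reward function, program alphabet, and random sampling below are placeholders rather than anything from the paper.

```python
import heapq
import random

QUEUE_SIZE = 16

def reward(program):
    # Placeholder reward: prefer programs close to a hidden target string.
    target = "+++."
    return -sum(a != b for a, b in zip(program, target))

queue = []  # min-heap of (reward, program); the worst element is evicted first
for step in range(1000):
    # In PQT this sample would come from the policy network; here it is random.
    program = "".join(random.choice("+-.") for _ in range(4))
    heapq.heappush(queue, (reward(program), program))
    if len(queue) > QUEUE_SIZE:
        heapq.heappop(queue)  # drop the lowest-reward entry
    # Training step (omitted): maximize log-likelihood of the programs in `queue`.

print(sorted(queue, reverse=True)[:3])
```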

Memory Augmented Policy Optimization for Program Synthesis and Semantic Parsing

crazydonkey200/neural-symbolic-machines NeurIPS 2018

We present Memory Augmented Policy Optimization (MAPO), a simple and novel way to leverage a memory buffer of promising trajectories to reduce the variance of policy gradient estimate.
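
A rough sketch of the estimator: trajectories inside the memory buffer contribute an exact, enumerated expectation weighted by the buffer's total probability mass, while the rest of the trajectory space is covered by ordinary on-policy sampling. All probabilities, rewards, and gradients below are made-up placeholders.

```python
import random

# Memory buffer of promising trajectories: (probability under the policy, reward).
buffer = [(0.10, 1.0), (0.05, 1.0)]
pi_B = sum(p for p, _ in buffer)               # total probability mass in the buffer

def grad_log_pi(trajectory):
    return 1.0                                  # placeholder for grad of log pi(trajectory)

# Exact expectation over the enumerable buffer (conditioned on being in the buffer).
inside = sum((p / pi_B) * r * grad_log_pi((p, r)) for p, r in buffer)

# Monte Carlo estimate from trajectories sampled outside the buffer (reward 0 here).
samples = [(random.random(), 0.0) for _ in range(8)]
outside = sum(r * grad_log_pi((p, r)) for p, r in samples) / len(samples)

# MAPO combines the two terms, weighted by the buffer's probability mass.
mapo_gradient = pi_B * inside + (1.0 - pi_B) * outside
print(mapo_gradient)
```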

DeepCoder: Learning to Write Programs

HiroakiMikami/deep-coder 7 Nov 2016

We develop a first line of attack for solving programming competition-style problems from input-output examples using deep learning.
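
The recurring pattern here is a learned model that predicts which DSL primitives are likely to appear in the target program, and an enumerative search that uses those predictions to order or prune its exploration. The sketch below fakes the prediction step with fixed scores; the list-manipulation DSL is illustrative only.

```python
from itertools import product

DSL = {
    "reverse": lambda xs: xs[::-1],
    "sort": lambda xs: sorted(xs),
    "drop_first": lambda xs: xs[1:],
}

def predict_primitive_scores(examples):
    # Stand-in for the learned model: one probability per DSL primitive.
    return {"sort": 0.9, "reverse": 0.4, "drop_first": 0.2}

def guided_search(examples, max_depth=2):
    scores = predict_primitive_scores(examples)
    # Try high-scoring primitives first; this ordering is the "guidance".
    ordered = sorted(DSL, key=lambda name: -scores[name])
    for depth in range(1, max_depth + 1):
        for names in product(ordered, repeat=depth):
            def run(xs, names=names):
                for name in names:
                    xs = DSL[name](xs)
                return xs
            if all(run(i) == o for i, o in examples):
                return names
    return None

print(guided_search([([3, 1, 2], [1, 2, 3])]))  # expected: ('sort',)
```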

RobustFill: Neural Program Learning under Noisy I/O

amitz25/PCCoder ICML 2017

Recently, two competing approaches for automatic program learning have received significant attention: (1) neural program synthesis, where a neural network is conditioned on input/output (I/O) examples and learns to generate a program, and (2) neural program induction, where a neural network generates new outputs directly using a latent program representation.
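
That distinction can be stated as two interfaces: synthesis returns an explicit program that is then run on new inputs, whereas induction maps the I/O examples plus a new input directly to an output. The functions below are trivial stand-ins meant only to make the contrast concrete.

```python
def synthesize(io_examples):
    """Program synthesis: return an explicit, inspectable program."""
    # Stand-in: notice that every example output is the upper-cased input.
    if all(o == i.upper() for i, o in io_examples):
        return lambda s: s.upper()          # the returned artifact is a program
    raise ValueError("no program found")

def induce(io_examples, new_input):
    """Program induction: map examples + new input straight to an output."""
    # Stand-in for a neural network; no explicit program is ever produced.
    return new_input.upper()

examples = [("cat", "CAT"), ("dog", "DOG")]
program = synthesize(examples)
print(program("fish"))                      # run the synthesized program
print(induce(examples, "fish"))             # directly predicted output
```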

DreamCoder: Growing generalizable, interpretable knowledge with wake-sleep Bayesian program learning

CatherineWong/dreamcoder 15 Jun 2020

It builds expertise by creating programming languages for expressing domain concepts, together with neural networks to guide the search for programs within these languages.
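
A compressed skeleton of the wake/sleep loop the abstract alludes to; every phase is reduced to a placeholder so only the control flow is visible, and none of the bodies reflect the actual DreamCoder implementation.

```python
library = ["inc", "double"]          # current DSL of reusable concepts
recognition_model = {}               # maps task features to promising primitives

def wake(tasks):
    # Search for programs solving each task, guided by the recognition model.
    return {t: ["inc", "inc"] for t in tasks}          # placeholder solutions

def sleep_abstraction(solutions):
    # Compress recurring fragments of found programs into new library concepts.
    library.append("add2")                             # placeholder new concept

def sleep_dreaming(solutions):
    # Train the recognition model on replayed solutions and sampled "fantasies".
    recognition_model.update({t: prog[0] for t, prog in solutions.items()})

tasks = ["task_a", "task_b"]
for iteration in range(3):
    solutions = wake(tasks)
    sleep_abstraction(solutions)
    sleep_dreaming(solutions)
print(library)
```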

Programming Puzzles

microsoft/PythonProgrammingPuzzles 10 Jun 2021

The dataset is comprehensive in that it spans problems of a range of difficulties and domains, ranging from trivial string manipulation problems, to classic programming puzzles (e.g., Tower of Hanoi), to interview/competitive-programming problems (e.g., dynamic programming), to longstanding open problems in algorithms and mathematics (e.g., factoring).
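
Puzzles in this style are plain Python functions, and a solution is any argument that makes the function return True, so correctness can be checked mechanically without reference answers. The specific puzzle below is an invented example in that format, not one drawn from the dataset.

```python
def sat(x: str):
    """Puzzle: find a string of length 5 that reads the same reversed."""
    return len(x) == 5 and x == x[::-1]

candidate = "level"        # a proposed solution, e.g. produced by a synthesizer
assert sat(candidate)      # verification is just calling the puzzle
```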

Learning Program Synthesis for Integer Sequences from Scratch

barakeel/oeis-synthesis 24 Feb 2022

We present a self-learning approach for synthesizing programs from integer sequences.
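
Candidate programs for an integer sequence can be screened by generating the first few terms and comparing them against the known prefix. The sketch below does exactly that for a handful of hand-written candidates, which is far simpler than the paper's self-learning search but shows the check it relies on.

```python
# Known prefix of the target sequence (here: the triangular numbers).
prefix = [0, 1, 3, 6, 10, 15]

# Candidate programs, each mapping an index n to a term.
candidates = {
    "n*(n+1)//2": lambda n: n * (n + 1) // 2,
    "n*n": lambda n: n * n,
    "2*n": lambda n: 2 * n,
}

for name, prog in candidates.items():
    if [prog(n) for n in range(len(prefix))] == prefix:
        print("consistent candidate:", name)
```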

InCoder: A Generative Model for Code Infilling and Synthesis

dpfried/incoder 12 Apr 2022

Our model is the first generative model that is able to directly perform zero-shot code infilling, which we evaluate on challenging tasks such as type inference, comment generation, and variable re-naming.
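
The infilling trick behind this family of models is to train a left-to-right model on sequences in which a masked span has been moved to the end and marked with sentinel tokens, so that at inference time the model "continues" the document into the hole. The string rewriting below shows only that transformation; the sentinel spellings are made up and do not match the released tokenizer.

```python
def to_infilling_example(text, span_start, span_end):
    """Rewrite `text` so a left-to-right model can be trained to fill the span."""
    masked = text[:span_start] + "<MASK_0>" + text[span_end:]
    target = "<MASK_0>" + text[span_start:span_end] + "<EOM>"
    # The model is trained on: the masked document, then the span it should infill.
    return masked + target

code = "def add(a, b):\n    return a + b\n"
# Mask the function body so the model must reconstruct it.
print(to_infilling_example(code, code.index("return"), len(code)))
```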

HOUDINI: Lifelong Learning as Program Synthesis

capergroup/houdini NeurIPS 2018

We present a neurosymbolic framework for the lifelong learning of algorithmic tasks that mix perception and procedural reasoning.
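
In this framework, programs are typed functional compositions of neural modules and symbolic combinators such as map and fold, which is what lets perception modules learned on earlier tasks be reused later. The plain-Python sketch below mimics that structure with dummy modules; no real networks or HOUDINI syntax are involved.

```python
from functools import reduce

# Dummy "neural" modules: a classifier reused from an earlier task, plus addition.
def classify_digit(image):
    return image["label"]          # placeholder for a learned perception module

def add(a, b):
    return a + b

# Symbolic combinators compose modules into a typed program: sum of digits in a list.
def program(images):
    return reduce(add, map(classify_digit, images), 0)

images = [{"label": 3}, {"label": 5}, {"label": 1}]
print(program(images))             # 9
```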