2018
DOI: 10.48550/arxiv.1809.02840
Preprint

Neural Guided Constraint Logic Programming for Program Synthesis

Abstract: Synthesizing programs using example input/outputs is a classic problem in artificial intelligence. We present a method for solving Programming By Example (PBE) problems by using a neural model to guide the search of a constraint logic programming system called miniKanren. Crucially, the neural model uses miniKanren's internal representation as input; miniKanren represents a PBE problem as recursive constraints imposed by the provided examples. We explore Recurrent Neural Network and Graph Neural Network models…
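To make the abstract's framing concrete, the following is a minimal sketch in Python using the `kanren` library as a stand-in for the Scheme-based miniKanren the paper actually builds on. The `applies` relation and its "inc"/"double" operations are hypothetical and not from the paper; the sketch only illustrates how provided input/output examples become constraints on an unknown program, and how `conde` introduces the branching that a learned model could prioritize.

```python
from kanren import run, var, eq, conde

def applies(op, inp, out):
    """Hypothetical two-operation DSL: relates an operation name to one I/O pair.

    Each conde clause is one alternative branch (disjunction); the goals inside
    a clause must all hold (conjunction). Note that `inp` must be a concrete
    number here, since Python evaluates inp + 1 and inp * 2 eagerly.
    """
    return conde(
        [eq(op, "inc"),    eq(out, inp + 1)],   # branch 1: out = inp + 1
        [eq(op, "double"), eq(out, inp * 2)],   # branch 2: out = inp * 2
    )

op = var()  # the unknown "program" to be synthesized

# Each provided example imposes a constraint on `op`; run conjoins them and
# searches the conde branches for an assignment consistent with all examples.
results = run(0, op, applies(op, 2, 4), applies(op, 3, 6))
print(results)  # ('double',) -- the only operation matching both examples
```

In the paper's setting the constraints are recursive, since the candidate program is itself a term being constructed, and the neural model scores which disjunct to expand next; this toy query only illustrates the examples-as-constraints idea.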

Cited by 1 publication (2 citation statements) | References 14 publications

Citation statements (ordered by relevance):
“…These conde idioms comprise a large portion of the miniKanren work implied here, and their size could grow very quickly over time. This leads to performance questions that are possibly answered by work on guided search (Swords and Friedman; Zhang et al. 2018) and discerning conde branch selection (Boskin et al. 2018).…”
mentioning
confidence: 99%
“…Finally, we would like to point out the potential for an exciting "feedback loop": as statistical modeling improves the processing of miniKanren (Zhang et al. 2018), miniKanren can also improve the process of statistical modeling.…”
mentioning
confidence: 99%