2020
DOI: 10.48550/arxiv.2006.08381
Preprint

DreamCoder: Growing generalizable, interpretable knowledge with wake-sleep Bayesian program learning

Abstract: Expert problem-solving is driven by powerful languages for thinking about problems and their solutions. Acquiring expertise means learning these languages: systems of concepts, alongside the skills to use them. We present DreamCoder, a system that learns to solve problems by writing programs. It builds expertise by creating programming languages for expressing domain concepts, together with neural networks to guide the search for programs within these languages. A "wake-sleep" learning algorithm alternately exte…
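The wake-sleep alternation the abstract describes can be illustrated with a toy sketch: a wake phase that enumerates programs over the current library, and an abstraction sleep phase that compresses recurring structure into new library routines. All names and the scoring logic here are illustrative, not DreamCoder's actual API, and the neural recognition network and the dream phase are omitted entirely.

```python
from itertools import product
from collections import Counter

# Toy primitives standing in for a DSL; the real library is far richer.
PRIMS = {"inc": lambda x: x + 1, "dbl": lambda x: x * 2}

def run(seq, x):
    """Apply a sequence of primitive names left to right."""
    for name in seq:
        x = PRIMS[name](x)
    return x

def wake(tasks, max_len):
    """Wake phase: enumerate programs up to max_len, keep one solver per task."""
    solved = {}
    for inp, out in tasks:
        for n in range(1, max_len + 1):
            for seq in product(PRIMS, repeat=n):
                if run(seq, inp) == out:
                    solved[(inp, out)] = seq
                    break
            if (inp, out) in solved:
                break
    return solved

def sleep_abstraction(solutions):
    """Sleep phase (abstraction): promote the most common adjacent pair
    of primitives across the found solutions into a new library routine."""
    pairs = Counter(p for seq in solutions.values() for p in zip(seq, seq[1:]))
    if not pairs:
        return None
    (a, b), _ = pairs.most_common(1)[0]
    fa, fb = PRIMS[a], PRIMS[b]
    name = a + ">" + b
    PRIMS[name] = lambda x: fb(fa(x))
    return name

tasks = [(1, 4), (2, 6), (3, 8)]          # each solvable by inc then dbl
solutions = wake(tasks, max_len=2)        # all found at length 2
new_prim = sleep_abstraction(solutions)   # "inc>dbl" joins the library
# After abstraction, the same tasks are solvable by length-1 programs.
```

The point of the abstraction step is that solutions become shorter, so later wake-phase searches reach deeper concepts within the same budget.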

Cited by 35 publications (64 citation statements)
References 17 publications
“…Relying on the built-in method for program search, it allows us to efficiently search promising type-safe differentiable program candidates. Compared to other program synthesis languages [48][49][50][51] , HOUDINI rules out the error-prone functions that undermine the software safety and presents itself as an ideal candidate for our task (See Supplementary Tab. 7 for further comparison).…”
Section: Type-safe Program Synthesis
confidence: 99%
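The type-safety argument in the quote above can be sketched as type-directed enumeration: candidates whose types do not line up are pruned before they can ever fail at run time. The primitive names and types below are illustrative only, not HOUDINI's actual combinator library.

```python
# Each primitive carries an (argument type, return type) signature.
TYPED_PRIMS = {
    "len":   ("str", "int"),   # string -> int
    "upper": ("str", "str"),   # string -> string
    "neg":   ("int", "int"),   # int -> int
}

def typed_chains(in_type, out_type, max_len):
    """Enumerate primitive chains from in_type to out_type, type-safely."""
    frontier = [([], in_type)]
    results = []
    for _ in range(max_len):
        nxt = []
        for seq, t in frontier:
            for name, (arg, ret) in TYPED_PRIMS.items():
                if arg == t:                 # the type check prunes mismatches
                    chain = seq + [name]
                    nxt.append((chain, ret))
                    if ret == out_type:
                        results.append(chain)
        frontier = nxt
    return results

# Only well-typed candidates survive: "neg" alone never applies to a string.
chains = typed_chains("str", "int", max_len=2)
```

An untyped enumerator would also generate chains like `["neg"]` or `["len", "upper"]`, which crash or mistype at evaluation; the type check rules them out up front.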
“…Task comparison: NTPT [48]: Misc.; NS-CL [49]: VQA; Prob-NMN [50]: VQA; DreamCoder [51]: Misc.; HOUDINI [42]: Misc.…”
Section: Data Availability
confidence: 99%
“…Multiple works train a neural network and then analyze it to get information, such as inferring missing objects via back-propagation descent [2], inspecting its gradients to guide a program synthesis approach [51], learning a blue-print with a GNN [12], or leveraging overparametrized-but-regularized auto-encoders [52]. Others, such as DreamCoder [16], take the explicit approach with a neural guided synthesis over the space of formulas. Unlike these works, our method can readily be applied both to symbolic conservation laws and to neural network conservation losses applied to raw videos.…”
Section: Related Work
confidence: 99%
“…It is worth noting that other datasets from Greydanus et al [22] (such as a planetary system) could not be included because the DSL required too much depth to reach the conserved energy quantity. A better search, such as using evolution, or a better DSL (such as those derived from many scientific formulas in DreamCoder [16]) could remedy this. Finally, note that an approach that searched over the same DSL encoding the loss as a generic f(x_0, x_t) loss, not as a conservation |g(x_0) − g(x_t)|, would require more than twice the depth, thus not being able to cover the evaluated datasets.…”
Section: Experimental Details for Scientific Data
confidence: 99%
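The depth argument in the quote above comes down to search-space growth: the conservation template |g(x_0) − g(x_t)| only requires finding g, whereas a generic f(x_0, x_t) must be found at more than twice the depth, and enumeration blows up with depth. A back-of-the-envelope count over a made-up DSL (the branching numbers here are arbitrary; only the growth pattern matters):

```python
def count_exprs(depth, n_leaves=2, n_unary=1, n_binary=2):
    """Count expression trees up to a given depth in a toy DSL with
    n_leaves leaf symbols, n_unary unary ops, and n_binary binary ops."""
    if depth == 0:
        return n_leaves
    sub = count_exprs(depth - 1, n_leaves, n_unary, n_binary)
    return n_leaves + n_unary * sub + n_binary * sub * sub

shallow = count_exprs(2)   # search space for g alone
deep = count_exprs(4)      # search space at roughly double the depth
# deep dwarfs shallow, which is why the conservation template matters.
```

Even in this tiny DSL the doubled-depth space exceeds the square of the shallow one, so a budget that finds g easily can be hopeless for a generic two-argument loss.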
“…This has led to a number of interesting applications, and deep-learning models are now successfully and routinely applied in tools that assist developers in writing and understanding programs and code. For instance, neural language models can synthesize (Gulwani et al., 2017; Ellis et al., 2020), complete (Chen et al., 2021), and summarize programs (Elnaggar et al., 2021), whether they are written in mainstream languages like Python and Java, or in domain-specific languages like SQL and regex.…”
Section: Introduction
confidence: 99%