2022
DOI: 10.1038/s41467-022-32012-w
Synthesizing theories of human language with Bayesian program induction

Abstract: Automated, data-driven construction and evaluation of scientific models and theories is a long-standing challenge in artificial intelligence. We present a framework for algorithmically synthesizing models of a basic part of human language: morpho-phonology, the system that builds word forms from sounds. We integrate Bayesian inference with program synthesis and representations inspired by linguistic theory and cognitive models of learning and discovery. Across 70 datasets from 58 diverse languages, our system …
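To make the abstract's framing concrete, here is a minimal illustrative sketch of Bayesian program induction, not the paper's actual system: a toy hypothesis space of suffix rules for word formation, scored by a description-length prior times a likelihood over observed (stem, form) pairs, with the maximum-a-posteriori rule selected. All names (`hypotheses`, `map_hypothesis`, the noise parameter `eps`) are assumptions for this sketch.

```python
import math

# Toy morphology data: (stem, surface form) pairs.
data = [("walk", "walked"), ("jump", "jumped"), ("play", "played")]

# Hypothetical program space: each "program" appends a fixed suffix.
hypotheses = ["", "d", "ed", "ing", "s"]

def log_prior(suffix):
    # Description-length prior: longer rules are exponentially less probable.
    return -len(suffix) * math.log(26)

def log_likelihood(suffix, pairs):
    # Deterministic rule with a small noise probability for mismatches,
    # so imperfect rules get low but nonzero likelihood.
    eps = 1e-3
    total = 0.0
    for stem, form in pairs:
        predicted = stem + suffix
        total += math.log(1 - eps) if predicted == form else math.log(eps)
    return total

def map_hypothesis(pairs):
    # Maximum-a-posteriori rule: argmax of prior * likelihood (in log space).
    return max(hypotheses, key=lambda h: log_prior(h) + log_likelihood(h, pairs))

best = map_hypothesis(data)
print(best)  # "ed": it fits every pair, and its prior penalty is small
```

The trade-off the score encodes is the one the abstract alludes to: the prior favors simpler (shorter) rules, while the likelihood favors rules that explain the data, so the winning program is the simplest rule consistent with the observations.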

Cited by 17 publications (17 citation statements)
References 33 publications
“…While neurosymbolic representations are always structured, not all structure‐aware representations fall under our definition of a “symbolic program”. As such, our framework does not capture some approaches that represent visual data as sequences of discrete codes [YLM*22], or those that simply combine primitives [TSG*17] or learned parts [PKGF21]. In later sections, we discuss methods for visual reconstruction tasks that use learning methods to produce symbolic representations that, when executed, generate visual outputs that reconstruct the input [TLS*19, ENP*19, ERSLT18, JWR22].…”
Section: Background and Scope
Mentioning confidence: 99%
“… Neurosymbolic models produce visual data via a combination of symbolic programs and machine learning. From left to right: outputs of CAD programs written by a neural network [XWL*22]; inferring a 2D drawing program that reproduces an input hand‐drawn diagram [ERSLT18]; procedural material programs (i.e. node graphs) generated by a neural network [GHS*22].…”
Section: Introduction
Mentioning confidence: 99%
“…Besides ILP, inductive program synthesis has been studied in many areas of ML, including deep learning (Balog et al., 2017; Ellis et al., 2018, 2019). The main advantages of neural approaches are that they can handle noisy BK, as illustrated by ∂ILP, and can harness tremendous computational power (Ellis et al., 2019).…”
Section: Program Synthesis
Mentioning confidence: 99%
“…45. Inductive program synthesis is often called program induction (Lin et al., 2014; Lake et al., 2015; Cropper, 2017; Ellis et al., 2018), programming by example (Lieberman, 2001), and inductive programming (Gulwani et al., 2015), amongst many other names.…”
Section: Program Synthesis
Mentioning confidence: 99%