2016
DOI: 10.48550/arxiv.1611.00020
Preprint

Neural Symbolic Machines: Learning Semantic Parsers on Freebase with Weak Supervision

Cited by 47 publications (44 citation statements)
References 19 publications
“…c) Neural-Symbolic Models: To combine the advantages and circumvent the shortcomings of both symbolic and neural methods, neural-symbolic models, which integrate symbolic logic and neural representation, are widely studied [69,70,71,72,73]. Some work employs a neural module to parse the language into executable programs [74,18] and then deterministically executes the programs in a symbolic module to find the answer. For example, neural-symbolic models for numerical reasoning translate the input texts into expressions through a neural generation model and then discretely execute the expressions [43,46,47].…”
Section: B. Advanced Methods of Complex Reasoning (mentioning)
confidence: 99%
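The parse-then-execute pattern described in the statement above can be sketched in a few lines. This is a toy illustration, not the cited paper's method: the "neural" parser is mocked by a lookup table standing in for a trained generation model, and the program vocabulary (`add`, `mul`) is an assumption for the example.

```python
def neural_parse(question):
    # Stand-in for a neural semantic parser that would map text to an
    # executable program; here a lookup table mocks the trained model.
    table = {
        "What is 3 plus 4 times 2?": ["add", 3, ["mul", 4, 2]],
    }
    return table[question]

def symbolic_execute(program):
    # Deterministic, discrete execution of the parsed program in the
    # symbolic module: recursively evaluate nested [op, arg, ...] lists.
    if not isinstance(program, list):
        return program
    op, *args = program
    values = [symbolic_execute(a) for a in args]
    ops = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}
    return ops[op](*values)

answer = symbolic_execute(neural_parse("What is 3 plus 4 times 2?"))
print(answer)  # 11
```

The split mirrors the division of labor in the quoted passage: the neural component only emits a program, and all actual computation happens discretely in the executor.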
“…We design a set of trigger words to match potential functions [74] and extract arguments (i.e., participants, positions, and numbers) according to their relative positions to the trigger words. For uncertain sentences with no matched function, we also build a neural classification model based on RoBERTa to predict their corresponding function types.…”
Section: B. Symbolic Model: ARM (mentioning)
confidence: 99%
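The trigger-word matching described above can be sketched as follows. All trigger words, function names, and the example sentence are illustrative assumptions, not the cited paper's actual inventory; the fallback to a RoBERTa classifier for unmatched sentences is represented only by returning `None`.

```python
import re

# Hypothetical trigger-word -> function-type table (assumed for illustration).
TRIGGERS = {
    "ahead of": "before",
    "behind": "after",
    "total of": "sum",
}

def match_function(sentence):
    # Scan for the first trigger word; on a match, extract numeric
    # arguments from the sentence and return (function_type, numbers).
    for trigger, func in TRIGGERS.items():
        if trigger in sentence:
            numbers = [int(n) for n in re.findall(r"\d+", sentence)]
            return func, numbers
    # No trigger matched: the quoted work would hand such uncertain
    # sentences to a neural (RoBERTa-based) classifier instead.
    return None

print(match_function("Tom finished a total of 3 laps and 5 sprints."))
# ('sum', [3, 5])
```

A real implementation would extract arguments by their positions relative to the trigger word, as the passage notes; this sketch simplifies that to collecting all numbers in the sentence.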
“…But how to combine it with embedding and deep learning techniques remains an open concern. Some studies [9,60,61] have discussed this topic.…”
Section: Complementary of Knowledge (mentioning)
confidence: 99%
“…Symbolic reasoning [7] and deep-learning-based inference [8] were not compatible in earlier years. However, with the development of embedded representation methods and deep learning, combining them has become increasingly feasible [9]. In other words, this has established a bridge for the fusion of the two technologies, making it possible to comprehensively exploit the advantages of both sides.…”
Section: Introduction (mentioning)
confidence: 99%