2020
DOI: 10.48550/arxiv.2004.13781
Preprint
Graph-to-Tree Neural Networks for Learning Structured Input-Output Translation with Applications to Semantic Parsing and Math Word Problem

Cited by 11 publications (10 citation statements)
References 38 publications
“…(1) Constituency Graph: This is one of the more popular graph-construction methods, capturing phrase-based syntactic relationships in sentences. It follows the relationships of phrase-structure syntax rather than those of dependency syntax [45].…”
Section: Evaluation Methods and Metrics (mentioning, confidence: 99%)
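The construction described above can be sketched in a few lines: a constituency parse is a bracketed tree whose internal nodes are phrase labels and whose leaves are words, and the graph is simply its parent-child edges. The following is a minimal, self-contained illustration under those assumptions; the function names and node-numbering scheme are hypothetical, not taken from the cited paper.

```python
# Illustrative sketch: turn a bracketed constituency parse string (as a
# constituency parser would emit) into a graph of (nodes, edges), where
# nodes are phrase labels or words and edges are parent-child links.

def parse_constituency(s):
    """Parse a bracketed parse string into nested (label, children) tuples.

    Leaves (words) are plain strings.
    """
    tokens = s.replace("(", " ( ").replace(")", " ) ").split()
    pos = 0

    def parse():
        nonlocal pos
        assert tokens[pos] == "(", "expected '('"
        pos += 1
        label = tokens[pos]
        pos += 1
        children = []
        while tokens[pos] != ")":
            if tokens[pos] == "(":
                children.append(parse())
            else:
                children.append(tokens[pos])  # leaf word
                pos += 1
        pos += 1  # consume ")"
        return (label, children)

    return parse()


def tree_to_graph(tree):
    """Flatten a parse tree into (nodes, edges) suitable for a graph encoder."""
    nodes, edges = [], []

    def visit(t):
        if isinstance(t, str):  # leaf word
            nodes.append(t)
            return len(nodes) - 1
        label, children = t
        nodes.append(label)
        my_id = len(nodes) - 1
        for child in children:
            edges.append((my_id, visit(child)))
        return my_id

    visit(tree)
    return nodes, edges


tree = parse_constituency("(S (NP (DT the) (NN cat)) (VP (VBZ sleeps)))")
nodes, edges = tree_to_graph(tree)
# nodes: ['S', 'NP', 'DT', 'the', 'NN', 'cat', 'VP', 'VBZ', 'sleeps']
# edges include (0, 1) for S->NP and (0, 6) for S->VP
```

A dependency graph, by contrast, would connect words directly to their head words with no phrase-level nodes, which is the distinction the statement draws.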
“…stack) to align the source and target based on the Seq2Seq model. [Li et al., 2019] brought in an attention mechanism and adopted a two-stage approach to generate the sequential expressions.…”
Section: Related Work (mentioning, confidence: 99%)
“…[Zhang et al., 2020a] utilized knowledge distillation and multiple decoders to generate diverse expressions. [Wu et al., 2020] introduced commonsense knowledge into a Seq2Tree-based generator, and [Wu et al., 2021] explicitly encoded the numeric values of numbers into the Seq2Tree model.…”
Section: Related Work (mentioning, confidence: 99%)