2020
DOI: 10.48550/arxiv.2002.05707
Preprint
A Framework for End-to-End Learning on Semantic Tree-Structured Data

William Woof,
Ke Chen

Abstract: While learning models are typically studied for inputs in the form of a fixed-dimensional feature vector, real-world data are rarely found in this form. To meet the basic requirement of traditional learning models, structured data generally have to be converted into fixed-length vectors in a handcrafted manner, which is tedious and may even incur information loss. A common form of structured data is what we term "semantic tree-structures", corresponding to data where rich semantic information is encoded …
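The handcrafted conversion the abstract criticizes can be illustrated with a toy sketch (the schema, field names, and encoding choices below are hypothetical, chosen only for illustration): a nested JSON-like record is flattened by hand into a fixed-length vector, and any variable-length part of the tree has to be aggregated away.

```python
# A toy illustration (hypothetical schema): flattening a nested JSON-like
# record into a fixed-length feature vector by hand. Variable-length parts
# must be summarized or truncated, which is where information can be lost.
record = {
    "user": {"age": 34, "premium": True},
    "purchases": [12.5, 3.0, 7.25, 40.0],   # variable-length subtree
}

def handcrafted_features(rec):
    purchases = rec["purchases"]
    return [
        float(rec["user"]["age"]),
        1.0 if rec["user"]["premium"] else 0.0,
        # The list is collapsed to two summary statistics; its length and
        # the order of its items are discarded.
        sum(purchases) / len(purchases) if purchases else 0.0,
        max(purchases, default=0.0),
    ]

vec = handcrafted_features(record)   # always length 4, whatever shape the tree had
```

The framework proposed in the paper aims to learn directly on such tree-structured inputs end-to-end, avoiding this lossy manual step.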

Cited by 3 publications (3 citation statements)
References 19 publications
“…During the last stages of our research, we came to know about the work of Woof and Chen who provide a framework for end-to-end learning on tree-structured data [19]. Their work possesses some parallels with ours, however differs on critical aspects.…”
Section: Related Work and Own Contribution
confidence: 99%
“…Tai et al (2015) introduced Tree-LSTM for computing tree embeddings bottom-up. It has then been applied for many tasks, including computer program translation, semantic tree structure learning (such as JSON or XML) (Woof and Chen, 2020) and supervised KG-QA tasks (Tong et al, 2019; Zafar et al, 2019; Athreya et al, 2021). In the latter context, Tree-LSTM is used to model the syntactic structure of the question.…”
Section: Related Work
confidence: 99%
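The bottom-up computation the citing work refers to can be sketched in miniature. The following is a toy Child-Sum Tree-LSTM (Tai et al., 2015) with scalar states and hand-picked weights; the weight values and the tuple encoding of trees are assumptions made purely for illustration, not the paper's implementation.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy scalar weights (hypothetical values, chosen only for illustration).
W = dict(i=0.5, f=0.5, o=0.5, u=0.5)   # input-to-gate weights
U = dict(i=0.3, f=0.3, o=0.3, u=0.3)   # hidden-to-gate weights
b = dict(i=0.0, f=1.0, o=0.0, u=0.0)   # biases (forget bias of 1 is common)

def tree_lstm(node):
    """Bottom-up Child-Sum Tree-LSTM over a nested tuple (x, children).

    Each child subtree is processed first; the parent's gates are then
    computed from its own input x and the sum of the children's hiddens.
    Returns (h, c): the hidden and cell state for this node.
    """
    x, children = node
    states = [tree_lstm(child) for child in children]   # recurse bottom-up
    h_sum = sum(h for h, _ in states)                   # sum of child hiddens

    i = sigmoid(W['i'] * x + U['i'] * h_sum + b['i'])    # input gate
    o = sigmoid(W['o'] * x + U['o'] * h_sum + b['o'])    # output gate
    u = math.tanh(W['u'] * x + U['u'] * h_sum + b['u'])  # candidate update
    # One forget gate per child, conditioned on that child's own hidden state.
    f = [sigmoid(W['f'] * x + U['f'] * h_k + b['f']) for h_k, _ in states]

    c = i * u + sum(f_k * c_k for f_k, (_, c_k) in zip(f, states))
    h = o * math.tanh(c)
    return h, c

# A small tree: root with input 1.0 and two leaf children.
tree = (1.0, [(0.5, []), (-0.5, [])])
h_root, c_root = tree_lstm(tree)
```

The root embedding `h_root` summarizes the whole tree; in the KG-QA setting mentioned above, the same recursion is run over a question's parse tree instead of a JSON/XML tree.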