Textual Analogy Parsing: What's Shared and What's Compared among Analogous Facts
Preprint, 2018. DOI: 10.48550/arxiv.1809.02700

Cited by 2 publications (2 citation statements). References 26 publications.
“…For sequence tagging techniques, rule-based methods [52] are straightforward, but costly and limited. There are also many machine learning methods proposed which are more robust and easier to generalize, including Hidden Markov Models (HMM) [9], Support Vector Machines (SVM) [4] and Conditional Random Fields (CRF) [34].…”
Section: Natural Language Interactions
Confidence: 99%
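The statement above contrasts rule-based sequence tagging with statistical models such as HMMs, SVMs, and CRFs. As a minimal sketch of the HMM approach, the snippet below implements Viterbi decoding over a toy two-tag model (the tags NUM/WORD and all probabilities are hypothetical, chosen only for illustration, and are not from the cited works):

```python
# Minimal Viterbi decoder for HMM-based sequence tagging.
# All probabilities below are made-up toy values for a two-tag model.
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable state sequence for the observations."""
    # V[t][s] = (best probability of reaching state s at step t, previous state)
    V = [{s: (start_p[s] * emit_p[s].get(obs[0], 1e-6), None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s].get(obs[t], 1e-6), p)
                for p in states
            )
            V[t][s] = (prob, prev)
    # Backtrack from the best final state.
    best = max(states, key=lambda s: V[-1][s][0])
    path = [best]
    for t in range(len(obs) - 1, 0, -1):
        best = V[t][best][1]
        path.append(best)
    return list(reversed(path))

states = ("NUM", "WORD")
start_p = {"NUM": 0.3, "WORD": 0.7}
trans_p = {"NUM": {"NUM": 0.2, "WORD": 0.8},
           "WORD": {"NUM": 0.4, "WORD": 0.6}}
emit_p = {"NUM": {"42": 0.9, "revenue": 0.01},
          "WORD": {"42": 0.01, "WORD": 0.0, "revenue": 0.5}}

tags = viterbi(["revenue", "42"], states, start_p, trans_p, emit_p)
print(tags)  # ['WORD', 'NUM']
```

A CRF replaces the HMM's generative transition/emission tables with feature-weighted potentials trained discriminatively, which is what makes it more robust and easier to generalize, as the citing authors note.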
“…Last, NLP could automatize the extraction of labels by parsing the relation between the number and the label. Such a system could draw from approaches to parse textual analogies [36,77]. However, these remain challenging even with modern techniques (see the difficulties faced by approaches to automatically connect text and chart [76,113]).…”
Section: Future Work
Confidence: 99%