2021
DOI: 10.1007/978-3-030-81097-9_11

Searching for Mathematical Formulas Based on Graph Representation Learning

Cited by 4 publications (2 citation statements)
References 22 publications
“…More recently, data-driven approaches that incorporate classic word embeddings (Gao et al., 2017; Mansouri et al., 2019), GNNs (Song and Chen, 2021), or transformer models (Reusch et al., 2021a,b) have also been proposed for the MIR domain. By observing token co-occurrence during training, these models can readily discover synonyms, mathematically equivalent transformed formulas, or high-level semantic similarities, making them a good complement to structure-search approaches.…”
Section: Data-driven Methods
confidence: 99%
“…For math word problems, RNNs [3] and Transformers [18] have been utilized to encode the mathematical text and generate the corresponding math equation. For mathematical information retrieval, graph neural networks [30] have been adopted to learn meaningful representations of structured formulas. Recently, the success of PLMs [22,24] has pushed forward the understanding and modeling of mathematical texts, owing to their excellent language-modeling capacity.…”
Section: Related Work
confidence: 99%