Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/2021.emnlp-main.348

Mathematical Word Problem Generation from Commonsense Knowledge Graph and Equations

Abstract: There is an increasing interest in the use of mathematical word problem (MWP) generation in educational assessment. Different from standard natural question generation, MWP generation needs to maintain the underlying mathematical operations between quantities and variables, while at the same time ensuring the relevance between the output and the given topic. To address the above problem, we develop an end-to-end neural model to generate diverse MWPs in real-world scenarios from commonsense knowledge graph and equa…
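The abstract describes conditioning generation on both an equation and topic entities drawn from a commonsense knowledge graph. Below is a minimal, hypothetical sketch of that general idea, not the authors' architecture: a generic PyTorch encoder-decoder in which a GRU encodes the equation tokens, topic-entity embeddings are mean-pooled, and the fused state initialises a text decoder. All class names, dimensions, and the fusion scheme are assumptions for illustration only.

# Illustrative sketch only (assumed names and hyperparameters), not the paper's model.
import torch
import torch.nn as nn

class MWPGenerator(nn.Module):
    def __init__(self, eq_vocab, topic_vocab, text_vocab, d=128):
        super().__init__()
        self.eq_emb = nn.Embedding(eq_vocab, d)
        self.topic_emb = nn.Embedding(topic_vocab, d)
        self.text_emb = nn.Embedding(text_vocab, d)
        self.eq_enc = nn.GRU(d, d, batch_first=True)   # encodes the equation tokens
        self.dec = nn.GRU(d, d, batch_first=True)       # decodes the word-problem text
        self.out = nn.Linear(d, text_vocab)

    def forward(self, eq_ids, topic_ids, text_ids):
        # Encode the equation; fuse its final state with the pooled topic embedding
        # and use the result as the decoder's initial hidden state.
        _, h_eq = self.eq_enc(self.eq_emb(eq_ids))                      # (1, B, d)
        topic_ctx = self.topic_emb(topic_ids).mean(dim=1, keepdim=True) # (B, 1, d)
        h0 = (h_eq + topic_ctx.transpose(0, 1)).contiguous()            # (1, B, d)
        dec_out, _ = self.dec(self.text_emb(text_ids), h0)
        return self.out(dec_out)                                        # logits over MWP tokens

# Toy usage: batch of 2, equation length 5, 3 topic entities, target length 7.
model = MWPGenerator(eq_vocab=50, topic_vocab=200, text_vocab=1000)
eq = torch.randint(0, 50, (2, 5))
topics = torch.randint(0, 200, (2, 3))
text = torch.randint(0, 1000, (2, 7))
print(model(eq, topics, text).shape)  # torch.Size([2, 7, 1000])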

Citations: cited by 17 publications (3 citation statements)
References: 37 publications (27 reference statements)
“…Following the same shift as in AQG, Zhou and Huang (2019) proposed an approach using Recurrent Neural Networks (RNNs) that encodes math expressions and topic words to automatically generate such problems. Subsequent research along this direction has focused on improving topic relevance, expression relevance, language coherence, as well as completeness and validity of the generated problems using a spectrum of approaches (Liu et al., 2021; Wang et al., 2021; Wu et al., 2022).…”
Section: Content Generation (mentioning)
confidence: 99%
“…Figure 1 gives an illustrative example of the KT task. Such predictive capabilities can potentially help students learn better and faster when paired with high-quality learning materials and instructions, and the KT models have been widely used to support intelligent tutoring systems and MOOC platforms (Käser et al. 2017; Cen, Koedinger, and Junker 2006; Lavoué et al. 2018; Liu et al. 2021a). … (Shen et al. 2021, 2020; Yang et al. 2020; Zhang et al. 2017, 2021; Wang et al. 2019; Liu et al. 2023a,b).…”
Section: Introduction (mentioning)
confidence: 99%