2022
DOI: 10.1145/3490501

Q-Learning for Shift-Reduce Parsing in Indonesian Tree-LSTM-Based Text Generation

Abstract: The Tree-LSTM algorithm accommodates tree-structured processing to extract information beyond linear sequence patterns. Using Tree-LSTM for text generation requires the help of an external parser at each generation iteration. Developing a good parser demands the representation of complex features and relies heavily on the grammar of the corpus. A limited corpus yields too small a vocabulary for a grammar-based parser, making it less natural to link to the text generation process. This …
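The pairing of a learned shift-reduce parser with Tree-LSTM generation that the title and abstract describe can be illustrated with a small sketch. The following is a minimal, hypothetical tabular Q-learning loop over SHIFT/REDUCE actions; the state encoding, reward function, and hyperparameters are assumptions for illustration, not the paper's actual formulation.

```python
# Minimal tabular Q-learning over shift-reduce actions (illustrative only).
import random
from collections import defaultdict

ACTIONS = ("SHIFT", "REDUCE")
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2      # assumed hyperparameters

Q = defaultdict(float)                     # Q[(state, action)] -> estimated value

def encode_state(stack, buffer):
    """Toy state signature: the two topmost stack items plus remaining buffer size."""
    return (tuple(str(s) for s in stack[-2:]), len(buffer))

def valid_actions(stack, buffer):
    """SHIFT needs a non-empty buffer; REDUCE needs at least two stack items."""
    acts = []
    if buffer:
        acts.append("SHIFT")
    if len(stack) >= 2:
        acts.append("REDUCE")
    return acts

def choose_action(state, valid):
    """Epsilon-greedy selection restricted to currently valid actions."""
    if random.random() < EPSILON:
        return random.choice(valid)
    return max(valid, key=lambda a: Q[(state, a)])

def parse_episode(tokens, reward_fn):
    """Run one shift-reduce episode, updating Q with one-step Q-learning."""
    stack, buffer = [], list(tokens)
    while buffer or len(stack) > 1:
        state = encode_state(stack, buffer)
        action = choose_action(state, valid_actions(stack, buffer))
        if action == "SHIFT":
            stack.append(buffer.pop(0))
        else:  # REDUCE: combine the two topmost items into a binary subtree
            right, left = stack.pop(), stack.pop()
            stack.append((left, right))
        next_state = encode_state(stack, buffer)
        reward = reward_fn(stack, buffer)  # hypothetical reward signal
        best_next = max(Q[(next_state, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
    return stack[0] if stack else None     # the assembled binary parse tree
```

For example, `parse_episode("saya makan nasi goreng".split(), lambda stack, buf: 0.0)` assembles a binary tree under a flat (zero) reward; in the setting the abstract describes, the reward would instead reflect how well the resulting tree serves the Tree-LSTM generator.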

Cited by 4 publications (5 citation statements) · References 21 publications
“…R. P. Hastuti and colleagues proposed a method to address the limited-corpus problem in their research [3], using a reinforcement learning algorithm to form the selection tree. The algorithm associates the given seed phrase with the input of the Tree-LSTM model during sentence generation, effectively addressing the challenges of a limited corpus.…”
Section: Related Research
confidence: 99%
“…In recent years, Tree-LSTM has been widely applied to many fields of NLP, such as text generation, neural machine translation, sentiment analysis, sentence semantic modeling, and event extraction tasks [6][7][8][9]. Compared with the sequence structure, the tree-structured neural network is a better alternative for extracting text information [10][11][12].…”
Section: Related Work
confidence: 99%
“…In recent years, Tree-LSTM has been widely applied to many fields of NLP, such as text generation, neural machine translation, sentiment analysis, sentence semantic modelling, and event extraction tasks [12–15]. Compared with a sequence structure, a tree-structured neural network is a better alternative for extracting text information [16, 17]. In a tree-structured neural network, words contribute unevenly to building a syntactic dependency tree, so transferring all hidden states of the child nodes to the parent node introduces noise and affects model training.…”
Section: Related Work
confidence: 99%
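As a concrete illustration of the child-to-parent aggregation these citing works discuss, here is a minimal NumPy sketch of a single Child-Sum Tree-LSTM node update in the style of Tai et al.; the parameter names, dimensions, and random initialisation are illustrative assumptions, and the attention or gating refinements the citing papers propose for down-weighting noisy children are omitted.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def child_sum_treelstm_node(x, child_h, child_c, W, U, b):
    """One Child-Sum Tree-LSTM update for a parent node.

    x       : (d_in,)  input vector of the parent word
    child_h : (k, d)   hidden states of the k children
    child_c : (k, d)   cell states of the k children
    W, U, b : dicts of parameters for gates 'i', 'f', 'o', 'u' (assumed shapes)
    """
    h_sum = child_h.sum(axis=0)                                # aggregate children
    i = sigmoid(W['i'] @ x + U['i'] @ h_sum + b['i'])          # input gate
    o = sigmoid(W['o'] @ x + U['o'] @ h_sum + b['o'])          # output gate
    u = np.tanh(W['u'] @ x + U['u'] @ h_sum + b['u'])          # candidate update
    # One forget gate per child: each child's cell state is weighted separately,
    # which is where uneven child contributions (and potential noise) enter.
    f = sigmoid(W['f'] @ x + (U['f'] @ child_h.T).T + b['f'])  # (k, d)
    c = i * u + (f * child_c).sum(axis=0)
    h = o * np.tanh(c)
    return h, c

# Toy usage with random parameters (hypothetical dimensions).
d_in, d, k = 8, 16, 3
rng = np.random.default_rng(0)
W = {g: rng.standard_normal((d, d_in)) * 0.1 for g in "ifou"}
U = {g: rng.standard_normal((d, d)) * 0.1 for g in "ifou"}
b = {g: np.zeros(d) for g in "ifou"}
h, c = child_sum_treelstm_node(rng.standard_normal(d_in),
                               rng.standard_normal((k, d)),
                               rng.standard_normal((k, d)), W, U, b)
```

The per-child forget gates are the single place where each child's cell state is weighted on its own, so that is the natural hook for the selective transfer of child information that the citing works motivate.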