2022
DOI: 10.1080/09540091.2022.2134843

Research on the Uyghur morphological segmentation model with an attention mechanism

Cited by 3 publications (3 citation statements) · References 22 publications
“…We introduce an attention mechanism for learning the relative importance of different words, so that crucial sentiment words have a greater influence on the overall sentiment expression. The attention mechanism improves the model's performance by allowing it to focus on more critical information (Abudouwaili et al., 2022). In this experiment, weights for emphasizing keywords are assigned using a feed-forward neural attention strategy.…”
Section: Methods (mentioning)
confidence: 99%
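The citing paper does not spell out its feed-forward attention strategy here; as an illustration only (not the authors' implementation, with all names, shapes, and parameters assumed), additive feed-forward attention over word representations can be sketched as:

```python
import numpy as np

def feedforward_attention(H, W, v):
    """Additive (feed-forward) attention over word representations.

    H: (seq_len, d) matrix of word vectors.
    W: (d, d_att) projection weights; v: (d_att,) scoring vector.
    Returns per-word attention weights and the weighted sentence vector.
    """
    scores = np.tanh(H @ W) @ v              # one unnormalised score per word
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights = weights / weights.sum()
    context = weights @ H                    # weighted sum of word vectors
    return weights, context

# Toy usage with random vectors (shapes only, no real data).
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))    # 5 words, 8-dimensional representations
W = rng.normal(size=(8, 16))
v = rng.normal(size=16)
weights, sentence_vec = feedforward_attention(H, W, v)
print(weights.round(3), sentence_vec.shape)
```

The softmax weights play the role of the per-word importance described in the quoted statement; the weighted sum would then serve as the sentence representation for the downstream sentiment prediction.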
“…In summary, although supervised models for Uyghur and Kazakh lexical analysis have made some research progress [6, 40-42], these models are evaluated with character-level metrics. Their performance and differences have not been explored using morpheme-level evaluation metrics.…”
Section: Related Work (mentioning)
confidence: 99%
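The statement above contrasts character-level with morpheme-level evaluation but does not define the latter; one common reading, sketched below as an assumption rather than the cited works' exact metric, is precision/recall/F1 over predicted versus gold morphemes for a word:

```python
from collections import Counter

def morpheme_f1(pred, gold):
    """Morpheme-level precision/recall/F1 for one word.

    pred, gold: lists of morpheme strings (segmenter output vs. reference).
    Multiset overlap is used so repeated morphemes are only credited as
    often as they actually occur.
    """
    overlap = sum((Counter(pred) & Counter(gold)).values())
    precision = overlap / len(pred) if pred else 0.0
    recall = overlap / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical segmentations (placeholder strings, not real Uyghur/Kazakh).
print(morpheme_f1(["stem", "suf1", "suf2"], ["stem", "suf1"]))
```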
“…In agglutinative languages, the first morpheme is the stem, which carries no prefixes. Furthermore, when morphemes in Uyghur and Kazakh are concatenated, phonological harmony can alter the characters at the junction, producing phenomena such as deletion, addition, and weakening [6]. Figure 1 shows three different phonological changes.…”
Section: Introduction (mentioning)
confidence: 99%
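To make the junction phenomena concrete, the toy check below uses invented placeholder strings (not real Uyghur or Kazakh) to show that a surface word need not equal the plain concatenation of its canonical morphemes, which is why segmentation has to model these changes:

```python
def naive_join(morphemes):
    """Concatenate canonical morpheme forms with no phonological adjustment."""
    return "".join(morphemes)

# Invented, schematic example: assume the final character of the stem is
# deleted when the suffix attaches (deletion at the junction).
canonical = ["qara", "ip"]   # hypothetical canonical stem + suffix
surface = "qarip"            # hypothetical observed surface form

print(naive_join(canonical))             # "qaraip" - plain concatenation
print(surface)                           # "qarip"  - deletion at the junction
print(naive_join(canonical) == surface)  # False: the segmenter must undo the change
```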