2022 | DOI: 10.1155/2022/1775496
TAFM: A Recommendation Algorithm Based on Text-Attention Factorization Mechanism

Abstract: The click-through rate (CTR) prediction task is used to estimate the probability that a user will click on a recommended item, which is extremely important in recommender systems. Recently, the deep factorization machine (DeepFM) algorithm was proposed. DeepFM incorporates a factorization machine (FM) to learn not only low-order features but also higher-order feature interactions. However, DeepFM lacks user diversity representations and does not consider text. In view of this, we propose…
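
The abstract's summary of DeepFM rests on the FM scoring function that the deep component is paired with. As a minimal sketch (not the paper's code; the feature sizes, random inputs, and sigmoid CTR output are all illustrative assumptions), the second-order interaction term can be computed in O(nk) using the standard FM identity:

```python
# Minimal sketch of a factorization machine's second-order term, as used
# inside DeepFM-style CTR models. Sizes and inputs here are illustrative
# assumptions, not taken from the TAFM paper.
import numpy as np

def fm_second_order(x, V):
    """Pairwise interactions sum_{i<j} <v_i, v_j> x_i x_j via the
    O(n*k) identity: 0.5 * sum_f [(sum_i v_if x_i)^2 - sum_i (v_if x_i)^2]."""
    xv = x[:, None] * V                # (n, k): each feature scaled by its factors
    sum_sq = np.sum(xv, axis=0) ** 2   # (k,): square of sums
    sq_sum = np.sum(xv ** 2, axis=0)   # (k,): sum of squares
    return 0.5 * np.sum(sum_sq - sq_sum)

rng = np.random.default_rng(0)
n_features, k = 8, 4                   # hypothetical sizes
x = rng.random(n_features)             # one sample's feature vector
V = rng.normal(size=(n_features, k))   # latent factor matrix
w0, w = 0.0, rng.normal(size=n_features)

# Full FM score: bias + linear term + pairwise interactions,
# squashed to a click probability as in CTR prediction.
logit = w0 + w @ x + fm_second_order(x, V)
p_click = 1.0 / (1.0 + np.exp(-logit))
print(f"predicted CTR: {p_click:.3f}")
```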

Cited by 1 publication (1 citation statement) | References: 40 publications

“…The semantically enhanced representations of the two sentences are obtained through the attention mechanism and character splicing. The Text-Attention Factorization Mechanism (TAFM; Zhang et al., 2022) extracts features through text components, text attention, and N-gram text features to mine potential user preferences, and then uses a convolutional autoencoder to learn higher-level features. There are currently few textual attention models, and relatively few works apply textual attention models to network structure modeling.…”
Section: Related Work
confidence: 99%
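
To make the cited description concrete, the following is a loose sketch of the n-gram plus text-attention feature path that the citing authors attribute to TAFM. The hashing-style embedding, the query vector, and all sizes are illustrative assumptions, and the paper's convolutional autoencoder stage is only noted in a comment rather than implemented:

```python
# Rough sketch of the TAFM feature path the citation describes:
# character n-grams -> attention over n-gram features -> pooled text vector.
import numpy as np

def char_ngrams(text, n=3):
    """Character trigrams, a simple stand-in for the paper's N-gram features."""
    return [text[i:i + n] for i in range(len(text) - n + 1)]

def embed(ngram, dim=16):
    """Map an n-gram to a deterministic pseudo-embedding (assumption:
    a real model would use a learned embedding table instead)."""
    rng = np.random.default_rng(sum(ord(c) for c in ngram))
    return rng.normal(size=dim)

def attention_pool(E, q):
    """Softmax attention: score each n-gram embedding against query q,
    then return the weighted sum as the text representation."""
    scores = E @ q
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ E

dim = 16
E = np.stack([embed(g, dim) for g in char_ngrams("wireless earbuds")])
q = np.ones(dim) / np.sqrt(dim)        # hypothetical learned query vector
text_vec = attention_pool(E, q)
# In TAFM, a vector like text_vec would reportedly be passed on to a
# convolutional autoencoder to learn higher-level features before the
# factorization layers; that stage is omitted in this sketch.
print(text_vec.shape)                  # (16,)
```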