2023
DOI: 10.1109/tits.2023.3279412
Accurately Predicting Quality of Services in IoT via Using Self-Attention Representation and Deep Factorization Machines

Cited by 5 publications (1 citation statement) · References 40 publications
“…The attention mechanism (Vaswani et al., 2017) extracts the important information from a large volume of data by mimicking how the human brain copes with information overload: it relies less on external information and focuses instead on capturing the internal relevance of the data or features. Using self-attention to dynamically adjust the weights of different features of the service data strengthens the extraction of the features that contribute most to the service recommendation result, letting the model concentrate on classifying those high-contribution features (Tang et al., 2023).…”
Section: Multi-head Self-attention Representation
Confidence: 99%
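The re-weighting the statement describes can be sketched as plain scaled dot-product self-attention over feature embeddings. This is a minimal illustration only, not the cited paper's deep-factorization-machine model: the dimensions, the random projections `Wq`/`Wk`/`Wv`, and the five "service features" are all hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a set of feature embeddings.

    X: (n_features, d) matrix, one embedding per service feature.
    Returns the re-weighted features and the (n_features, n_features)
    attention weights that express each feature's relevance to the others.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # pairwise feature relevance
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
d, dk = 8, 4
X = rng.normal(size=(5, d))                   # 5 hypothetical service features
Wq, Wk, Wv = (rng.normal(size=(d, dk)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape, weights.shape)               # (5, 4) (5, 5)
```

The attention weights are what "dynamically adjust the weights of different features": a high-contribution feature receives larger row weights and therefore dominates the aggregated representation.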