2019 10th International Conference on Information, Intelligence, Systems and Applications (IISA)
DOI: 10.1109/iisa.2019.8900678

Improving Collaborative Filtering’s Rating Prediction Coverage in Sparse Datasets through the Introduction of Virtual Near Neighbors

Cited by 8 publications (17 citation statements) | References: 31 publications
Citation types: 0 supporting, 17 mentioning, 0 contrasting
“…Moreover, we compared the ITRA model with other advanced approaches, namely, UCF, TAR, FTM, DLM and STPMF. In addition, compared to the prediction accuracy of previous synthetic algorithms applied in recommendation systems [45], [47], [55], the proposed model in our article is much more suitable for recommendations in terms of accuracy. That is to say, by mining user implicit trust, more efficient rating predictions can be achieved.…”
Section: Discussion (mentioning)
confidence: 89%
“…To have a better understanding of the prediction accuracy, our article conducts comparative experiments to compare the prediction accuracy of our ITRA model with those of other advanced algorithms on three datasets. In addition, to compare the experimental accuracy, the target user's nearest neighbors (NN) can be set from small to large: 5, 15, 25, 35, 45, 55, 65, 75, 85 and 95. The prediction results are shown as follows.…”
Section: Results (mentioning)
confidence: 99%
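The excerpt above describes sweeping the neighbourhood size from 5 to 95 and comparing rating prediction accuracy at each setting. A minimal sketch of that kind of sweep, assuming a hypothetical predict(user, item, k) function and a held-out list of (user, item, rating) triples (neither comes from the cited papers), could look like this:

```python
# Hypothetical sketch: sweep the neighbourhood size k and record the
# Mean Absolute Error of the rating predictions, as in the excerpt above.
from statistics import mean

def sweep_neighbourhood_sizes(predict, test_ratings, k_values=range(5, 96, 10)):
    """predict(user, item, k) returns a predicted rating or None when no
    prediction can be made; test_ratings is a list of (user, item, rating)."""
    mae_per_k = {}
    for k in k_values:
        errors = []
        for user, item, rating in test_ratings:
            estimate = predict(user, item, k)
            if estimate is not None:          # skip uncovered predictions
                errors.append(abs(estimate - rating))
        mae_per_k[k] = mean(errors) if errors else None  # MAE for this k
    return mae_per_k
```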
“…The CF negNNs algorithm presented in [26] increases the CF rating prediction coverage in sparse datasets, by incorporating, in the rating prediction computation process, users with negative similarity to the user for whom the rating prediction is computed. Margaris et al [27] propose the CF VNN algorithm, which creates virtual user profiles, termed virtual NNs (VNNs), by merging pairs of NN profiles corresponding to real users having high similarity. The introduction of the VNN profiles contributes to the alleviation of the "grey sheep" problem.…”
Section: Related Work (mentioning)
confidence: 99%
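The VNN construction summarised in the excerpt above, where pairs of highly similar real user profiles are merged into virtual near neighbour profiles, can be sketched roughly as follows. This is a simplified illustration rather than the authors' implementation: the dict-based profiles, the similarity function and the sim_threshold parameter are assumptions made for the example.

```python
# Hypothetical sketch of the VNN merging idea described above: two real users
# whose similarity reaches a threshold are fused into one virtual profile.
def merge_profiles(profile_a, profile_b):
    """Combine two rating dicts {item_id: rating}; co-rated items are averaged."""
    virtual = dict(profile_a)
    for item, rating in profile_b.items():
        virtual[item] = (virtual[item] + rating) / 2 if item in virtual else rating
    return virtual

def build_virtual_neighbors(profiles, similarity, sim_threshold=1.0):
    """Create a virtual NN profile for every pair of users whose similarity
    reaches sim_threshold (the similarity threshold discussed below)."""
    users = list(profiles)
    virtual_profiles = []
    for i, u in enumerate(users):
        for v in users[i + 1:]:
            if similarity(profiles[u], profiles[v]) >= sim_threshold:
                virtual_profiles.append(merge_profiles(profiles[u], profiles[v]))
    return virtual_profiles
```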
“…The CF VNN algorithm [27], where artificial user profiles (virtual near neighbours-VNNs) are created by combining pairs of real users which are NNs of high similarity. The CF VNN algorithm was configured to use the optimal parameters for its operation reported in [27], i.e., Th(sim) = 1.0 and Th(cr) = 1. For more details on the parameters, the interested reader is referred to [27].…”
Section: Prediction Coverage Increase (mentioning)
confidence: 99%