2012
DOI: 10.1007/978-3-642-34487-9_40

A Contextual-Bandit Algorithm for Mobile Context-Aware Recommender System

Abstract: Most existing approaches in Context-Aware Recommender Systems (CRS) focus on recommending relevant items to users by taking into account contextual information, such as time, location, or social aspects. However, few of them have considered the problem of the user's content dynamicity. In this paper we introduce an algorithm that tackles the user's content dynamicity by modeling the CRS as a contextual bandit algorithm and by including a situation clustering algorithm to improve the precision of the CRS. Within a del…
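The combination the abstract describes, a contextual bandit whose context (situation) is first mapped to a cluster, can be illustrated with a minimal sketch. This is not the paper's algorithm: the class name, the `cluster_fn` hook, and the ε-greedy policy are assumptions chosen for brevity.

```python
import random
from collections import defaultdict

class ClusteredEpsilonGreedy:
    """Illustrative contextual bandit: each situation is mapped to a
    cluster, and an independent epsilon-greedy learner is kept per
    cluster. All names here are hypothetical."""

    def __init__(self, n_arms, cluster_fn, epsilon=0.1):
        self.n_arms = n_arms
        self.cluster_fn = cluster_fn          # situation -> cluster id
        self.epsilon = epsilon
        self.counts = defaultdict(lambda: [0] * n_arms)
        self.values = defaultdict(lambda: [0.0] * n_arms)

    def select_arm(self, situation):
        c = self.cluster_fn(situation)
        if random.random() < self.epsilon:    # explore a random item
            return random.randrange(self.n_arms)
        vals = self.values[c]                 # exploit best-known item
        return max(range(self.n_arms), key=vals.__getitem__)

    def update(self, situation, arm, reward):
        c = self.cluster_fn(situation)
        self.counts[c][arm] += 1
        n = self.counts[c][arm]
        # incremental mean of observed rewards for this (cluster, arm)
        self.values[c][arm] += (reward - self.values[c][arm]) / n
```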

Cited by 89 publications (69 citation statements)
References 21 publications (25 reference statements)
“…Existing studies propose different contextual bandit algorithms and show their empirical performance on various real-world data sets [5,6,13]. It is known that a bandit algorithm can perform differently under different parameter settings [22].…”
Section: Problem Formulation (mentioning)
Confidence: 99%
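The parameter sensitivity this statement notes is easy to see in a LinUCB-style upper-confidence score, where an exploration weight α directly trades off exploitation against exploration. A minimal sketch; the α values, features, and numbers below are made up for illustration:

```python
import numpy as np

def linucb_score(theta_hat, A_inv, x, alpha):
    """LinUCB-style score for one arm: predicted reward plus an
    alpha-weighted uncertainty bonus."""
    mean = theta_hat @ x
    bonus = alpha * np.sqrt(x @ A_inv @ x)
    return mean + bonus

# The same learned model, scored under two parameter settings:
theta_hat = np.array([0.3, 0.7])        # hypothetical ridge estimate
A_inv = np.linalg.inv(np.eye(2) * 5.0)  # inverse design matrix
x = np.array([1.0, 0.2])                # context/arm feature vector
for alpha in (0.1, 2.0):                # small vs. large exploration
    print(alpha, linucb_score(theta_hat, A_inv, x, alpha))
```

With a small α the score is dominated by the estimated mean; with a large α the uncertainty bonus dominates, so the two settings can rank arms differently.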
“…To recommend items to users without sufficient historical records, several studies formulate this task as a multi-armed bandit problem [14,15,5]. Multi-armed…” (Footnote: Õ(·) is a variant of big-O notation that ignores logarithmic factors.)
Section: Related Work (mentioning)
Confidence: 99%
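For readers unfamiliar with the soft-O notation in that footnote, the standard definition is:

```latex
% Soft-O (tilde-O): big-O up to polylogarithmic factors.
g(n) \in \tilde{O}\bigl(f(n)\bigr)
\iff
\exists k \ge 0 :\; g(n) \in O\bigl(f(n)\,\log^{k} n\bigr)
```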
“…Here the decision maker's goal is to choose the arms that will maximize the expected reward given the observed contexts. Contextual bandits have been used in systems for recommending personalized content to users (8,9).…”
Section: Methods: Contextual Multi-armed Bandits With Similarity Information (mentioning)
Confidence: 99%
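As a loose illustration of using similarity information, an arm's expected reward can be estimated from rewards observed in similar contexts. Everything below, the Gaussian kernel, the bandwidth, the bonus term, and all names, is an assumption for the sketch, not the cited paper's method; contexts are assumed to be NumPy vectors.

```python
import numpy as np

def kernel_ucb_choice(history, x, n_arms, bandwidth=0.5, beta=1.0):
    """Pick the arm with the highest similarity-weighted reward
    estimate plus an uncertainty bonus. `history` is a list of
    (context, arm, reward) triples; all parameters are hypothetical."""
    scores = []
    for a in range(n_arms):
        obs = [(c, r) for (c, arm, r) in history if arm == a]
        if not obs:
            scores.append(np.inf)            # force initial exploration
            continue
        # Gaussian-kernel weights: nearby contexts count more.
        w = np.array([np.exp(-np.sum((x - c) ** 2) / bandwidth)
                      for (c, _) in obs])
        r = np.array([r for (_, r) in obs])
        mean = (w @ r) / w.sum()
        bonus = beta / np.sqrt(w.sum())      # less data => bigger bonus
        scores.append(mean + bonus)
    return int(np.argmax(scores))
```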