Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval 2005
DOI: 10.1145/1076034.1076062
Combining eye movements and collaborative filtering for proactive information retrieval

Abstract: We study a new task, proactive information retrieval, by combining implicit relevance feedback and collaborative filtering. We have constructed a controlled experimental setting, a prototype application, in which users try to find interesting scientific articles by browsing their titles. Implicit feedback is inferred from eye movement signals, with discriminative hidden Markov models estimated from existing data in which explicit relevance feedback is available. Collaborative filtering is carried out using …
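The abstract describes a two-stage pipeline: per-title relevance probabilities inferred from eye movements by discriminative hidden Markov models, then collaborative filtering across users. As a rough illustration of how the second stage could consume the first stage's scores (not the paper's actual model, whose collaborative filtering component is elided above), the following sketch runs a simple user-based collaborative filter over a hypothetical matrix of implicit relevance scores; all names and values are illustrative:

```python
import numpy as np

# Hypothetical user-by-article matrix of implicit relevance scores in [0, 1]
# (a stand-in for the probabilities the paper's discriminative HMMs would
# infer from eye movements); 0 marks an article not yet browsed.
R = np.array([
    [0.9, 0.1, 0.8, 0.0],
    [0.8, 0.2, 0.9, 0.3],
    [0.1, 0.9, 0.0, 0.8],
])

def predict(R, user, item):
    """Score an unbrowsed article for `user` as a similarity-weighted
    average of the other users' implicit scores for that article."""
    weighted = []
    for other in range(R.shape[0]):
        if other == user:
            continue
        # Cosine similarity over the articles both users have browsed.
        mask = (R[user] > 0) & (R[other] > 0)
        if not mask.any():
            continue
        a, b = R[user][mask], R[other][mask]
        sim = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
        weighted.append((sim, R[other, item]))
    if not weighted:
        return 0.0
    sims = np.array([s for s, _ in weighted])
    scores = np.array([r for _, r in weighted])
    return float(sims @ scores / sims.sum())

# Predict user 0's interest in the unbrowsed article 3.
print(round(predict(R, 0, 3), 2))  # -> 0.39
```

User 1's browsing pattern closely matches user 0's, so user 1's lukewarm score for article 3 dominates the prediction; the dissimilar user 2 contributes little despite a high score.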

Cited by 54 publications (40 citation statements) · References 21 publications
“…We can only guess and estimate the semantics from the emotional reactions of a user when (s)he has shown interest in multimedia content during the retrieval process. Semantics can be inferred by capturing and interpreting a stream of the user's actions (browsing and navigation, eye tracking, etc.) during interaction with multimedia content [7,5]. It has been suggested that relevance feedback can be utilized for affective retrieval [8], but this still requires an efficient affective feedback system that can capture, register, and to some extent interpret the spontaneous reactions of a user, and can use them to identify the user's current context among the multiple contexts present in a search query.…”
Section: Problem Formulation
confidence: 99%
“…Fig. 2, yields the final discriminating function and is used in training and recognizing the spoken emotional word. Several attempts have been introduced in the past to build a user's profile and learn user interests from implicit feedback [35,5].…”
Section: Spontaneous Emotional Spoken Word Recognition System
confidence: 99%
“…Eye movements, however, can also be treated as implicit feedback when the user is not consciously trying to influence the interface by where they focus their attention. Eye movements as implicit feedback have recently been considered in the text retrieval setting [11,4,1]. To the best of our knowledge, however, at the time of writing only [10,8] have used eye movements for image retrieval.…”
Section: Introduction
confidence: 99%
“…More robust methods of interpreting the data are needed. There has been some recent work on document retrieval in which eye tracking data has been used to refine the accuracy of relevance predictions [13].…”
Section: Introduction
confidence: 99%