Proceedings of the 25th International Conference on Machine Learning (ICML '08), 2008
DOI: 10.1145/1390156.1390252

Learning to learn implicit queries from gaze patterns

Abstract: In the absence of explicit queries, an alternative is to try to infer users' interests from implicit feedback signals, such as clickstreams or eye tracking. The interests, formulated as an implicit query, can then be used in further searches. We formulate this task as a probabilistic model, which can be interpreted as a kind of transfer or meta-learning. The probabilistic model is demonstrated to outperform an earlier kernel-based method in a small-scale information retrieval task.
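
The paper's own probabilistic, meta-learning formulation is not reproduced on this page, but the following minimal Python sketch illustrates the general idea of turning gaze measurements into an implicit query: per-word gaze features (for example, fixation count and total fixation duration) are mapped to relevance probabilities by a simple logistic model, and the resulting word weights can be fed into a later search. The feature set, the function implicit_query, and the example weights are illustrative assumptions, not the model described in the paper.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def implicit_query(gaze_features, words, weights, bias):
    # Map each viewed word's gaze features to a relevance probability and
    # return a weighted term vector usable as an implicit query.
    # gaze_features: (n_words, n_features) array, e.g. columns for
    # fixation count and total fixation duration.
    p_relevant = sigmoid(gaze_features @ weights + bias)
    return dict(zip(words, p_relevant))

# Toy usage: three viewed words with [fixation_count, total_duration_s] features.
words = ["machine", "learning", "the"]
X = np.array([[4, 0.9],
              [6, 1.4],
              [1, 0.1]])
w = np.array([0.5, 1.2])   # hypothetical weights learned from labelled gaze data
b = -2.0
print(implicit_query(X, words, w, b))   # heavily fixated content words get higher weight

In this sketch the weights would be learned from sessions where relevance labels are available and then reused for new users or search tasks, which is where a transfer or meta-learning interpretation of the kind mentioned in the abstract would come in.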

Citations: cited by 13 publications (6 citation statements)
References: 19 publications
“…However, important information regarding the severity of the patient's presentation remains restricted to free text.

Table 2. Free-text and coded data missing from the feature set (numbers in parentheses denote the number of cases in which the data were noted).

Data in free-text form: Chronic anemia (27); Cured cancer as a chronic diagnosis (6)*†; Cirrhosis (2); Advanced dementia/bedridden (12)*†; Non-acute presentation (5)†; Chronic infection (2); Known chronic kidney disease (11)*; Active cancer (4); History of bleeding (2); Rectal exam (11); Active bleeding (3); Myelodysplastic syndrome (1); Previous endoscopy (10); History of cancer (3); Metrorrhagia (1); Gastrointestinal disease (7); Coombs test/blood bank (2)‡; Recent procedure (1); Heart disease (7)†; Anticoagulation (2).

Coded data fields: Diagnosis of general deterioration; Diagnosis of hematuria; Urine M spike; Diagnosis of G6PD deficiency; Treatment with carbamazepine*.

*Free-text data associated with a change in recommended action (observed only for endoscopy and bone marrow examinations).…”
Section: Discussion (mentioning; confidence: 99%)
“…In this work, we investigate other modalities of evidence, which can be used to determine the factuality of a claim. Specifically, we consider whether data from eye-tracking can be used to infer the factuality of a claim, as eye-tracking has previously been used in information retrieval to infer relevance [1,14,15,46,59,70]. Additionally, eye-tracking has been used to investigate how users engage with news content, where it has been observed that users tend to read false news faster [26], as well as paying more visual attention to credible news posts [78].…”
Section: Chapter 6: Factuality Checking in News Headlines with Eye Tracking (mentioning; confidence: 99%)
“…These features are then used for statistical inference, classification, and prediction. For instance, variants of aggregated fixation count and fixation duration were used in the studies reported in [14,15,19,21,39,41,59,63]. Eye-dwell time and/or visit time was used by Fahey et al. [14].…”
Section: Related Work, 2.1 Information Relevance and Eye-Tracking (mentioning; confidence: 99%)
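
The aggregate features named in this citation statement (fixation count, fixation duration, dwell/visit time) are straightforward to compute from raw fixation logs. The sketch below is a hypothetical illustration of that aggregation step, not code from any of the cited studies; the record layout ('doc', 'start_ms', 'duration_ms') and the definition of a visit as a maximal run of consecutive fixations on the same document are assumptions.

from collections import defaultdict

def aggregate_fixations(fixations):
    # Aggregate raw fixations (ordered by start time) into per-document
    # features: fixation count, total fixation duration, and dwell (visit) time.
    stats = defaultdict(lambda: {"fix_count": 0, "total_fix_ms": 0.0, "dwell_ms": 0.0})
    visit_start = {}          # document -> start time of its current visit
    prev_doc, prev_end = None, None
    for f in fixations:
        doc = f["doc"]
        stats[doc]["fix_count"] += 1
        stats[doc]["total_fix_ms"] += f["duration_ms"]
        if doc != prev_doc:   # gaze moved to a different document: close the old visit
            if prev_doc is not None:
                stats[prev_doc]["dwell_ms"] += prev_end - visit_start[prev_doc]
            visit_start[doc] = f["start_ms"]
        prev_end = f["start_ms"] + f["duration_ms"]
        prev_doc = doc
    if prev_doc is not None:  # close the final visit
        stats[prev_doc]["dwell_ms"] += prev_end - visit_start[prev_doc]
    return dict(stats)

# Toy usage: two fixations on document d1, then one on d2.
log = [{"doc": "d1", "start_ms": 0,   "duration_ms": 200},
       {"doc": "d1", "start_ms": 250, "duration_ms": 300},
       {"doc": "d2", "start_ms": 600, "duration_ms": 150}]
print(aggregate_fixations(log))
# d1: 2 fixations, 500 ms fixated, 550 ms dwell; d2: 1 fixation, 150 ms fixated, 150 ms dwell.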