Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval 2022
DOI: 10.1145/3477495.3531953
Constructing Better Evaluation Metrics by Incorporating the Anchoring Effect into the User Model

Cited by 12 publications (5 citation statements); References 34 publications
“…They adopted users' annotations to represent the perceptions and found that the real-time experience affected users' perceptions. These effects can be explained by theories of anchoring bias, reference-dependence bias, and expectation confirmation, which hold that people usually refer to a reference point when evaluating the current situation (Chen et al., 2022; Hossain & Quaddus, 2012; Liu & Han, 2020; Shokouhi et al., 2015; Thomas et al., 2022). The reference-dependence effect is a key component of prospect theory, explaining the cognitive bias that occurs when users evaluate current outcomes based on a previous reference point (Liu & Han, 2020; Tversky & Kahneman, 1992).…”
Section: Introduction
confidence: 89%
“…Thus, their expectations create cognitive bias and lead to biased results in user modeling and evaluation. Previous studies have modeled users' behaviors under the influence of their expectations (N. Chen, Zhang, & Sakai, 2022; Moffat, Bailey, Scholer, & Thomas, 2017). However, there is little direct evidence showing relationships between users' expectations and actual search behaviors.…”
Section: Introduction
confidence: 99%
“…As described in Section 2.1, most offline evaluation metrics treat users as globally rational decision makers when simulating interactions with search engines, but this assumption has been increasingly challenged in recent years. With growing knowledge of users' cognitive biases, some work in the field of IR system evaluation has begun to introduce cognitive biases into the construction and meta-evaluation of evaluation metrics [14, 17, 107]. However, little work incorporates the decoy effect into the calculation of IR system evaluation metrics.…”
Section: The Evaluation of Information Retrieval Systems
confidence: 99%
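The idea of folding a cognitive bias into a metric's user model can be illustrated with a minimal sketch: a DCG-style metric in which the perceived gain at each rank is pulled toward the gain of the previously examined result (an anchoring adjustment). This is an illustrative sketch under stated assumptions, not the cited paper's actual formulation; the function name `anchored_dcg`, the `alpha` blending parameter, and the linear blending rule are all hypothetical choices made for exposition.

```python
import math

def anchored_dcg(gains, alpha=0.4):
    """DCG-style score where each result's perceived gain is anchored
    on the perceived gain of the preceding result.

    gains: relevance gains in rank order.
    alpha: anchoring strength in [0, 1]; alpha=0 recovers standard DCG.
    """
    score, anchor = 0.0, None
    for rank, gain in enumerate(gains, start=1):
        # The first result has no anchor; later results are perceived
        # as a blend of the previous perception and the true gain.
        perceived = gain if anchor is None else alpha * anchor + (1 - alpha) * gain
        score += perceived / math.log2(rank + 1)  # standard DCG discount
        anchor = perceived
    return score
```

With `alpha=0` the blend disappears and the function reduces to plain DCG, which makes the anchoring term easy to isolate when comparing metric variants.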
“…Therefore, there is a need to develop new evaluation metrics that more accurately measure system performance and reflect users' actual experiences and preferences. Although some recent efforts have incorporated cognitive biases into the calculation of IR system evaluation metrics [14, 17, 107], they have not considered the decoy effect. The introduction of DEJA-VU expands work on IR system evaluation by considering the vulnerability of IR systems to the decoy effect.…”
Section: Main Findings and Implications
confidence: 99%