2015
DOI: 10.1097/acm.0000000000000902

Seeing Things Differently or Seeing Different Things? Exploring Raters’ Associations of Noncognitive Attributes

Abstract: The MFRM and hierarchical clustering helped to explain some of the variability associated with raters in a way that other measurement models are unable to capture. These findings highlight that differences in ratings may result from raters possessing different interpretations of an observed performance. This study has implications for developing more purposeful rater selection and rater profiling in performance-based assessments.
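The abstract names two analytic tools, many-facet Rasch measurement (MFRM) and hierarchical clustering, as the means of unpacking rater variability. As a rough, hypothetical sketch of the clustering step only (not the authors' actual analysis), the Python snippet below groups invented raters by the similarity of their attribute-rating profiles; the attribute names, rating values, and the Ward/Euclidean choices are all assumptions made for illustration.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical data: rows = raters, columns = mean ratings given on four
# noncognitive attributes (e.g. communication, empathy, integrity, teamwork).
rater_profiles = np.array([
    [4.2, 3.9, 4.1, 4.0],   # rater A
    [4.1, 4.0, 4.2, 4.1],   # rater B: profile similar to A
    [2.8, 3.0, 2.7, 2.9],   # rater C: systematically more severe
    [2.9, 2.8, 3.0, 2.8],   # rater D: profile similar to C
    [4.0, 2.5, 4.1, 2.6],   # rater E: weights the attributes differently
])

# Agglomerative (hierarchical) clustering with Ward linkage on Euclidean distances.
tree = linkage(pdist(rater_profiles, metric="euclidean"), method="ward")

# Cut the dendrogram into three groups of raters with similar rating patterns.
clusters = fcluster(tree, t=3, criterion="maxclust")
print(clusters)  # e.g. [1 1 2 2 3]: raters who "see different things" fall into different groups

Grouping raters this way is one plausible route to the rater profiling the abstract refers to: raters in the same cluster rate an observed performance similarly, while raters in different clusters appear to interpret it differently.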

Cited by 17 publications (9 citation statements) · References 19 publications (18 reference statements)
“…Table 1 shows 49 articles reported evidence to support the content of MMI, while Table 2 shows that 40 articles support the internal structure [2, 4, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47]; 37 articles support the response process [2, 5, 6, 11, 12, 14, 18, 19, 20, 23, 25, 26, 27, 28, 30, 31, 33, 37, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62]; 21 articles support the relation to other variables [4, 15, 16, 17, 21, 24, 25, 29, 38, 39, 40, 44, …”
Section: Results (mentioning)
Confidence: 99%
“…Both applicants and examiners were positive about the experience and potential of MMI as a student selection method [2, 6, 14, 26, 27, 28, 33, 37, 46, 47, 48, 49, 51, 52, 53, 55, 56, 60, 61]; MMI is free of gender, age, previous experience, prior knowledge, and cultural bias [14, 19, 23, 31, 48, 54, 57, 60]; MMI is a fair assessment, with a scoring sheet that allowed raters to differentiate between applicants [5, 18, 19, 20, 47, 48, 51, 52, 58, 60]; neither aboriginal-specific rater training nor aboriginal rater assignment is required [11]; violations of MMI security do not unduly influence applicant performance ratings [12]; MMI provides sufficient time for students to present ideas [48]; MMI is at least as cost-efficient as many other personal interview formats [50]; MMI eases interviewer anxiety associated with having to judge candidates unfavourably [51]; and MMI was not stressful [27]. Conversely, MMI requires a greater number of rooms [50]; station scores provided by student interviewers were slightly higher than those of faculty member or practitioner interviewers [25]; student interviewers were less lenient [26, 30] and had more unexpected ratings [30]; students preferred a mixed format, rather than MMI alone [59]; cultural specificity of some stations and English-language proficiency were seen to disadvantage international students [37]; applicants with introverted personalities may fare less well in the MMI process [62]; and raters were unable to distinguish between the various non-cognitive attributes [45]. Overall, MMI was consistently judged to be more favourable than unfavourable by both candidates and examiners (Table 2).…”
Section: Results (mentioning)
Confidence: 99%
“…The findings of high correlations between the different dimensions rated in the panel interview and high loadings on the one factor in the CFA and principal components analysis highlight the difficulty panel interviewers have in distinguishing between dimensions [37].…”
Section: Discussion (mentioning)
Confidence: 99%
“…Several conceptual models of rater cognition have emerged, and different authors have targeted different components of the assessment process. Within the rater cognition literature, rater‐based assessment has been examined through the lenses of first impressions, the attentional and cognitive limits of raters, the cognitive processes of rating, the ‘codes’ that raters use to transfer meaning, the social nature of assessment decisions, and the role of the immediate rating context. Regardless of the lens used, there is a general consensus that rater‐based assessment is a complex process that can be influenced by a multitude of factors.…”
Section: Rater Cognition (mentioning)
Confidence: 99%