2009
DOI: 10.1002/mar.20283

Do experts and novices evaluate movies the same way?

Abstract: Do experts and novices evaluate creativity the same way? This question is particularly relevant to the study of critical and public response to movies. How do public opinions differ from movie critic opinions? This study assessed college student (i.e., novice) ratings of movies released from 2001 to 2005 and compared them to expert opinions and those of self-declared novices on major movie rating Web sites. Results suggest that the student ratings overlapped considerably, but not overwhelmingly, with the self-de…

Cited by 91 publications (63 citation statements) · References 28 publications
“…Similar differences were found when creativity ratings of expert movie critics (extracted from a database of compiled published reviews) were compared with individuals who regularly reviewed movies online via boxofficemom.com and imdb.com, and with 169 university students (ethnically diverse; academic major unknown) [42]. Again, professional critics were found to be more severe raters (tended to rate movies more stringently); reported correlations between students' and experts' ratings were only moderate (r = 0.43, p < 0.01).…”
Section: The Consensual Assessment Technique
confidence: 62%
“…While it is conceivable that novice judges studying art (who might have ostensibly scored as knowledgeable novices in visual/visual-spatial productions [42,53]) might score with greater severity (i.e., more like "experts"), as described by Kaufman and colleagues [19], they did not. And although novice judges in psychology seemed to score with slightly more reliability and consistency, the differences between rater groups remain inconclusive.…”
Section: Discussion
confidence: 98%
“…supported by the literature: prior studies indicated that expert raters exhibit the highest levels of interrater reliability (Plucker et al., 2008, 2009). Non-expert evaluators, however, tended to agree highly with one another, though it remains unclear what exactly they are evaluating and why.…”
Section: Results
confidence: 65%
“…While novice and expert correlations were similar to earlier investigations cited on novice vs. expert ratings, results of the quasi-expert group bridged the gap, with significant correlations with both novice (r = .65) and expert (r = .72) ratings. Results indicate that a dichotomous classification of raters does not reflect the full range of options available to researchers for creativity evaluation (Plucker et al., 2008, 2009). Therefore the use of gifted novices or quasi-experts is supported in conjunction with, or as a suitable substitute for, expert ratings of creative products in technical and non-technical domains (Baer et al., 2004; Kaufman et al., 2005).…”
Section: Expert vs. Non-expert Raters
confidence: 90%
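The rater correlations quoted in these statements (e.g., r = 0.43 between students and experts; r = .65 and r = .72 for the quasi-expert group) are Pearson correlations computed over items rated by both groups. A minimal sketch of that computation, using made-up per-movie mean ratings for two hypothetical rater groups (the numbers below are illustrative only, not data from the study):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between two raters' scores for the same items."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.corrcoef(x, y)[0, 1])

# Hypothetical mean creativity ratings (1-5 scale) of six movies,
# one value per movie, from two rater groups. Illustration only.
novice_means = [3.8, 4.2, 2.9, 3.5, 4.0, 2.5]
expert_means = [3.1, 3.9, 2.2, 3.4, 3.6, 1.8]  # uniformly lower: "more severe" raters

r = pearson_r(novice_means, expert_means)
```

Note that the hypothetical experts score every movie lower yet the correlation is high: rank-order agreement (r) and rating severity (mean level) are independent properties, which is why the cited studies report them separately.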