2012
DOI: 10.1177/0018726712467048

Taylorizing business school research: On the ‘one best way’ performative effects of journal ranking lists

Abstract: The paper critically examines how work is shaped by performance measures. Its specific focus is upon the use of journal lists, rather than the detail of their construction, in conditioning the research activity of academics. The logic of journal lists is to endorse and cultivate a research monoculture in which particular criteria, favoured by a given list, assume the status of a universal benchmark of performance ('research quality'). The paper demonstrates, with reference to the Association of Business Schools (…

Cited by 204 publications (237 citation statements)
References 74 publications (105 reference statements)
“…The adverse impact of this "audit culture" is well documented (see e.g. Adler & Harzing, 2009; Mingers & Willmott, 2013). However, since the reversal of this trend is unlikely, research into fairer and more inclusive ways of measuring research performance is gaining more and more momentum.…”
Section: Introduction (mentioning)
confidence: 99%
“…In comprehensive journal ranking studies that include journals from various disciplines (e.g., general business journal rankings), the "one-size-fits-all" ranking approach fails the specialist journals (Milne, 2000; Mingers & Willmott, 2013). The key issue with these types of rankings stems from a smaller size of the research community that works in a highly specialized, niche area.…”
Section: Academic Journal Rankings (mentioning)
confidence: 99%
“…The situation concerning the perceived benefits of journal ranking lists is somewhat clearer. As articulated by Mingers & Willmott (2013), those who use journal ranking lists managerially believe that compared to reading and reflecting upon a piece of research, journal ranking lists are a superior means of evaluating research. They have been claimed to be more objective than personal judgments and to be accurate predictors of research assessment exercise outcomes, yet neither of these claims has ever actually been satisfactorily or conclusively demonstrated to be the case.…”
Section: Where Have We Got To Now? (mentioning)
confidence: 99%
“…They have been claimed to be more objective than personal judgments and to be accurate predictors of research assessment exercise outcomes, yet neither of these claims has ever actually been satisfactorily or conclusively demonstrated to be the case. Furthermore, the benefits claimed ignore the impact of the managerial use of journal ranking lists on research, on individuals, and on research communities: "the journal list has become a potent instrument of managerial decision-making whose use, we will argue, has the performative effect of homogenizing, in addition to commodifying and individualizing, research activity" (Mingers & Willmott, 2013, p. 1053).…”
Section: Where Have We Got To Now? (mentioning)
confidence: 99%