2010
DOI: 10.3152/095820210x12809191250889

The controversial policies of journal ratings: evaluating social sciences and humanities

Abstract: In a growing number of countries, governments and public agencies seek to systematically assess the scientific outputs of their universities and research institutions. Bibliometric indicators and peer review are regularly used for this purpose, and their advantages and biases are discussed in a wide range of literature. This article examines how three different national organisations (AERES, ERA, ERIH) produce journal ratings as an alternative assessment tool, which is particularly targe…

Cited by 76 publications (37 citation statements)
References 50 publications (32 reference statements)
“…As a surrogate for 'quality', universities, business schools, and departments are turning increasingly towards the use of journal ranking lists most of which are either constructed using citation analysis or informed by it. This situation represents a major change for the humanities and social sciences from the previously dominant system of peer review but there are clear exceptions, such as the Australian ERA journal ranking list which was subject to major consultation with representatives of the social sciences research community prior to its finalisation (Pontille & Torny, 2010). However, such adjustment processes are very much a minor part of the majority of such exercises and even that particular rankings list was abandoned by the body that created it which, in a clear instance of the application of capture theory, elected instead to use a rankings list established by the community which would apply it, the Australian Business Deans Council.…”
Section: Introduction (mentioning)
confidence: 99%
“…Use of the journal rankings generated by such ventures has gone beyond their use by universities in assessing the research of their staff. In a number of countries, they are being used by governments to inform decisions concerning university funding (Pontille & Torny, 2010), in some cases using journal rankings based entirely on citations, such as the Thomson Reuters SSCI 1 and SCOPUS SCImago 2 . It is therefore of importance to faculty that they are aware of the level of citations their publications might be expected to receive in a particular outlet.…”
Section: Introduction (mentioning)
confidence: 99%
“…There exists considerable debate about whether such assessments should be based on bibliometrics (typically citation scores), peer review, and/or (non-academic) impact (Brinn, Jones, & Pendlebury, 2000; Butler & McAllister, 2009; Campanario, 1998; Moed, Luwel, & Nederhof, 2002; Pontille & Torny, 2010). Bibliometrics may be seen as objective and relatively simple to produce whereas peer review is subjective and takes longer to perform.…”
Section: Research on Journal Rankings (mentioning)
confidence: 99%
“…Donovan & Butler 2007, Cunningham 2008, Linmans 2010, Pontille & Torny 2010, Giménez-Toledo et al 2013. Still, we must consider some other problems that concern the epistemological status of ratings and rankings.…”
(mentioning)
confidence: 99%