In the past, several works have investigated ways of combining quantitative and qualitative methods in research assessment exercises. Indeed, the Italian National Scientific Qualification (NSQ), the national assessment exercise that decides whether a scholar can apply for academic positions as Associate Professor or Full Professor, adopts both a quantitative and a qualitative evaluation process: it makes use of bibliometrics followed by a peer review of candidates' CVs. The NSQ divides academic disciplines into two categories, citation-based disciplines (CDs) and non-citation-based disciplines (NDs), a division that affects the metrics used for assessing the candidates of a discipline in the first, bibliometric part of the process. In this work, we explore whether citation-based metrics, computed only from open bibliographic and citation data, can support the human peer review of NDs and yield insights into how it is conducted. To understand whether, and which, citation-based (and possibly other) metrics provide relevant information, we created a series of machine learning models to replicate the decisions of the NSQ committees. As one of the main outcomes of our study, we observed that the strength of the citational relationship between a candidate and the commission in charge of assessing their CV seems to play a role in the peer-review phase of the NSQ for NDs.
The importance of open bibliographic repositories is widely accepted by the scientific community. For evaluation processes, however, there is still some skepticism: even though large repositories of open access articles and free publication indexes exist and are continuously growing, assessment procedures still rely on proprietary databases, mainly because of the richness of the data available in those databases and the services provided by the companies that offer them. This paper investigates the status of open bibliographic data in three of the most used open resources, namely Microsoft Academic Graph, Crossref and OpenAIRE, evaluating their potential as substitutes for proprietary databases in academic evaluation processes. We focused on the Italian National Scientific Qualification (NSQ), the Italian process for University Professor qualification, which uses data from commercial indexes, and investigated similarities and differences between research areas, disciplines and application roles. The main conclusion is that open datasets are ready to be used for some disciplines, among which mathematics, natural sciences, economics and statistics, even if there is still room for improvement; but in others, such as history, philosophy, pedagogy and psychology, a large gap remains to be filled, and a stronger effort is required from researchers and institutions.
Peer Review
https://publons.com/publon/10.1162/qss_a_00203
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context in which an article is cited and indicate whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.