Abstract: This study evaluates the data sources and research methods used in earlier studies to rank the research productivity of Library and Information Science (LIS) faculty and schools. In doing so, the study identifies both tools and methods that generate more accurate publication count rankings as well as databases that should be taken into consideration when conducting comprehensive…
“…More recent studies suggest that using SSCI publication and citation data leads to a skewed impression of LIS faculty productivity. Meho & Spurgin (2005, p. 1327) suggest that certain subspecializations (archives, digital libraries, and school media/children's literature, to name but three) are less likely to be indexed by SSCI, and consequently, certain LIS researchers are less likely to appear on this list. If used to establish benchmarks, this list must be used cautiously.…”
One aspect of faculty effectiveness can be measured through research productivity, and publication and citation rates can serve as an indicator of that productivity. This study, the fourth in a series to examine LIS faculty and program productivity as measured by publication and citation, uses the same methodology as the previous investigations. A consistent data instrument (the Social Science Citation Index) provided publication and citation data for LIS faculty, covering the years 1999 to 2004. Tables show the faculty and programs with the highest publication and citation rates, both overall and per capita, as well as a cumulative ranking of LIS programs based on faculty research productivity. This study, in conjunction with the three previous, documents an increase in LIS research productivity, suggesting an increase in faculty effectiveness.
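The per-capita rankings described above normalize a program's output by its faculty size, so a small but prolific program can outrank a larger one. A minimal sketch of that calculation, with invented program names and counts (none of these figures come from the study):

```python
# Hypothetical sketch: ranking LIS programs by publications per faculty
# member (per capita). All names and numbers below are illustrative.

def rank_programs(programs):
    """Sort programs by publications per faculty member, highest first."""
    ranked = sorted(
        programs,
        key=lambda p: p["publications"] / p["faculty"],
        reverse=True,
    )
    return [
        (p["name"], round(p["publications"] / p["faculty"], 2))
        for p in ranked
    ]

programs = [
    {"name": "Program A", "faculty": 20, "publications": 120},
    {"name": "Program B", "faculty": 10, "publications": 90},
    {"name": "Program C", "faculty": 15, "publications": 60},
]

print(rank_programs(programs))
# Program B leads per capita (9.0) despite fewer total publications
```

The same normalization applies to citation counts; only the numerator changes.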
“…Publication productivity could be a good indicator of research output and used to rank countries, research institutes, or researchers in different fields (Liu & Cheng, 2005; Meho & Spurgin, 2005; Narin & Hamilton, 1996; Toutkoushian et al., 2003; Yazit & Zainab, 2007). The impact of a publication is assessed in terms of the number of citations that it has received in relation to other outputs in the journal (Yi et al., 2008).…”
Section: Discussion
“…Publication count is an indicator of research productivity and used to rank countries and universities (Liu & Cheng, 2005; Meho & Spurgin, 2005; Narin & Hamilton, 1996; Toutkoushian et al., 2003; Yazit & Zainab, 2007). It can also be used to determine authors' productivity or the publication productivity of research groups (Liu & Cheng, 2005; Hart, 2000; Uzun, 2002; Gu & Zainab, 2001; Fox, 1983).…”
Publication productivity, as measured by the number of papers, has been regarded as one of the main indicators of the reputation of countries and institutions. Nevertheless, the relationship among research publications, economic growth, and the World Wide Web in ASEAN countries is still unclear. The main intention of this study was to identify publication productivity among ASEAN and the world's top ten countries over the last 16 years (1996–2011). This study also aimed at finding the relationship among publication, gross domestic product (GDP), and internet usage. Furthermore, the publication trend in the top 10 Malaysian universities was evaluated for the same period. The Scopus database was used to find the overall documents, overall citations, citations per document, and international collaboration from 1996 to 2011 for each country. The World Bank database (World DataBank) was used to collect the data for GDP and the number of internet users. Moreover, to evaluate the top 10 Malaysian universities, the number of published articles, conference papers, reviews, and letters for the same period was collected. The results of this study showed significant differences between ASEAN and the top 10 countries regarding publication productivity. Moreover, a positive and significant relationship was observed between these indices, GDP, and internet usage for these countries. Surprisingly, international collaboration had a significant and negative relationship with economic growth. Malaysia had fewer citations per document (7.64) and less international collaboration (36.9%) than other ASEAN countries. In conclusion, international collaboration between academic institutes and researchers is influenced by economic growth and access to the internet in these countries. Furthermore, publication trends in ASEAN countries are promising. However, policy makers and science managers should try to find different ways to increase the quality of research publications and to raise citations per document.
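The relationships reported above (publications vs. GDP, publications vs. internet usage) are correlation analyses over annual time series. A small sketch of that computation using Pearson's r, with invented yearly figures standing in for the Scopus and World Bank data:

```python
# Hedged sketch of the correlation analysis described above.
# The annual figures below are made up for illustration; the study
# drew its series from Scopus and the World Bank database.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

publications = [1200, 1500, 2100, 2800, 3600]   # papers per year (hypothetical)
gdp          = [150, 170, 210, 260, 320]        # GDP in billions USD (hypothetical)

print(round(pearson_r(publications, gdp), 3))   # close to +1: strong positive link
```

A negative r, as the study reports for international collaboration vs. economic growth, would come out of the same function with no changes.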
“…We consulted: Library Literature & Information Science, INSPEC, Social Sciences Citation Index, and Inside Conferences, the sources Meho and Spurgin (2005, pp. 1328–1329) identified as "the four periodical databases that provide the most comprehensive coverage of the periodical literature."…”
A sample of 1,483 publications, representative of the scholarly production of LIS faculty, was searched in Web of Science (WoS), Google, and Google Scholar. The median number of citations found through WoS was zero for all types of publications except book chapters; the median for Google Scholar ranged from 1 for print/subscription journal articles to 3 for books and book chapters. For Google the median number of citations ranged from 9 for conference papers to 41 for books. A sample of the web citations was examined and classified as representing intellectual or non-intellectual impact. Almost 92% of the citations identified through Google Scholar represented intellectual impact, primarily citations from journal articles. Bibliographic services (non-intellectual impact) were the largest single contributor of citations identified through Google. Open access journal articles attracted more web citations, but the citations to print/subscription journal articles more often represented intellectual impact. In spite of problems with Google Scholar, it has the potential to provide useful data for research evaluation, especially in a field where rapid and fine-grained analysis is desirable.
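The comparison above reduces to two computations: a per-source median of citation counts, and the share of a classified sample representing intellectual impact. A minimal sketch with invented sample values (the real study used 1,483 publications):

```python
# Illustrative sketch of the analysis described above: per-source median
# citation counts and the intellectual-impact share of a classified
# citation sample. All values below are invented for demonstration.
from statistics import median

# citation counts per publication, by source (hypothetical sample)
citations = {
    "Web of Science": [0, 0, 1, 0, 2],
    "Google Scholar": [1, 3, 2, 5, 1],
    "Google":         [9, 41, 15, 22, 12],
}

medians = {source: median(counts) for source, counts in citations.items()}

# classified web citations: True = intellectual impact (hypothetical 92/100 split)
classified = [True] * 92 + [False] * 8
intellectual_share = sum(classified) / len(classified)

print(medians)             # per-source median citation counts
print(intellectual_share)  # fraction representing intellectual impact
```

The median, rather than the mean, keeps a handful of heavily cited books from dominating the per-source comparison, which matches how the study reports its figures.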